TEST Magazine - April-May 2011
The April-May 2011 issue of TEST Magazine

Inside: Visual testing | Test Qualifications | Regression testing

Chris Livesey on the massive potential of testing

IT's Invisible Giant

Visit TEST online at www.testmagazine.co.uk

Volume 3: Issue 2: April 2011

Innovation for Software Quality


A social revolution

It can be quite an event when you suddenly notice a particular technology coming of age. And so it seems to be with the social networks. Take them or leave them, you certainly can't deny their all-pervasive presence in the modern world.

From cyber-bullying to flash-mobs to global events, all are now organised, exposed, commented upon and endlessly tweeted about on the networks. With multi-award-winning Hollywood blockbusters dramatising their birth, they have perhaps become the ultimate post-modern fad. We can only guess what their impact on productivity is. Well, as it happens, one survey suggested that they can actually increase productivity by as much as nine percent! Although it has to be added that the same report urged caution and a fairly firm hand when sanctioning their use in the workplace.

I remember back in the day, reading about worries that the widespread introduction of email would cause all sorts of issues as staff spent all day firing off personal emails to friends and family. Of course this didn’t happen and if it did, the impact was negligible. And these days, how could you actually stop a member of staff from using social networks, without confiscating a whole pile of mobile technology from them when they enter the office?

In the business sphere, to give some idea of just how important social media are becoming, Gartner predicts that spending on social software to support sales, marketing and customer service processes alone will exceed a billion dollars worldwide by 2013.

So, can we agree that by and large the social networks are a benign presence? Perhaps even a force for great good? Across North Africa and the Middle East and increasingly beyond, ordinary people have been using the power of the social networks – Facebook and Twitter in particular – to help them organise demonstrations, rally support for their causes and as an important medium for updating their fellow demonstrators with the real impartial news of what is going on in the streets, untroubled by state interference, bias and censorship. Then there is also of course the reassurance that the wider world is watching, receiving the uncensored version of events from those witnessing them at first hand. And those ‘at the other end’ of the Internet wherever they are can show their support and encouragement for the various struggles, which must give them a tremendous boost.

So successful were the social networks recently that Egypt’s erstwhile president Hosni Mubarak felt it necessary to shut down the Internet in the country. This proved to be too little too late for his regime and the social networks claimed their first popular victory.

On that revolutionary note, until next time...

Matt Bailey, Editor


Editor: Matthew Bailey, [email protected], Tel: +44 (0)203 056 4599

To advertise contact: Grant Farrell, [email protected], Tel: +44 (0)203 056 4598

Production & design: Toni Barrington, [email protected]; Cook, [email protected]

Editorial & advertising enquiries: 31 Media Ltd, Three Tuns House, 109 Borough High Street, London SE1 1NL. Tel: +44 (0) 870 863 6930, Fax: +44 (0) 870 085 8837, Email: [email protected], Web: www.testmagazine.co.uk

Printed by Pensord, Tram Road, Pontllanfraith, Blackwood, NP12 2YA

© 2011 31 Media Limited. All rights reserved.

TEST Magazine is edited, designed, and published by 31 Media Limited. No part of TEST Magazine may be reproduced, transmitted, stored electronically, distributed, or copied, in whole or part without the prior written consent of the publisher. A reprint service is available.

Opinions expressed in this journal do not necessarily reflect those of the editor or TEST Magazine or its publisher, 31 Media Limited.

ISSN 2040-0160




Contents: April 2011

1 Leader column – A social revolution. Social networks as a revolutionary force for good?

4 News

6 Cover story – IT's invisible giant. The growing importance of the software testing market has not gone unnoticed among executives and investors alike. Chris Livesey takes a look at the possibilities, threats, challenges and opportunities for the testing industry.

12 Visual testing. Raspal Chima looks at visual testing as a way of addressing the communication problems that are commonly encountered between software developers and testers.

16 Watching the defectives. With its inaugural event under its belt, The Defectives, a sort of software testing social club, looks set to be the scourge of dodgy code and sloppy software across the UK. Matt Bailey reports.

20 Raising the standard for testing. With its recently announced collaboration with software testing training consultancy Pinta, the Institution of Engineering and Technology (IET) is targeting testers with its ICTTech award standard. TEST spoke to the Institution's ICTTech product manager, Jane Black.

24 In defence of regression testing. Regression testing has a bad reputation, but Gary Gilmore is here to set the record straight and rehabilitate this most maligned of methods.

28 The ten strongest influences on software product engineering in the last ten years. Raja Bavani takes a look at what he believes are the ten most significant factors affecting software development over the last decade.

32 The art of throw-away test automation. Test automation has failed to date simply because we cannot afford to throw it away when it is no longer relevant. To address this issue, George Wilson says that business agility requires disposable test assets.

36 Are you using live customer data outside of your production database? If you are using 'live' production data for testing you could be entering a world of pain. Richard Fine reports.

42 TEST directory

48 Last word – Angelina Samaroo. Angelina Samaroo steps in for Dave Whalen with some helpful training advice.


News

London Stock Exchange has problems with migration to Linux

Reports have suggested that the London Stock Exchange's migration to a new Linux-based trading platform has been causing it problems. The problems affect price feed data and have manifested in incorrect share pricing and also delays in critical end-of-day pricing.

The LSE shifted to the Millennium Exchange Suse Enterprise Linux-based platform for its UK cash markets in February this year. The platform has a chequered history. In November last year a trading pool based on the platform was knocked offline for two hours due to 'human error', causing the general Millennium Exchange shift to be postponed until this February.

ZDNet stated that the system and data vendors have had trouble interfacing with the new data feeds from the new platform, despite having had access to the platform's test environment and network connections for a year.

“This has the hallmarks of a poorly controlled project lacking in appropriate process and quality gates,” speculated Stephen Johnson, director at testing consultancy ROQ IT. “Large migration and integration projects are notoriously difficult and must be treated with respect. The only way to deliver these systems is to have clear process and communication and the QA team having a respected place at the top table. It’s amazing to think that either these issues weren’t discovered in a year of testing or that the decision was made to go live despite these show-stopping issues. This clearly shows the value of having an independent testing team or function that is willing to stand up and stop chaos prevailing.”

Outsource tester opens Chicago office

Philadelphia-headquartered AppLabs, which claims to be the world's largest software testing and quality management company, has announced the opening of a Chicago office serving AppLabs customers in the US Midwest region. According to the company, the office will support professionals including field territory managers, consultants and support staff.

“AppLabs is aggressively responding to the needs of our customer base in the United States,” said AppLabs senior vice president John Carmody. “As North America continues along its growth path, it is only natural that we also expand our physical footprint as well. Our new Chicago office provides convenient access to our customers in the Midwest and is strategically located in an area where we predict future business growth.”

Most of the company’s 150 customers and ten strategic alliances are US based. The US also accounts for more than 60 percent of the company’s revenues. Continuing with its regional ramp-up the company is also planning to open more offices in the US over the next three quarters to meet existing and future demand for its software testing and quality assurance services.


Royals choose Google to host wedding website

St James's Palace has chosen to use Google's computing infrastructure to power the only official royal wedding website, which went live in March at www.officialroyalwedding2011.org.

The site is hosted on Google App Engine, which lets developers run their web applications on the Google infrastructure, and according to Google is secure, easy to maintain and designed to handle large, global peaks in web traffic – and thus perfect for the happy occasion.

It will be used to update people on the latest news relating to the wedding, with links to other resources such as Twitter, YouTube, Flickr and Facebook.
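For readers unfamiliar with the platform, a minimal handler on the App Engine Python runtime of that era looked roughly like the sketch below. This is purely illustrative and is not code from the wedding site; the class name and the text it writes are invented for the example.

    # Minimal sketch of a Google App Engine (Python, circa 2011) request handler,
    # using the webapp framework shipped with the App Engine SDK of that period.
    from google.appengine.ext import webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class NewsPage(webapp.RequestHandler):
        def get(self):
            # App Engine routes HTTP GET requests for "/" here; scaling out to
            # handle traffic peaks is the platform's job, not the application's.
            self.response.headers['Content-Type'] = 'text/plain'
            self.response.out.write('Latest news about the wedding would appear here.')

    application = webapp.WSGIApplication([('/', NewsPage)], debug=True)

    def main():
        run_wsgi_app(application)

    if __name__ == '__main__':
        main()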

Stress testing partnership

Reflective Solutions has announced a new technology partnership agreement with Vetasi, a leading Enterprise Asset Management (EAM) consultancy.

Reflective will supply Vetasi with its flagship product, StressTester, which is used by major organisations to test that their Intranet and Extranet applications and their e-commerce Web sites are fully optimised and scalable to support their business requirements.

The agreement is a part of Vetasi’s strategic development plans aimed at further establishing its position as a leading IBM Maximo EAM consultancy service provider. This will include the launch of an integrated performance testing and application ‘tuning’ service for existing Maximo deployments based on Reflective’s StressTester tool.

Vetasi also plan to launch a managed application monitoring service based on Reflective's application availability and performance monitoring tool, Sentinel. This will provide Maximo users with real-time performance information designed to trigger an alert whenever a functional or performance problem is identified, before it impacts the application's users.

“We have been genuinely impressed with the ease of use of the Reflective tools and the depth of experience the company already has in the Maximo space,” comments Jim Mullan, principal technical consultant at Vetasi. “StressTester and Sentinel will help to give us an edge in the market and offer our customers highly valuable services that can deliver rapid ROI. This signals an exciting new phase in the development of our business.”


Centrix doubles QA and development workforces

Centrix Software, a leading provider of software for unified end-user computing, has announced that it has doubled its quality assurance and development teams to support what it says is increasing demand for its Centrix WorkSpace solutions in strategic desktop transformation, cloud and virtualisation initiatives.

The expansion includes the appointment of Craig Henderson as director of engineering who will oversee the company’s test, architecture, support and development teams. Prior to joining Centrix Software, Henderson held management positions at Rockwell Collins and IRIS Software, and most recently was director of product development at JDA Software, a leading international supply chain software company.

“Large organisations at the moment are faced by a perfect storm of changes: new ways of delivering services and managing IT are impacting established internal processes, while external economic forces are putting pressure on organisations to work in more flexible ways. IT decision-makers have to know more about the dynamics of their user environments to make the right choices, which is why we are seeing demand for our solutions in the market. With the influence of the Web, desktop virtualisation and cloud-based services, the move from desktop to end-user computing is well underway; we help organisations make the transition faster to more agile service delivery," commented Lisa Hammond, CEO at Centrix Software. “The expansion of our team, and the recruitment of more partners in the UK and Germany, is a great next step in establishing ourselves at the forefront of this new market opportunity.”


Ofqual blames flawed testing for incorrect A-level grading

Ofqual, the UK's examinations regulator, has blamed three IT problems with the marking system for thousands of students receiving incorrect GCSE and A-level marks last year. Ofqual pointed to issues with the way the examination board, the Assessment and Qualifications Alliance (AQA), dealt with project management, user acceptance testing (UAT), and software training for its onscreen marking system.

“Any regulatory body worth its salt should be involving themselves in the development process / lifecycle,” comments senior testing consultant Mark Shilling. “These entities are the guardians to products going live, so should help define the business benchmarks in success criteria for quality gates, as well as actively reviewing what is produced throughout to ensure this is adhered to. Instilling standards in a ‘reactive’ rather than ‘proactive’ manner seems more like a post mortem than a helping hand in achieving the goal. In summary, they’ve sneakily slipped the hangman’s noose around AQA’s necks rather than their own.”

“It looks like Ofqual has hit the nail square on the head here. It highlights where and what went wrong but I think you have to go back to the planning phase to where the decisions were made,” comments SQS senior consultant Darrell Roarty. “Someone obviously made these decisions. Surely Agile should have been employed. With more BA and End User Input, they would be closer to the product.”

Ciaran Butler, SQS test analyst adds: “It is up to the client to make clear what level of quality is required: I worked for Swisscom (in Switzerland) for five years and quality was expected and achieved. In the UK, most people do not expect quality, so poor quality is delivered. Most UK employees would be sacked by most quality Swiss companies in a couple of weeks (though in reality, most of them would not be employed by them in the first place).”


Back to the ’60s with virtualisation

A leading IT security expert claims that, despite all the media hype, virtualisation is actually not a new technology, and dates all the way back to the 1960s. Professor John Walker, member of the Security Advisory Group of ISACA's London chapter and CTO of Secure-Bastion, said that, although it is not a new technology, it has recently come to the forefront again and offers many benefits to the enterprise IT environment.

During an online presentation Professor Walker said that while virtualisation's benefits include reduced server sprawl and a quicker build time, there are clear security issues. As with any system or application configuration, he said, control is vital to security, and IT professionals should remember that this security principle applies to on-line and off-line images alike. They should also take care, he went on to say, to ensure that new builds are tracked and that, as with conventional systems and applications, virtualised environments suffer from vulnerabilities and need to be patched and fixed.

Despite the potential security headaches associated with virtual networks, Professor Walker said that VLANs have become a great security enabler for the enterprise and that VM environments are ideal platforms for IT testing. VM systems are also ideal tools for the mobile security tester, he went on to say, adding that this is because they support the running of multiple operating systems, multiple applications and multiple tools, “And if you break it, you just recopy the image,” he explained.

IT's invisible giant

The idea that a practice which has been part of the IT industry for decades could now be seen as one of its most important new growth areas may raise a few eyebrows. Yet the growing importance of the software testing market has not gone unnoticed among executives and investors alike. Chris Livesey presents his ‘state of the testing nation’.


In the fast-moving information technology industry, barely a year goes by without a new ‘next big thing’; a radical new discovery which will transform the way businesses conduct their everyday operations, while saving them a small fortune in the bargain. Although the truth is rarely so clear-cut, in recent years we have seen innovative technologies such as SOA, virtualisation and cloud computing generate significant hype before their business benefits were fully realised.

When compared to these innovations, the idea that a practice which has been part of the IT industry for decades could now be seen as one of its most important new growth areas may raise a few eyebrows. Yet the growing importance of the software testing market has not gone unnoticed among executives and investors alike. The past eighteen months has seen software testing buck the downward trend in M&A activity, with vendors looking to gain a foothold in fast-growing areas of an industry which, ten to fifteen years ago, would barely have been recognised as a discipline in its own right.

While testing as a practice dates back as far as software development itself, it is only recently that it has become recognised as a distinct expertise and an area which can provide true competitive advantage to a development team and the business as a whole. This change in perception, allied to a growing need for software testing in an increasingly applications-reliant world, has made testing, and the broader process of software quality, a growing concern.

And there is no reason why it shouldn’t be. The testing process is believed to consume between one third and one half of all software development budgets, so it is crucial that its efforts are a success. In an industry in which only 32 percent of projects are successful and half of all development efforts are wasted, there is little margin for error.

Most importantly of all, however, the growing importance of applications to modern society is driving the growth of testing. Increased levels of demand for applications, particularly Web 2.0 and online applications, have made their value to the business even greater. Yet they have also made the cost of their failure all the more tangible.

Every software application requires testing in one form or another, from a simple Web 2.0 widget through to a major enterprise system. With both consumers and businesses demanding smarter, faster and ever more sophisticated applications, and the cost of application failure becoming ever greater, the need for testing solutions and services which can ensure this is achieved is more pressing.

Testing is now big business. Testing software tools alone are worth an estimated $2 billion worldwide each year. These tools have, in turn, enabled the rapid growth of the testing services market, which one analyst has estimated to be worth in the region of $30 billion per year, rising to $46 billion in the next four years. This underlines why testing can legitimately be termed ‘IT’s invisible giant’.

Why test?
For a growing number of organisations today, their software is the first or only means of interaction with their customers. It is, therefore, essential that it works.

Most organisations have now realised that the effect of poor or insufficient testing can be disastrous. A 2002 study conducted by the US-based National Institute of Standards and Technology (NIST) calculated that software errors account for $59.5 billion worth of loss each year in the US alone. This, despite any improvements in software quality practices which may have taken place in the meantime, is a shocking statistic, and one which illustrates exactly why this is an issue. Indeed if the cost of software errors has increased in line with economic growth over the past eight years then we can estimate that its current cost to the US alone is somewhere in the region of $90 billion.
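The $90 billion figure above is a simple compound-growth extrapolation. As a rough illustration (the annual growth rate used here is an assumption chosen to reproduce the article's estimate, not a number from the article), the arithmetic is:

    # Back-of-the-envelope check of the extrapolation above (illustrative only).
    nist_2002_cost = 59.5e9    # NIST's 2002 estimate of the annual US cost of software errors
    assumed_growth = 0.053     # hypothetical annual growth rate, roughly in line with nominal GDP
    years = 8                  # 2002 to 2010

    estimated_2010_cost = nist_2002_cost * (1 + assumed_growth) ** years
    print(round(estimated_2010_cost / 1e9, 1))   # prints 89.9, i.e. roughly $90 billion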

But the cost of defective software is not only financial. The effects of poor testing now stretch way beyond the back office, to the boardroom and even to the brand.

Toyota’s recent recall of hundreds of thousands of its hybrid vehicles (such as the successful Prius model) due to a problem with braking is one such example. What was believed to be little more than a small software glitch not only left the car giant millions of dollars out of pocket, but also risked the brand’s reputation on a global scale. The effect this may have had on the brand’s long-term standing will not be truly understood for at least another decade.

Senior executives, too, are not immune to the side-effects of software failure. In 2010, the CFO of one of Britain’s most prominent travel companies felt compelled to offer his resignation, following an accounting error (caused by faulty software) which forced the company to write down over £100 million worth of sales. This problem, like Toyota’s, would likely have been detected had rigorous software testing procedures been in place.

The potential cost of IT failure is simply too high to be ignored. In the Internet age, applications are the primary way in which individuals, both within the organisation and outside it, interact with a business. Put in these terms, rolling out an application which has not been through adequate testing procedures would be equivalent to sending out an untrained salesperson to meet customers, or instantly promoting a newly-recruited graduate to the role of CEO. For a business to function, so must its software, which is why testing is growing in stature.

Who tests?
Software testing is no longer the domain purely of large organisations running major enterprise systems. From independent software vendors through to one-man-band developers, testing is now an essential element of the IT function.

With organisations of all sizes now looking to move their business critical systems onto cloud platforms, this will create new challenges for IT professionals, requiring testing at every stage of the development process. The adoption of Agile practices, which allow testing to occur alongside development, will also serve to increase the volume of tests being conducted. And with testing taking place earlier and more frequently within the development process, there is a greater need for test automation.

With so much testing taking place, it is little surprise that a number of the world’s largest technology companies are major players in the sector. The key players in testing services, which accounts for the lion’s share of testing revenues, include some of the world’s most established IT brands, but the market is increasingly being dominated by large outsourcing providers. India is the new home of testing services, and both this and the increased strategic importance of testing have meant that the typical profile of a tester is changing.

In the past, the common misconception was often that software testers were lower-skilled than other IT professionals such as developers. However, professional qualifications in testing are now commonplace and the majority of practitioners are now degree-educated. The emergence of a new generation of highly-skilled and educated testers across the globe is proof of the industry’s growing importance. With testing now being incorporated into agile development practices, quality is becoming a critical element of any software development effort.

Test where?
It is hard to begin any conversation about the geography of software testing without first noting two of its centres: Silicon Valley, California, and Bangalore, India. Yet, when discussing today’s testing market, one would be equally justified in mentioning emerging markets such as Poland, Egypt or Brazil.

Silicon Valley, home to the majority of the world’s largest technology companies, is the spiritual home of software, and as such has also nurtured the testing methods and tools which software required as it rose to the prominence it today enjoys. Bangalore, by contrast, is the IT outsourcing capital of the world – a five million-strong technology hub within a nation which currently produces over 400,000 technology graduates each year.

While estimates of India’s overall share of the testing market vary greatly, most analysts would now agree that it is now higher than that of the US. And the traditional balance of power in this relationship may also be shifting.

Since the turn of the century, testing vendors in developed nations have traditionally focused on strategic, value-added services and IP-rich testing tools, while labour-intensive manual testing has been outsourced to areas such as Bangalore. Now, however, with a number of years of experience and an increasingly skilled workforce behind them, India’s testing suppliers are beginning to climb the value chain. The subcontinent’s outsourcing giants are increasingly capable of offering sophisticated testing services which rival those of their American and European counterparts.

However, an increasingly diverse range of competitors is emerging, challenging the established leaders at both ends of the value chain. Perhaps inspired by the track record of India, China has made concerted efforts to establish itself as a major player within the technology industry over the past decade. A recent report estimated that the Chinese software industry already accounts for revenues in excess of $50 billion each year, with testing among the fastest-growing areas of this. As Indian vendors expand to challenge European and American rivals at the higher end of the value chain (and as average wages for professionals increase accordingly), China (and, to a lesser extent, Malaysia) may well emerge as a new hub for labour-intensive, outsourced testing services.

In addition, emerging nations across Central and Eastern Europe, North Africa and Latin America are successfully taking advantage of the trend towards ‘nearsourcing’ among businesses in Western Europe and North America. The Czech Republic and Poland both now attract BPO investment in excess of £500 million each year, while the likes of Mexico and Venezuela are becoming increasingly attractive outsourcing destinations for companies in the USA. Egypt, the Philippines, Tunisia and Bulgaria are also among a group of 14 ‘non-BRIC’ (Brazil, Russia, India, China) nations identified by the London School of Economics as potential hotbeds for BPO growth over the coming decades, and as a result are all areas where we would expect investment in testing services to increase.

Testing times ahead?
If there is one certainty to be gleaned from current trends in software, it is that testing and other areas of quality assurance will continue to grow in importance for the foreseeable future. As the potential impact of software failure increases, so too do the resources available to prevent such problems from occurring. This, in turn, is increasing levels of professionalism within the industry, making testing a more strategically-important and lucrative practice.

However, to suggest that testing will simply continue its current trajectory without any major changes to the landscape of the market would be foolhardy. The market as it now stands is virtually unrecognisable from that of ten or fifteen years ago, and it is reasonable to assume that the rate of progress over the coming decade will be even more rapid.

Here are what we believe will be five major trends in software testing between now and 2020:

1. Testing’s rightful place in the cloud
Cloud computing will be the single greatest influence on IT practices in the years ahead, and testing will, like every other facet of technology, be affected.

In addition to the obvious benefit of flexible pricing, the cloud model has a great deal to offer to the testing industry due to what is likely to be an exponential increase in demand for load testing. Cloud is a compelling option for companies conducting load testing, due to its ability to conduct short bursts of tests without requiring significant outlay on hardware or maintenance.

In a world where more and more organisations rely on applications (whether web-based or private), businesses require reassurance that these systems are durable enough to handle thousands of individuals using the application simultaneously and from multiple points of access. Companies who can offer the products and services required to do this look well-placed to prosper from the cloud age.

Testing-as-a-service is estimated to grow by over 33 percent each year between now and 2013, meaning that the market for cloud-based testing tools will reach over $700 million by 2013. This represents a huge opportunity for outsourcers, ISVs and software vendors alike, and shows just why many analysts have described cloud computing as the biggest step change in IT since the adoption of the Internet itself.

2. Testing skills shortage a likelihood
A 2010 survey found that almost three quarters of testing professionals in the UK felt that there was a skills shortage within the industry.

Gartner has estimated that, within non-software companies, the highest ratio of testers to developers is around 1:3, meaning that many companies may have four or five developers for every tester, or even more. When one considers that between a third and a half of the total cost of application development is accounted for by the testing process, this seems ominously low. Such discrepancies between demand and supply show why the testing stage often becomes a bottleneck in the software development process.

While automation tools are capable of reducing much of the tester’s workload, it is clear that software testing, as a growing area of the IT industry, will require more skilled professionals in the years ahead. While the growing status of the industry will no doubt help in attracting new graduates and school leavers into the profession, as with all skills shortages, this will not be solved overnight. Rather, it will require the co-operation of government, business and academia to identify the areas in which shortfalls are the highest and to then tailor curricula to meet these needs. While this shortfall is being addressed, responsibility for ensuring software quality will fall upon the shoulders of every stakeholder involved in the project, from analysts through to developers.

Nevertheless, working to improve the ratio of testers to developers, and identifying and tackling skills shortages in testing will be deciding factors in whether the sector, on the one hand, flourishes over the coming decade, or on the other, becomes stifled by a lack of available talent and an ongoing reputation as the major bottleneck in the development cycle.

3. An automated and continuous approach to testing
One trend which will undoubtedly shape the testing tools market over the coming years is that of test automation. While test automation tools have been available for a number of years, it is only comparatively recently that businesses have truly begun to appreciate the value which they can add to the development process.

The increased automation of testing supports a more ‘continuous’ approach to software quality. Previous modes of assuring quality tended to focus on the last mile of the development process, where testing would only commence once development was complete, leading to frequent delays and re-work. By automating testing processes, quality can be emphasised at the start of development and problems can be addressed before they become too difficult or costly to remove.

Automation tools eliminate much of the laborious nature of testing and also remove exposure to human error within the process. This has led some to suggest that increased adoption of such tools could eventually replace the majority of manual tests. This in turn would have a huge effect on the testing services market, much of which relies on an ongoing requirement for labour-intensive, manual tasks. However, in a survey conducted by Micro Focus, manual testing still ranked as a far higher priority among testers than automation, showing that there is still a long way to go before automation becomes the norm.

In reality, there will always be a need for both manual and automated testing, and that will be as true in 2020 as it is today. Agile development practices require higher levels of automation, meaning increased adoption of Agile will lead to a growth in demand for automation tools. However, increased demand for testing services as a whole will more than compensate for any reduction in manual testing required as a result of increased adoption of automation techniques.

4. Increased agility
Agile methods are becoming increasingly important in software development. Companies value their flexible and extremely effective procedures, which work even for large projects, enabling products to be completed early and subsequent adaptations to be made as well. Agile testing plays an important part in this: testing at an early stage, and in parallel with software development, ensures that the quality of the software satisfies requirements more closely.

Relevant test procedures can now be carried out earlier in the course of the project, meaning problems can be identified in good time and rectified accordingly. Combining prompt testing with automation will also lead to greater efficiency: the inaccuracies of manual processes can be eliminated and tests can be repeated.

The ability to test earlier in the development process also means that more testing can now be ‘requirements driven’. Aligning the testing and requirements processes more closely is yet another way of ensuring that software quality is built into the development process from the start, rather than being undertaken only once an application nears completion.

As Agile continues to grow as a practice, the ability to test throughout the development process becomes ever more essential, and testing tools will need to change in order to meet this demand.

5. Applications economy
While it is by no means a new trend, one process which will continue rapidly over the coming decade is the growing importance of applications to the businesses they serve.

Even more so than today, applications, be they web-based widgets or back office batch processing systems, will be the lifeblood of the business. As Internet adoption becomes more prevalent throughout the developing world, greater strain will be placed on online applications, and with this increased demand will come greater business value and risk of failure. Further, as applications become more complex they consume more resources and can lead to increased loads. They also become more difficult to test due to the complexity of simulating interactions like those via Web 2.0 applications.

The financial and reputational cost of application failures will continue to skyrocket. The need to stay available and functional will drive a greater understanding of peak demand times and software quality processes, meaning that website outages or system failures, while not altogether becoming a thing of the past, should not occur as frequently as today. Organisations simply must test to ensure that applications perform, even under massive peak loads.

The invisible giant
With much of the IT industry still struggling to overcome the economic malaise of recent years, testing is one of the few areas in enterprise technology which is bucking the trend and showing strong growth prospects. With a market size which will soon exceed $50 billion, and prevailing trends showing that demand in the sector shows no signs of slowing, the potential rewards for companies and individuals which can succeed in this space are huge. While it may never make front-page headlines or appear at the top of many board agendas, software testing is now, and will continue to be, IT’s invisible giant.

Chris Livesey, Vice President, Application Management and Quality, EMEA and Latin America, Micro Focus. www.microfocus.com



Visual testing

Software testing is a time-intensive process which is often overlooked during the planning of a software project. Yet this is a critical stage in ensuring product quality – and resolving hard-to-fix issues can easily turn an otherwise successful project into a disappointment. Here, Raspal Chima looks at visual testing as a way of addressing the communication problems that are commonly encountered between software developers and testers.


Professional software development companies understand the need for strict regimes of software testing and quality assurance to ensure a product is fit for purpose before it is sold.

There are a number of ways to go about implementing these processes – however, they can all be affected by problems that increase costs and reduce efficiency. Furthermore, market pressures can force a project to be developed with unskilled programmers or insufficient time or money, resulting in increased errors, particularly when designing complex software.

Common problems stem from a lack of communication between developers (charged with fixing faults) and testers (charged with identifying faults), and they are caused by two scenarios. The first is that, quite simply, information and descriptions of faults may get lost in translation when working in a multi-national team, or when outsourcing testing-related tasks to foreign countries. The second scenario, which is more common, is that the tester is not sufficiently familiar with the deployed technology to convey the details of the problem accurately. This is particularly likely in black-box testing, where testers of software are not required to have a thorough understanding of the software and are therefore unlikely to be as technically-minded as their developer colleagues.

Fault replication
Another problem with traditional methods of software testing is the replication of faults. In order to implement a fix for a software failure, the developer needs to confirm that it exists in the first place. This is done by mimicking the set of operations performed by the tester which led to the fault. It is a time-intensive process, and suffers from its own problems:
• What if the fault does not occur under the same circumstances for the developer?
• Does he assume the tester described the fault incorrectly?
• What if it’s a platform-dependent issue that affects the tester's system but not the developer's?

The replication of faults is not an absolute science, and the ambiguity and uncertainty it could bring will cost time and money to overcome.

Visual testing
The goal of a visual testing tool is to provide developers with the ability to examine what was happening at the point of software failure. Ideally, the testing tool should present the data in such a way that the developer can easily find the information he requires, and the information is expressed clearly. Visual representation is clearly the most effective way of conveying this information in practice, but finding a visual test tool that meets these requirements is one of the main problems in implementing this sort of methodology.

One solution is to use a screen recording tool designed specifically for software developers and testers. At its core, it’s built on the idea that showing someone a problem (or a test failure), rather than just describing it, greatly increases clarity and understanding.

Programs like BB TestAssistant enable a software tester to record the entire test process – capturing everything that occurs on their system in video format. These videos can be supplemented by real-time input via webcams (which appear as picture-in-a-picture) and audio commentary from microphones.

A screen recorder for developers provides solutions to two fundamental problems mentioned earlier:
• Quality of communication is increased dramatically because testers can show the problem (and the events leading up to it) to the developer, as opposed to just describing it.
• The need to replicate test failures will cease to exist in many cases. The developer will have all the evidence he requires of a test failure and can instead focus on the cause of the fault and how it should be fixed.

Visual testing tools provide useful features like QA system integration (complete with an open API) and parallel reviewing of system event logs and external log files. They should integrate fully with QA systems such as JIRA and Bugzilla. In other words, after recording a test, the software tester is able to use them to automatically create defect reports which comply with industry-standard QA system formats.

Of course, not everyone uses the same QA systems. For this reason, the integration API should be left open to make it easy for programmers to enable integration with other defect tracking systems.
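As a rough sketch of the kind of integration described above (the server URL, project key and credentials are invented for the example, and the exact REST endpoints and headers vary between trackers and versions), filing a defect with the recording attached might look something like this:

    # Hypothetical example: create a bug in a JIRA-style tracker and attach the
    # screen recording so the developer can replay the failure. Illustrative only.
    import requests

    TRACKER_URL = "https://tracker.example.com"   # hypothetical server
    AUTH = ("test.user", "secret")                # hypothetical credentials

    def file_defect_with_recording(summary, description, recording_path):
        # Create the issue itself.
        issue = requests.post(
            TRACKER_URL + "/rest/api/2/issue",
            json={"fields": {
                "project": {"key": "WEB"},        # assumed project key
                "summary": summary,
                "description": description,
                "issuetype": {"name": "Bug"},
            }},
            auth=AUTH,
        ).json()

        # Attach the recorded movie to the new issue.
        with open(recording_path, "rb") as movie:
            requests.post(
                TRACKER_URL + "/rest/api/2/issue/%s/attachments" % issue["key"],
                files={"file": movie},
                headers={"X-Atlassian-Token": "no-check"},
                auth=AUTH,
            )
        return issue["key"]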

Screen recorder-based visual test tools allow developers to review system event logs and application log files side by side with the video of the test. This means developers can draw from multiple sources of information relating to the software failure, such as:
• The video itself;
• Any extra feeds, such as the webcam;
• Annotations and audio commentary added to the movie by the tester;
• Properly synchronised event logs and log file information.

The inclusion of synchronised access to log files and system event logs means that the developer is able to analyse what's going on inside the program at the same time as viewing the tester's descriptions. This combination of low-level detail with high-level overview of the problem assists the developer in pinpointing the cause of the error quickly.
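As a simple sketch of the idea (not taken from any particular product; the log format, file name and recording start time are assumptions), synchronisation amounts to converting each log timestamp into an offset from the moment the recording started, so the developer can jump the video to the matching instant:

    # Illustrative only: map log entries onto the video timeline by timestamp.
    from datetime import datetime

    RECORDING_STARTED = datetime(2011, 4, 1, 14, 30, 0)   # hypothetical start of the recording
    LOG_TIMESTAMP = "%Y-%m-%d %H:%M:%S"                   # assumed log timestamp format

    def video_offsets(log_path):
        """Yield (seconds_into_video, message) pairs for each log line."""
        with open(log_path) as log:
            for line in log:
                stamp, message = line[:19], line[20:].rstrip()
                offset = (datetime.strptime(stamp, LOG_TIMESTAMP)
                          - RECORDING_STARTED).total_seconds()
                if offset >= 0:       # ignore events logged before the test began
                    yield offset, message

    # Example: print the log against the video timeline.
    # for seconds, msg in video_offsets("application.log"):
    #     print("%7.1f s  %s" % (seconds, msg))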

Agile methods
Visual testing is particularly well-suited for environments which deploy Agile methods in their development of software. Agile methods require greater communication between testers and developers, which plays to the strengths of visual testing for creating very detailed records of failures with very little effort.

Visual testing using a screen recorder tool is particularly well suited to Agile methods because of the improved communication between developer and tester via webcam and voice recording on videos, which encourages collaboration within small teams. In particular, it improves the effectiveness of ad-hoc testing.

Exploratory and ad hoc testing
Ad hoc and exploratory testing are important methodologies for checking software integrity, because they require less preparation time to implement, whilst important bugs can be found quickly.

In ad hoc testing, where testing takes place in an improvised, impromptu way, the ability of a test tool to visually record everything that occurs on a system becomes very valuable. This is equally the case with exploratory testing, which is a more structured approach to software testing and is usually carried out by skilled, independent testers.

Both of these test methods rely on an adaptable and learning approach to testing. For this reason, it is sometimes difficult to replicate a failure faithfully from memory and report it effectively.

Using a screen recording tool eliminates the need for testers to remember exactly what they did before they hit a bug, because the tool creates a permanent record of the test as it is performed. This allows the tester to continue testing in a more informal, creative way.

Acceptance and usability testing
A screen recorder tool is also ideal for customer acceptance and usability testing, because it can be used by anyone involved in the development process. For the customer, it becomes easy to provide detailed bug reports and feedback, and for usability testers, the tool can record user actions on screen, as well as their voice and image, to provide a complete picture of the tester’s experience.

Examples of use include:
• By developers to demonstrate current functionality of the product to the customer representative;
• By dedicated software testers to produce thorough test reports on the product, highlighting potential failures for developers to fix;
• By the customer representative to test the product according to their expectations and also to convey additional, desired functionality: “it would be helpful if the program did x when I clicked y”.

BB TestAssistant, for example, also complements version control software like Subversion, in that it can be used to document development progress by showing how the product performed and behaved on a given date or on completion of a given iteration or sprint.


Raspal Chima, Marketing Manager, Blueberry Consultants. www.bbconsult.co.uk

Example of a visual testing tool: BB TestAssistant

• Event log: View log files, the Windows event log, and user input (keyboard and mouse) side by side with the movie to show exactly what was happening at any instant in time.

• QA system integration: It integrates fully with current industry-standard QA systems such as JIRA and Bugzilla. The integration API is open, so it's easy to add support for a preferred QA system as well.

• Record everything: Everything the tester sees on their screen will be recorded, even complex Windows Aero animations.

• Editing: Once the video has been recorded, you can edit it from within BB TestAssistant to include annotations and audio commentary (which can also be supplied in real-time). You also have access to other video editing options such as clipping, cropping and quality adjustment.

• Skip to edits: Each edit made by the tester acts as an anchor within the video that the developer can choose to skip to. This is particularly useful in long videos, or where the developer knows exactly which part he's looking for.

• Projects: This new UI feature allows users to define projects. A project holds common configuration settings related to a single application or set of applications which a tester is working with.

• Remove inactive periods: This feature allows the user to identify and remove periods of inactivity within a movie.

• Hide other processes: This security / data protection feature allows the user to restrict BB TestAssistant to recording only a specific set of processes.

• Export to Word: This feature allows a user to mark important points in a movie and add notes, then automatically produce a Word document containing screenshots of all these points, together with the notes.

Summary of benefits
• Visual testing, using a screen recorder program, negates the requirement for QA staff to have special training or skills in order to create detailed movies that show defects;
• It improves tester-developer communication by enabling the problem to be shown quickly and clearly, so simplifying the testing process;
• It is applicable to different kinds of testing:
Usability testing – where you can see exactly how your user intuitively interacts with the program via the program’s support for webcams, microphones and system event logs.
Automated testing – leave the program recording an automated test session to obtain a permanent record of exactly what happened while you were away.
• Its use in projects complements Agile methods of software development by allowing you to involve customer representatives in a more natural fashion.

Watching the Defectives

With its inaugural event under its belt, The Defectives, a sort of software testing social club, looks set to be the scourge of dodgy code and sloppy software in the UK. Matt Bailey reports.

The Defectives is a software testing community that aims to bring testing professionals together for fun and social events where they can work as a team and put their skills to the test. The community is both on-line (through LinkedIn) and off-line through regular face-to-face events where testers can share testing knowledge and experiences that could benefit everyone involved. The events are based around testing real applications and prizes are awarded for successful outcomes, for example, most defects, best or unique defects etc.

Anyone with an interest in testing can become a Defective. You could be a professional tester who wants to learn some new testing skills, a project manager who wants to find out what testing is all about or someone looking to meet new testers. Many organisations field teams, bringing a competitive edge to the events and enabling the fun, celebration (or commiserations) to carry on after the event.


Competitive testing
At a typical event, teams will have one hour to find as many defects as they can in an application which is only revealed to the participants on the night. The approach and techniques used to find the defects is up to the individual teams. Following the event, there will be the opportunity to share with others the techniques used to test the application and find out just how the winners did it.

“We hope these events will enable testers to have a bit of fun testing, outside of the work place pressures, in a relaxed, collaborative environment,” comments Stephen Johnson, founding director of the event’s organiser ROQ IT. “It's also a great chance to catch up with colleagues and meet some new testing gurus.”

The background is very simple, “We started The Defectives as we recognised there was no such community for testing professionals to get together and share ideas / working practices etc,” explains Johnson. “As such, we really wanted to create this, but recognised we needed to add a twist to make it more interesting – hence we thought we would make it a competition (with prizes) focused around testing. The detail really evolved from that base principle.

“It is all currently driven and sponsored by ROQ IT, although in time, we hope that the network group will start to generate its own ideas and discussions etc, so that it can really take off. We will continue to have regular (quarterly) events and we have even had a blue chip offer to host our event in October this year, although as we are just finalising the details we can’t name names just yet.”

The event
The inaugural Defectives event took place on February 15 this year and was, according to the organisers, a huge success; although for the three supermarkets chosen as the event’s subjects and unknowing guinea pigs, there was a little less glory.

There were five professional testing teams tasked with testing the functionality of three market-leading supermarket websites for an hour. The results were staggering, ranging from a host of simple usability issues and rogue text – spookily, the word ‘tester’ appeared in the middle of web pages – to some fundamental security issues where passwords were given away over email.

“In the first ten minutes there were very few defects found,” says Johnson, “but after that we were receiving well over two a minute from the teams. We chose these sites as we thought they had enough depth to find the odd defect and we are genuinely surprised by the amount of issues found in such a short space of time. The event was a success for the testing industry and testing professionals, which is what we intended it to be. We plan to run two or three more events similar to this throughout 2011, and we look forward to reporting back on the success of these.”

The testing professionals came from organisations like Money Supermarket, iSoft, Provident Financial, Co-operative Group, Martin Dawes, CSC, HDNL and Shop Direct Group amongst other large companies. They had no prior knowledge of what they would have to test until it was revealed on the night of the event.

“What this event shows is that having knowledge of the retail industry had no bearing on being able to find defects – good testers with an eye for detail will always find the weaknesses in any system,” observed testing consultant Toby Sinclair. “All the teams had a slightly different approach but they all managed to find a route towards the mass of defects.”

The grocery sector
Interest in the supermarket industry sites was driven by the food industry’s trade body, IGD, which states that the UK online grocery retail sector is now worth £4.8 billion. It predicts that the value of the online grocery market will nearly double in size by 2014, with a forecast annual growth rate more than three times faster than that of the wider grocery sector. However, evidence also suggests that online shoppers show little brand loyalty (Evolution Group 2010). With the leading supermarkets unable to rely upon their branding alone, they must work hard to deliver a great online customer experience and maintain a stake in this lucrative market.

A Forrester report from 2009 showed that 79 percent of users will drop a site after a dissatisfying visit. As such, supermarkets invest in their websites to produce an attractive, high performance system that delivers on speed, efficiency and ease of use. The Defectives’ test was to see if they actually get this route to market right, and if not, what challenges they present for online shoppers.

Northern exposure
The event was held in Warrington and attracted over 30 independent IT professionals, representing a wide spectrum of roles, each with differing technical backgrounds and levels of testing experience. They all relished the challenge presented to them: to find as many defects as possible in a 60-minute testing period with no preparation or prior knowledge of the chosen websites.

Given the limited time, the testers took an exploratory approach to typical browsing, shopping basket and transactional activities. They used their experience to focus on areas of such sites which often cause problems. A total of 107 defects were detected and logged.

It is perhaps surprising that such a high number of defects was found in such a short amount of time, especially as 17 of them were of high severity. These included navigation faults and broken links which prevented transactions from being completed. The Defectives also found a number of problems with store locators, including one which showed that the closest store was in the USA.

One issue encountered was that after expressing interest in applying for a job at one of the supermarkets in question, a screen was displayed stating the supermarket would be in touch, however it did not ask for contact details. Perhaps they’d had enough of The Defectives by that point!

Perhaps worryingly, most of these systems are provided by proven ecommerce platforms, which each supermarket configures to its own requirements and which should therefore be fit for purpose; but this result shows that this is not the case.

Perfection is too much to ask for – and anyway, testers would be out of a job in a perfect world – but the results from the event add weight to the importance of releasing high quality applications, especially when the online retail market has very low brand loyalty. “Your precious brand is being compromised by poor quality” is the Defectives’ message to the supermarkets.

The next event is due to take place in Leeds in May. For more information please email: [email protected]

Matt Bailey
Editor
31 Media Limited
www.31media.co.uk


With its recently announced collaboration with software testing training consultancy Pinta, the Institution of Engineering and Technology (IET) is targeting testers with its ICTTech award standard. TEST spoke to the Institution’s ICTTech product manager, Jane Black.

Raising the standard for testing


The Institution of Engineering and Technology is a professional body for those working in engineering and technology in the United Kingdom and worldwide. It is the largest professional society for engineers and technologists in Europe with over 150,000 members in 127 countries around the world, making it a truly global organisation. Its aim is to be a professional home for life for its members, supporting them throughout their careers.

The IET has teams of experts looking at issues such as energy, transport, ICT, design and production, education, and the built environment. Recently its attention has been drawn to the increasingly important discipline of software testing. It is collaborating with software testing training consultancy Pinta on targeting its ICT Technician (ICTTech) award standard for those involved in the testing industry.

In the wider engineering and technology world, the IET assists the UK Parliament and government in making public policy and it is helping to plug the shortage of future engineers and technologists by working in schools to show young people the benefits of science, technology, engineering and mathematics (STEM) careers. It provides around £200,000 every year in awards and scholarships and gives undergraduates and post-graduates scholarships of up to £10,000. It grants Chartered Engineer, Incorporated Engineer, ICT Technician and Engineering Technician designations.

Not reliant on taxpayers’ money, the IET has an annual turnover of £55 million and most of its surplus is recycled back into its products and services. Bringing together engineering and technology expertise in an environment that is non-partisan and not-for-profit, the IET is based purely on the best evidence and research.

Taking it to the testers
As ICTTech Product Manager, Jane Black has responsibility at the IET for rolling the award out to testers. Like many people who find themselves involved in IT she didn’t start out in the field; in fact, she graduated in Biological Sciences and Education. Working with the IET for almost nine years, she started in the Academic Accreditation Department, going to UK and international universities as part of a team to give an independent review and international benchmark of the engineering and technology programmes (including IT). It was during this time that she started to gain an understanding of IT education.

In 2009, the IET along with the Engineering Council launched its then new professional award ICTTech aimed at individual practitioners supporting the users of IT systems and applications. Black started to become involved in the ICTTech award when she joined the Global Operations department focusing on developing the Indian and Chinese markets, including the professional awards. Her main focus now is supporting and developing the ICTTech award and the Institution’s interaction with the IT industry.

The IET helped to develop the SFIA framework, giving a common reference for the ICT industry, and it continues to work closely with the Foundation Committee as it adopts the framework to meet the needs of industry. As ICTTech is linked to the SFIA framework at level 3, Black has also been trained in implementing and using the SFIA framework in the work environment.

TEST: What are the origins of the organisation; how did it start and develop; how has it grown and how is it structured?

Jane Black: The IET was formed in 1871 as a membership organisation for telegraph engineers. It has been through many changes in the last 100 years or so. It now has around 150,000 members around the world in 127 countries, in the information technology and engineering sectors. A typical member is an engineering or IT practitioner, which could cover anything from a humble technician right the way up to Nobel Prize winners.

The organisation as we know it today became the IET in 2006. Our focus is on the sharing of knowledge to support the engineering and technology sectors, including IT. Our products include everything from specialist books, periodicals and publications, and an extensive programme of technical talks on a range of subjects, to a database for research. Our focus is firmly on sharing knowledge.

We also support professional development and run courses on subjects from project management to presentation skills. We support companies with graduate development schemes and go into universities to support their technology programmes. We also go into schools and colleges, offering information and running competitions. We organise and run four professional awards programmes, plus a whole lot more.

We see the IET as a professional home for life, offering organisational support and development for engineering and technology professionals from start to finish, through all the different stages. We aim to offer help and support throughout our members’ careers.

We have offices in Beijing and Hong Kong in China, Bangalore in India, New Jersey in the USA as well as in Stevenage here in the UK. The IET aims to support the development of people in the engineering and technology sectors – and the individuals and companies who interact throughout that sector across the globe, from childhood – supporting teachers in schools – through to universities with tools set up to help on courses and into careers and occupations.

TEST: What range of products and services does the company offer?

JB: In the testing arena, the IET offers the ICTTech, a professional qualification aimed at practitioners and users in the IT field. The Engineering Council sets the standards which we are licensed to deliver, but within this broad IT category we are developing a version specifically for the software testing sector which is more applicable for this group’s specific needs – and specialist software testing training consultancy Pinta is helping us to deliver this.

The knowledge-sharing products we provide are also getting more specialised in the software testing area. As testing becomes more important and more information comes out of the sector, we are seeing more interest in, and becoming more focused on, software testing.

The IT industry as a whole is at a very interesting point in its development; it is becoming a global industry and IT issues are crossing over into the mainstream media every day. As never before, there is now a need for IT practitioners to be seen as a professional workforce – a visible professional group across the globe – and the ability to benchmark knowledge and experience is crucial to this process. We need to know that those working in, say, Bangalore have the same standards and abilities as those working at the same level in the US, UK or China.

Because testing is such a specialist field it can lose its voice in the wider IT community. Thanks to the IET’s involvement, the ICTTech qualification is now more accessible for testers. We are addressing the needs of that industry and their need to say that although they have a highly specialised role within IT, it is an incredibly important role – saving money, reputations, and sometimes, even lives.

TEST: Does the organisation have any specialisations within the software testing industry?

JB: As an organisation, we are not specifically dedicated to software testing but it is covered under the ICTTech umbrella. But understanding how testing fits into the larger IT picture is valuable.

TEST: Who are the organisation’s main customers today and in the future?

JB: The ICTTech standard is for individuals – to give them the chance to demonstrate they have the knowledge and experience of working in specific areas; but it is also suitable for companies and organisations. If they can say they have professionally registered staff and members, the benefit is there for the company, but it is really about individual competence. Our customers are individuals and also companies.

Beyond the ICTTech standard, we would like to create a version of Incorporated Engineer and Chartered Engineer standards specifically for software testers. The standards are written generically, so those working in the industry can apply their skills and knowledge to meet them. We put it in a context where those in the industry can understand and relate to it.

Jane Black
ICTTech product manager
IET
www.theiet.org

But we are starting off with ICTTech at the practitioner level in the UK and rolling it out to India, which is also a very important geographical area. It is a means of demonstrating that your workforce is at the right standard, and I predict that people using outsourced testing expertise will start to demand professional registration from these contractors and independent benchmarking for their staff. And the organisations doing the contracting should also be professionally registered themselves, to lead the way and set the example.

TEST: What is the organisation’s commitment to corporate social responsibility, including green issues?

JB: An important part of being a professional engineer is to sign a code of conduct stressing commitment to reducing your impact on the environment. Engineers have to demonstrate their personal responsibility for their carbon footprint. They should be considering the impact of everything they do, even down to switching off a monitor at the end of the day or following the industry standards and codes. These are the things we will be looking for; making sure they are aware of them and abide by them too.

TEST: What is your view of the current state of the testing industry and how will the recent global economic turbulence affect it? What are the challenges and the opportunities?

JB: We have noticed that the need and demand for professional registration increases in hard times because people need that extra advantage, something to put them a step ahead of the competition, to succeed in an era of cuts and redundancies.

Also in this era, the software development industry really cannot afford to get things wrong, and this fact too will fuel a drive for greater professionalism.

TEST: What are the future plans for the organisation?

JB: As I mentioned before, we hope to move from the ICTTech standard to assess whether there is a need for the Incorporated Engineer and Chartered Engineer standards in the software testing sector. But we are taking this a step at a time and making sure it is demanded and wanted by the industry and that we have the resources to do it properly. We are also always looking to develop our partnerships with companies and organisations, including large offshore testers.

TEST: Is there anything else you would like to add?JB: Benchmarking on a wide global stage is increasingly important and this is where we fit in. I believe that the IT industry, including the software testing sector, is at an interesting point in its development. As companies become more global the need to create a more visibly professional work force across the globe will become more important and the ability to benchmark their knowledge and experience to their clients and the wider public and uphold professional and ethical standards becomes ever more important.

TEST: Jane Black, thank you very much.

The IET at work

• The world was introduced to the idea of fibre optics through the IET. Fibre optics are now the backbone of the internet and mobile phone networks.

• Every qualified electrician in the UK uses the IET wiring regulations.

• Right now IET members are working on major projects in China, India and the UK – including the 2012 Olympics.

• The institution is at the forefront of thinking on the UK’s future smart grid system.

• Its volunteers work in hundreds of schools across the UK helping young people find out how exciting it is to work in technology or engineering.

• The IET has the world’s largest tool for research and discovery in physics and engineering, called ‘Inspec’, with over 11 million abstracts.


Regression testing has a bad reputation, but Gary Gilmore is here to set the record straight and rehabilitate this most maligned of methods.

In defence of regression testing

Usually relegated to the end of a typical system test phase, regression testing is often viewed as a relatively worthless activity, yielding few defects, which threatens those ‘drop dead’ go-live dates while, at the same time, tying up test resources that would surely be better utilised if they simply concentrated on finding defects in the new code. Regression tends to have a bad reputation amongst project managers because the testing community has not been able to enumerate its benefits and express the true return on investment that good regression testing can deliver. Compounding this is the fact that the actual creation of regression tests where none currently exist is often seen as a large time and cost overhead with no significant, and certainly not immediate, return on investment.

Additionally, where regression testing does exist, the approach to test suite creation and execution can be inconsistent and haphazard across an organisation, without a real focus being applied to the objective of effectively addressing business risk. Furthermore, ongoing investment is not considered for the key tasks of maintaining and updating regression suites, with the result that they are quickly out of date (new code rapidly becomes old code, after all) and execution unwittingly becomes a waste of time and effort. At the opposite end of the scale, non-prioritised regression suites may actually become overly large and cumbersome and simply can’t be executed in the time available.

Some of these issues amount to an incomplete understanding of the benefits of properly managed regression testing – an understanding that, unfortunately, is not restricted to the ‘non-testing’ disciplines. Even where the benefits are fully understood, organisations may hold little enthusiasm for regression testing and therefore may not apply the level of planning, preparation, scrutiny and rigour that it deserves.

To a large extent these are the issues that are most easily solved, and once a true understanding of the benefit of properly applied regression testing has been established, remedial action can be put into effect to optimise project lifecycles and the testing effort being applied.

The problem with regression testing
Some of the other challenges associated with regression testing point to environments where a short to medium-term focus is the order of the day. Regardless of any high-level corporate vision, many organisations are driven by a project-to-project ‘survival’ strategy. The common obstacles of lack of time, resource and budget, allied to a project lifecycle that tends to end up replicating the much-maligned ‘waterfall’ model – despite any buzz-word models being referred to at the time of project initiation – ultimately mean that testing is always going to be the area squeezed the hardest. Regression testing is usually the main area of testing to suffer when projects are quickly running out of time, with a blind eye turned to the negative impact and cost of failure in the live environment, especially when there are other projects queuing up to be dealt with.

There can also be a lack of a true understanding of testing and acceptance of the benefits of investing early to save later. I’m sure we are all familiar with the universal law of software development and testing: the detection and fixing of a fault, or defect, before it is implemented is cheaper than finding and fixing the defect in the live environment – no argument there – but how do we ensure this becomes a reality and where does regression testing actually fit in with this? Is it even important?

Regression rehabilitated
Although the occurrence of defects in regression testing may indeed be considerably less when compared to other areas of testing, the defects are often of a high severity and can have significant consequences in terms of financial loss, cost to reputation, the breaching of regulatory or legal compliance or even Health and Safety issues. From a user perspective, and I refer to both the internal and external user communities, one of the biggest ‘pains’ is to discover that previously trusted and relied on functionality suddenly fails for no apparent reason. From feedback, this is one of the main reasons that businesses tend to have a poor opinion of their technology counterparts and, depending on the nature of your business, such failures may have a major impact on your organisation. Well managed regression testing, at all levels, is the one sure way of helping to prevent these types of scenario from happening.

As mentioned earlier, regression testing is traditionally viewed as a distinct testing phase in its own right, often only performed during system testing and usually at the very end, time allowing. My philosophy is that regression testing is a valid form of testing to be estimated, planned for and performed across any and all test levels and phases, including static testing. A key aim of testing should be to eliminate and prevent, as much as possible, the introduction of errors which will later manifest as code defects and live failures. Regression testing should be fundamental to this process in that it provides confidence that existing documentation, code and functionality has not been adversely impacted by changes and fixes. The concept of ‘early testing’ does appear to be making a bit of a comeback, and about time too, but ‘early regression testing’ must be included in this philosophy.

There are no doubt organisations that do have a well-managed, robust testing methodology where regression testing is factored in, estimated and actually performed. But what about ‘business as usual’ or tactical initiatives? How many times have you witnessed an untested live ‘fix’ being implemented at any stage in a project before, during or after your live implementation? Does your organisation have a business as usual testing policy? Is regression factored into this? As we all realise, the introduction of any new code may result in failures in existing code and regression testing really should be performed to provide users with confidence that nothing has been broken or is going to break.

An excellent training mechanism
What about the mundane aspect of regression testing? It is certainly a mistake to assign experienced testers to never-ending cycles of regression testing. An often unrecognised but important benefit of a well maintained regression pack is the value it holds as a reference tool for new and inexperienced testers and others unfamiliar with system functionality.

Under the management of experienced testing professionals, it provides an excellent training mechanism for new testers, developers, analysts and business testers, experienced or otherwise. It can highlight key elements of the functionality to those unfamiliar with it, highlight appropriate testing methodologies to employ for each functional area, and can certainly save significant time and money when bringing new testers up to speed. From personal experience, it means your new recruits will not only immediately perform an important function but will also get to see your test scripting and documentation standards and learn the functionality at the same time.

There is also the test automation aspect which, particularly for those of a technical mind-set, is certainly not a mundane activity. It is key that any test automation approach is return-on-investment focused and geared to preventing and avoiding the escalating costs all too common to many test automation programmes. However, effectively targeted, prioritised and managed test automation has been proven to very quickly realise major return on investment and find defects, despite some of the negative press automation has seen over the last few years.

To summarise, regression testing is a major tool in the tester’s tool bag. It goes a long way towards preventing the introduction of defects and improving the quality of both strategic and tactical deliveries which, in turn, provide the business with renewed levels of confidence in technology deliveries. There is certainly a need for re-education within many organisations, and test teams can and should take a lead in ensuring the introduction of appropriate regression testing across all test levels, utilising sound test techniques such as risk-based testing, which will eliminate wasted time and effort and ensure the most important functionality has been tested the hardest. Statistics gathered during regular, structured retesting of previously ‘working’ functionality form compelling justification to further fund and resource regression activity and testing in general.
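The risk-based approach mentioned above can be illustrated with a small, purely hypothetical sketch: score each regression test by the likelihood of a failure and the business impact if it fails, then run the highest-scoring tests that fit into the execution window. The data structure, weightings and test names below are invented for illustration and are not taken from any particular organisation’s practice.

# Minimal sketch of risk-based regression prioritisation (invented figures).
# risk score = likelihood of failure x business impact; run the riskiest tests
# that fit into the time available.
def prioritise(tests, time_budget_minutes):
    """tests: list of dicts with 'name', 'likelihood' (1-5), 'impact' (1-5), 'minutes'."""
    ranked = sorted(tests, key=lambda t: t["likelihood"] * t["impact"], reverse=True)
    selected, used = [], 0
    for test in ranked:
        if used + test["minutes"] <= time_budget_minutes:
            selected.append(test["name"])
            used += test["minutes"]
    return selected

suite = [
    {"name": "checkout_payment", "likelihood": 4, "impact": 5, "minutes": 30},
    {"name": "store_locator",    "likelihood": 3, "impact": 2, "minutes": 10},
    {"name": "profile_settings", "likelihood": 2, "impact": 2, "minutes": 15},
]
print(prioritise(suite, time_budget_minutes=45))  # the riskiest tests that fit the window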

Gary Gilmore
Principal consultant
Edge Testing Solutions
www.edgetesting.co.uk

[House advertising pages: reproductions of recent TEST covers – Volume 2, Issue 4 (December 2010), Volume 3, Issue 1 (February 2011) and Volume 3, Issue 2 (April 2011) – alongside a subscription notice: Subscribe to TEST free at www.testmagazine.co.uk; published by 31 Media Ltd (www.31media.co.uk), telephone +44 (0) 870 863 6930, facsimile +44 (0) 870 085 8837, email [email protected].]

Raja Bavani takes a look at what he believes are the ten most significant factors affecting software development over the last decade.

The ten strongest influences on software product engineering in the last ten years

The challenges and predictions that preoccupied the software engineering community at the start of the last decade, especially the Y2K problem, the dot com bubble, and the events that followed the bubble burst, had a significant impact on the software product engineering (SPE) community.

Many start-ups had to shut down because of lack of both business growth and additional funding from investors. While it is a fact that the confidence and hope of independent software vendors (ISVs) and outsourced product development (OPD) organisations thrived right after the telecom burst, several other noteworthy socio-economic and political factors challenged the software industry. In spite of all these challenges, during the past decade software product engineering witnessed several influencing factors. In my opinion, here are the ten most important influences that have transformed software product engineering:

Growth of e-commerce and online applications
Although the dot com boom of 2000 resulted in an immediate crash landing, or dot com burst, within the next few years technology became a core corporate strategy for businesses to remain competitive in the Internet economy. Investments happened on two key avenues: 1) renewal or upgrade of IT infrastructure, and 2) venture funding of start-ups that pioneered Web 2.0.

Meanwhile the growth in ecommerce and web technologies witnessed consolidations and mergers of several big players. Besides, security standards such as PCI DSS (Payment Card Industry Data Security Standard) were enforced by regulatory authorities, and the cost of conformance to such standards was valued by CXOs as a way to ensure regulatory compliance, customer satisfaction and retention.

Specifically in the case of ISVs, online software distribution, upgrade and technical support reduced the cycle time of physical distribution and telephonic tech support. The popularity of websites and ecommerce businesses came to rely on the number of user visits per day. Indeed, the worldwide-web-related buzz, aspirations and predictions that engulfed the industry during the mid ’90s transpired into viable business opportunities during the past decade and influenced SPE on a large scale.

Service orientation and new business models
During the first five years after 2000, there was a significant spurt in the number of application service providers (ASPs). Subsequently, a wide variety of web-based software products hosted on central servers became popular for simultaneous use by multiple corporate entities. This paradigm also enabled a virtual marketplace for all categories of users, such as vendors, suppliers, organisers and end users.

This led to new business models in industries such as hospitality, travel and transportation, internet media distribution, realty, finance and banking. Several major players emerged in these domains. Paradigms such as client-server architecture and three-tier architecture culminated in n-tier, component-based, layered, service-oriented architectures.

This transformation brought in new challenges to ISVs in terms of providing products that are scalable, available, reliable, robust and accessible to international customers. Also, product compatibility, integration, internationalisation and localisation became as important as scalability and performance.

Collaboration tools
While it is a fact that email became a dominant form of communication by 2000, lots of innovations also happened in the area of collaboration. Collaboration tools for all types of users came into play. These include tools that enabled forums, surveys, networking, etc.

With the dawn of Web 2.0, collaboration on the internet took several new forms such as video sharing, business networking, social networking, etc. Silicon Valley, the hotbed of technology that went through the pains of the dot com burst, flourished with companies that introduced products with themes around sharing and collaboration. Collaboration on the internet became very popular and it matured to encompass multimedia.

Agile software development
The success of Agile methodologies such as extreme programming, SCRUM, dynamic systems development method (DSDM), adaptive software development, Crystal, feature-driven development, pragmatic programming, etc, was evident among practitioners by the end of 2000.

The plethora of methodologies and buzz words related to this evolution had been waiting for an event of convergence. This event happened during February 2001 when 17 methodology experts convened at ‘The Lodge’ at Snowbird Ski Resort in the Wasatch mountains of Utah and defined the ‘Agile Manifesto’ and ‘Agile Principles’. This invigorated the adoption of Agile, and terms such as ‘time-box’, ‘sustainable pace’, ‘early and frequent deliveries’, ‘working software as a true measure of progress’, ‘reflection or retrospective’, and ‘distributed agile’ became more popular than ever before.

Several conferences, workshops and events got organised on Agile to evangelise and propagate its success. Lean software development and Kanban spiced up this evolution and got adopted as best practices in Agile methodologies.

What next? When I put this question to one of the Agile experts, he commented, “I always look at the present, the existing practices and make them better. Predicting future is something that I don’t do.” Indeed a modest response!

Modern software engineering (or modern software development)
In spite of all the influences mentioned, a significant number of organisations still execute projects in traditional ways. It works for them. However, a set of engineering best practices came into play and spruced up traditional software engineering. These practices include configuration management, automated builds, continuous integration, and the application of IDEs and tools such as static analysers.

Iterative and incremental development transformed the definition of analysis, design, coding, and testing. All these became iteration activities rather than project phases. Object orientation, design patterns and reusable components were some of the noteworthy hot topics that impacted the industry. This trend continued and made a positive impact on web development as well as other forms of engineering activity. Unified processes, model-driven development, domain-driven development, business-driven development and aspect-oriented programming are several other approaches to software development that became popular.

All these put together gave birth to modern software engineering (or modern software development), which transformed the mindset that treated software development simply as a coding venture. During this era practitioners realised that software engineering is more than just coding. Software development became an iterative or incremental process made up of several stages, with methodologies, tools, modelling languages, and techniques. Also, the importance of an ‘architecture-first’ approach in mission-critical projects was widely accepted as a best practice and techniques such as ‘architecture prototyping’ became popular in such projects.

Test automation
Test automation, once an exclusive competency of very few ISVs, spread like wildfire from the beginning of 2000. While Agile adoption promoted automation, there were several other aspects that motivated practitioners to reap the benefits of automation.

Unit testing, an area that was unexplored until late ’90s, got strengthened with the advent of frameworks such as JUnit and NUnit. Automated unit tests enabled test-driven development. The level of automation grew in all aspects of testing from test data management to database testing and data migration testing.
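As a concrete illustration of the xUnit style the article refers to, here is what a small automated unit test looks like in Python’s built-in unittest module (JUnit and NUnit follow the same pattern in Java and .NET). In test-driven development a test like this is written first, fails, and then drives the implementation. The function under test is invented purely for the example.

# Hypothetical example of an xUnit-style automated unit test.
import unittest

def apply_discount(price, percent):
    """Function under test: return the price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTests(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_invalid_percentage_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()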

Business intelligence
Support for data warehousing, data mining and business intelligence increased with the release of advanced versions of hardware and software to support the processing of large volumes of raw data. Tools and frameworks for business intelligence were on the shopping lists of many CIOs.

This trend impacted ISVs as well. Enterprise business application product suites incorporated BI modules or features. Paradigms such as data analytics and knowledge services grew in popularity and became business strategies in the internet economy.

Global software development
Global software development (GSD) involves software engineering projects executed with virtual teams from different time zones and diverse cultures. Over the past decade GSD has become the norm, influenced by several factors such as optimal costs, the availability of a skilled pool of resources and globalisation trends such as mergers and acquisitions.

The vision of GSD for any given project is to deliver work products of high quality, on schedule, by engaging geographically distributed teams that define the right architecture for distributed development and follow the right processes and tools for coordination and communication. There are several challenges encountered in areas such as requirements engineering, change management and project management, to name a few. However, during the past decade GSD became a subject of interest among practitioners across geographies. GSD influenced ISVs in a significant way, both in leveraging distributed teams across time zones and in setting up globally distributed development centres to service local customers.

Open source movement
By the late ’90s, the open source movement had gained considerable momentum. However, it was restricted to operating systems and a few other software packages used predominantly by academia or small businesses. The Open Source Initiative (OSI), founded in 1998, formalised the open source movement and defined open source standards.

During the last decade the open source movement gained a lot of momentum due to the above-mentioned influences. Open source options for application servers, databases, operating systems, content management systems, portals, collaboration tools, and project management tools became a viable choice in defining enterprise architectures. Small start-ups, academia and independent research groups leveraged this opportunity in building innovative, cost-effective solutions or pilot versions of products before finding angel investors to fund their business. Besides, the concepts of FOSS (free open source software) and FLOSS (free/libre/open source software) evolved and became very popular.

Paradigm shift in delivery platforms
Last but not least, the paradigm shift in application delivery platforms is one of the most significant influences that impacted software product engineering, bringing with it an array of opportunities.

Virtualisation technology, once a hi-tech research area, was productised by some of the niche players. Virtualisation laid the foundation for future computing systems and platform design. It enabled CIOs to reduce the total cost of ownership (TCO) by optimising the utilisation of IT infrastructure. ISVs provided solutions for platform virtualisation, application virtualisation, storage virtualisation and various other areas. Eventually, virtualisation coupled with service orientation seeded several initiatives related to cloud computing.

Meanwhile, the evolution of mobile phones and other handheld devices stimulated growth in SPE, and the past decade became very memorable with the introduction of smartphones as well as e-readers and tablets, which carry lots of potential for growth in the SPE segment as we move forward.

The global economic and environmental crises forced corporate leaders to focus on energy efficiency and the optimisation of IT infrastructure. Meanwhile, the SPE community had an eventful decade. It is quite promising that these influences will yield immense benefits not only to SPE but also to all computer users, and eventually to humanity as a whole, during the current decade – as long as the SPE community and regulators anticipate and manage risks such as security threats and cyber attacks.

Raja Bavani
Head of SPE delivery
MindTree Ltd
www.mindtree.com


Test automation has failed to date simply because we cannot afford to throw it away when it is no longer relevant. To address this issue, George Wilson says that business agility requires disposable test assets.

The art of throw-away test automation


Software test automation has been available for over a quarter of a century, but the practice still has many sceptics, and the biggest barrier to adoption remains the level of maintenance required to sustain it. Achieving just a moderate level of automation coverage requires considerable investment of budget and resource. With increasing software development complexity and more and more IT departments taking on an Agile approach, traditional test automation has become too cumbersome for most to sustain.

Why is test automation so cumbersome?
Traditional test automation systems originated in a world that moved at a much slower pace, where waterfall developments were the only game in town and no-one attempted to tackle fast moving, mission-critical applications – they knew that the technology simply couldn’t keep up.

These products all get their capabilities from powerful scripting languages; something that sounds good in a presentation, but has become a horror in the real world, requiring a cult of high priests (highly skilled and paid test automation engineers) to communicate with the complex and mysterious deity: the test automation tool.

Worse still, the script library took weeks and months to develop. This was something that was rationalised as OK, because you could write the scripts in parallel with code development. The truth was somewhat different as the script required knowledge of how the developers were naming the visual components – something that was neither consistent nor predictable.

Because of this, the code-based tools reverted to a ‘record’ mode to establish the initial script, which made them only usable once the application was complete. This was more practical, but now the automation coding effort couldn’t even commence until sections of the code were complete and stable.

It got worse. Most of each script that needed to be coded had nothing to do with testing the application. The engineers had to overcome many challenges before they could even get that far – handling unpredictable response times, retrieving displayed data needed for validation, and establishing checkpoints to signify when the application had completed a logical step.
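To make that point concrete, here is a hedged sketch of the kind of scaffolding such a script ends up containing before any real checking happens – an explicit wait for an unpredictable response time, retrieval of displayed data for validation, and a checkpoint marking the end of a logical step. It uses Selenium’s Python bindings with an invented URL and element IDs purely for illustration; the commercial tools of the period used their own proprietary scripting languages, but the overhead looked much the same.

# Illustrative only: most of this script is plumbing, not testing.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://example.test/orders")   # hypothetical application under test
wait = WebDriverWait(driver, 30)

# Plumbing 1: cope with an unpredictable response time.
wait.until(EC.presence_of_element_located((By.ID, "order-total")))

# Plumbing 2: retrieve the displayed data needed for validation.
displayed_total = driver.find_element(By.ID, "order-total").text

# Plumbing 3: checkpoint to signify the application has completed its logical step.
wait.until(EC.text_to_be_present_in_element((By.ID, "status"), "Complete"))

# Only now comes the check the tester actually cares about.
assert displayed_total == "£42.50"
driver.quit()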

But the death knell was what happened when the application to be tested changed. Suddenly all these laboriously created ‘assets’ were worth nothing and would not execute until the entire process had been repeated.

What happened next ranged from the sane to the almost comical. The sane organisations did what came naturally and gave up. Others were not to be defeated and threw even more expensive resources at the problem, some hiding the failure by outsourcing the entire test burden – often to companies who did most of the testing manually. All this for an initiative which was meant to reduce the need for resources, save time and improve quality!

To put this in perspective, industry analysts state that the high water mark in automation success is when 20 percent of an application has been automated. This is the high water mark, mind you, not the average; 20 percent is the peak of what you can expect after a financial investment measured in hundreds of thousands of dollars and an effort investment measured in many man years.


The need for speed
The way we work has also changed. In the last decade the rate of business change has risen beyond anything we could have expected. The availability of new technology, and the strategic advantage it can potentially provide businesses, has fuelled this, along with the need to adapt quickly to changing market requirements. The recession has arguably exacerbated the situation. As the fortunes of markets change and move with frightening suddenness, every business finds itself needing to achieve more with static or reduced budgets and resources.

Agile development is experiencing increasing popularity in IT departments, because today's fast-paced business environment requires an organisation’s development process to be flexible and adaptable to changing needs. The Agile model provides frequent delivery, increased customer involvement and helps to deal with the problem of rising complexity in systems. However, with all the benefits of a more fluid, flexible process, come challenges in how to assure the quality and governance of these ever-changing applications.

QA teams now have to accept that requirements can often change during and after each iteration, depending on the feedback from the customer. These changes in requirements are consequently reflected in the code and the tests that QA teams have to develop, which in turn can lead to a large amount of rework and script maintenance.

With delivery cycles getting shorter, and with security concerns and new regulations to manage, applications are becoming more like living things; beings that grow and mature, morphing from new-born status to an almost unrecognisable fully grown adult with all the associated trappings and documents that adults tend to collect throughout their lives.

How on earth is outdated and cumbersome test automation technology supposed to cope with this level of change and complexity? It simply can’t.

Stuck in the ’90sI read an interesting comment by James A Whittaker in his blog post ‘Still stuck in the ’90s’: “Don't get me wrong, software testing has been full of innovation. We've minted patents and PhD theses. We built tools and automated the crud out of certain types of interfaces. But those interfaces change and that automation, we find to our distress, is rarely reusable. How much real innovation have we had in this discipline that has actually stood the test of time? I argue that we've thrown most of it away. A disposable two decades. It was too tied to the application, the domain, the technology. Each project we start out basically anew, reinventing the testing wheel over and over. Each year's innovation looks much the same as the year before. 1990 quickly turns into 2010 and we remain stuck in the same old rut.”

HP, in a refreshing burst of honesty, now states that unless you will run a script a minimum of seven times, there will not be any payback from automation. That is one heck of a statement. Any part of an application that needs to be tested at least seven times suggests an almost static application, not one that is the subject of active development efforts. This sort of automation is fine for regression tests, but will not make any impact on current QA bottlenecks. The need is for a solution that is faster, lighter and better able to respond to dynamic application developments.
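The seven-runs figure is essentially a break-even calculation. As a rough, hypothetical illustration: if building an automated script costs the equivalent of several manual runs, and each automated run still costs something to execute and maintain, the script only pays back once it has been run enough times for the per-run saving to cover the build cost. The hours used below are invented, not HP’s.

# Hypothetical break-even sketch for automation payback (invented figures).
def breakeven_runs(build_cost_hours, manual_run_hours, automated_run_hours):
    """Smallest number of runs at which automation becomes cheaper than manual testing."""
    saving_per_run = manual_run_hours - automated_run_hours
    if saving_per_run <= 0:
        return None  # automation never pays back
    runs = 1
    while runs * saving_per_run < build_cost_hours:
        runs += 1
    return runs

# e.g. 12 hours to build the script, 2 hours per manual run, 0.25 hours per automated run:
print(breakeven_runs(12, 2, 0.25))  # -> 7 runs before the investment is recovered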

In short, the modern business with all its need for speed and agility just has no place left for these types of solutions, regardless of how much organisations have already invested in them, and regardless of how much resource is tied up in trying to maintain them. The need for change is now.

“The rate of obsolescence outpaces the pace of change” (from http://www.altimetergroup.net/), so says Ray Wang, industry analyst and the author of the popular enterprise software blog “A Software Insider’s Point of View.”

He’s right. Technology changes too quickly for any one company to stay on top of it. New software is released so regularly that it is already out of date by the time it is launched – consider the frequency of SAP or even Microsoft updates; keeping on top of these poses a real headache for most IT departments. We’ve got to a state where traditional testing processes and tools are too cumbersome and development is pulling away. Testing becomes the bottleneck. We don’t need test assets which have cost companies thousands of dollars and man-hours to develop. What the business needs now is test assets that are quick and easy to develop, that can be re-used or adapted easily, or can be discarded without a second thought.

Why throw such perfection away?
Now it would be foolish to dispose of the entire concept purely based on what came before. We need to recalibrate our expectations and remind ourselves of the excellent potential benefits of test automation, if only the capabilities were delivered in a form usable by all.

The ability of any automation technology to adapt to changes in the underlying application will always have some limitations. Is it reasonable to expect a test script created for a legacy mainframe application to still be valid on the replacement .Net WPF architecture? Can you expect the test scripts created for the English version of your web site to be applicable to the newly developed Japanese version?

By freeing automation from the burden of a script based on code, we can begin to imagine a solution that could be used by subject matter experts and not limited to frustrated developers, a solution that could adapt to changes in the application under test, an intelligent solution that inherently understood the application under test, removing the need to develop logic in addition to the validation itself.

The exciting thing is that modern automation goes a surprisingly long way towards addressing these needs. But do not let that optimistic outlook hide the core issue – at some point the application, environment or business will change in such a fundamental way that the existing test assets have little or no value.

If that loss represents an investment in intellectual property, resource and time at a level so large that there is no appetite to redevelop those assets for the new version of the application, then automation will have failed. Thus we arrive at the acid test – if it is deemed easier to return to a manual test approach, then automation has failed and deserves to be thrown away.

Throw-away test automation
Test automation has failed to date simply because we could not afford to throw it away. Creating any form of automation takes effort and time; when the application under test changes and the automation ceases to work, you are faced with a stark choice – either maintain it at additional effort and time, or abandon it. If you abandon it, you are also writing off the effort and time you invested in creating it, thus bringing the whole concept into question.

The reality is that you have to be in a position to throw away the automation you have created, as sooner or later the application will change in such a way that no amount of automatic healing can tackle it. So by definition, the creation of the automation must have been so fast and painless and the investment minimal, that you can afford, both financially and emotionally, to throw it away.

Think about your own test assets. Can you even estimate the cost in time and effort of building them? How constricted are you by that investment? Can you, hand on heart, claim that you don’t grimace every time you have to throw them away?


George Wilson, Chief Operating Officer, Original Software – www.origsoft.com


If you are using ‘live’ production data for testing, you could be entering a world of pain. Richard Fine reports.

Are you using live customer data outside of your production database?

Say that you want to develop a new payment gateway for your online store. A team of developers is hired, the improvements you want are designed, and the new system is created. Finally, you need to test it all, to ensure that the improvements will work the way you intend them to, and also to ensure that all the old payment information still works correctly. What data do you use to test the system?

According to a recent report by the Ponemon Institute, 80 percent of companies use a copy of their ‘live’ production data. By ‘production data’ I mean they take real customer records and real credit card details, and give them to the developers. The developers run all the tests they want, send tests offshore and, once the system is working to an acceptable standard, deploy it and sign off. The test data is usually erased.

It seems obvious, but given that 80 percent of companies are doing it, maybe it isn’t. Given the amount of money most organisations spend to secure their live environments from external threats, it is puzzling that the same companies will take direct copies of these systems and allow them to be used in non-secure environments. Imagine your and my personal banking information being shared amongst numerous people in test and development teams. Scary.

Accidents will happen
You’d hope that all of your developers are competent, friendly, moral people, but you can’t guarantee that. A disgruntled developer could use the data to severely damage your company’s reputation and public image. Maybe that’s not very likely – you look after your developers, of course, and ensure that they don’t end up so antagonistic – but what is likely is that somebody makes a mistake. Maybe somebody accidentally places a real charge on the credit cards, rather than just a virtual charge. Maybe somebody takes a copy of the data onto their laptop to work with it while commuting, and their laptop is stolen. No matter how good your developers are, occasionally, everybody has a bad day; and while you might be willing to take the risk, the law is not willing to let you.

There are a number of regulations governing the way that personal information is handled. Payment card information, for example, is governed by the PCI DSS standard. It dictates the security measures and policies that must be in place, such as encrypting all card data on public networks, and having firewalls and up-to-date antivirus software installed. Other standards include the USA’s GLBA for financial data, HIPAA for healthcare information, the UK’s Data Protection Act, the European Data Protection Directive, and many others. A common theme across all data protection acts is the principle that data should be kept on a ‘need-to-know’ basis.

Still, maybe you think you can justify that the developers ‘need-to-know’ the data, because the systems need to be tested. Even if you successfully argue that, you’ve then got another problem: now you need to take measures to protect the data on the developer machines, just as you have to for your production database servers. Just because it’s ‘non-production’ doesn’t mean it’s exempt from the regulations.

What does that mean in practice? If you’re complying with HIPAA regulations, you have to keep the development offices physically secure, with full sign-in and sign-out logs for developers (HIPAA §164.310), and provide and maintain a full training program to ensure developers are using the data appropriately (HIPAA §164.308 (5)(i)). The PCI DSS will require that your developers be fully audited (PCI DSS v2 10.2), and that the software they’re developing is perpetually secure (PCI DSS v2 6.3), even while it’s still in development.

The UK Data Protection Act actually states that data may only be used for the specific purposes for which it was collected (DPA98 Sch1 I.2), so unless at the time of collection you tell the user that you’ll use their data for testing purposes, then using it at all is a DPA violation.

In short: unless you’re taking lengthy and expensive measures to ensure that your development and testing environment is just as secure as your production environment, it’s not legal to use production data in development and testing.

Solutions
What can be done about this? You’ve got to test with some kind of data. The most popular approach is data masking. Data masking takes a copy of your live production data and then de-identifies the sensitive content. The masked data no longer contains sensitive information, so it is not covered by any regulations and can be freely shared with developers.

Data masking is quick and fairly easy to understand, which is why it’s a popular method. However, it’s not without a fair number of problems, foremost of which is that successfully masking data to the point that sensitive information can’t be inferred or deduced is often extremely hard. Likewise, once highly sensitive data is masked to the level where it can never be traced back to individuals or re-engineered, it is pretty much useless. For this reason, there is an alternative called test data creation: the automated creation of completely synthetic data. It can mimic your production data, but is not directly derived from it, making it free from regulation.

In conclusion: using live data in non-production is either illegal or expensive. For the companies using it illegally, it’s only a matter of time before somebody slips up and the practice is discovered. The companies paying extra to keep their developers compliant will find themselves slowed down in new development and undercut by companies that have used their data in a more strategic way. In the long run, the tiny benefit is just not worth the risk.
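As a rough illustration of the two approaches described above – masking a copy of live data versus creating synthetic records – here is a minimal sketch. The field names, masking rules and record format are assumptions made for the example, not a description of any particular masking product.

```python
# Illustrative only: de-identifying a copied record versus generating a synthetic one.
# Field names and rules are assumptions for the example.
import hashlib
import random
import string

def mask_record(record):
    """Return a copy of a customer record with the sensitive fields de-identified."""
    masked = dict(record)
    # Replace the name with a one-way hash so it stays consistent across tables
    # but can no longer be read back.
    masked["name"] = hashlib.sha256(record["name"].encode()).hexdigest()[:12]
    # Keep only the last four digits of the card number, as a receipt would.
    masked["card_number"] = "*" * 12 + record["card_number"][-4:]
    return masked

def synthetic_record():
    """Generate an entirely artificial record that mimics the production format."""
    surname = "".join(random.choices(string.ascii_uppercase, k=6)).title()
    card = "".join(random.choices(string.digits, k=16))
    return {"name": f"Test {surname}", "card_number": card}

live = {"name": "Jane Smith", "card_number": "4929123456781234"}
print(mask_record(live))     # derived from live data, so the masking must be irreversible
print(synthetic_record())    # never touched live data, so it sits outside the regulations
```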

Richard Fine, Technical Writer, Grid-Tools – www.grid-tools.com


Lewis Gee and Colin Gray discuss how companies can combine application asset management, testing and automation to manage and accelerate the successful deployment of new software from development through to production.

Making the change

Big application deployments, virtualisation and Windows 7 roll-outs are likely to be on most IT agendas this year. Whenever you are making a change to your IT – from a new OS implementation through to new software updates – testing has a prominent role to play in ensuring a successful end result. However, to make the testing process more effective, companies have to address the gap that exists between development and IT operations.

Q: Testing and asset management are not normally thought of together. What value can application asset management bring to the testing table?

Lewis Gee: To some extent, testing and asset management should go hand in hand right at the beginning of any project that involves a change of platform. Understanding the application landscape at the start of any change programme is essential in order to get a clear picture of the levels of testing that are likely to be required.

There is a tendency for companies to apply a ‘lift and drop’ approach around these change projects. Windows 7 is a good example: companies have just been thinking about the OS and whether it will work with what they already have on the hardware side. Actually they should be looking at what applications are installed, how much those apps are being used, and what should be tested to check that they will run on the new OS environment.

This information makes planning what needs to be tested easier. For a legacy application that is not being accessed on a regular basis, does it make sense to move it over to Windows 7 in the first place? If you have to move it over, and it does not work on the new OS, then what is the most cost-effective way to deal with it? Getting this information early through using asset management and testing together can help IT make these decisions at the right stage of the project, and therefore avoid going too far down the wrong path.

Colin Gray: Application asset management is all about being able to make informed decisions through having an accurate view of an entire application portfolio. Testing becomes relevant to this bigger picture when you are looking to move assets (in this case an estate of applications) into a new environment or platform. With any big IT migration project it is likely there will be many applications in use by many users – if we’re considering a FTSE 100-size organisation, that’s many thousands of applications and many thousands of desktops at stake if the migration is deployed.

For every change you plan to make to your IT infrastructure, you want to be able to accurately predict what the outcome is going to be – and therefore identify where remedial action needs to be taken.

Not only that, you ideally want to also be able to look back at a later date and retrospectively track and audit exactly what happened during that change process. Such a granular level of reporting of problems and fixes means that those organisations undergoing a migration project can remain compliant with the necessary best practice standards and legislation, whilst ensuring relevant licensing regulation is adhered to during the transition.

Bridging the gap
Q: Why is the gap between development and operations a challenge?

Colin Gray: Bridging the gap between application development and operations teams is often such a tough challenge because traditionally that gap has been so big – both in terms of approaches to the technology and the mentality and priorities of the teams behind them.

Ideally, developers need to be able to get guidance and direction from the business on what they want the final application to do, where it’s going to be used and who by. Taking that one step further, that means getting internal stakeholders involved through the development process, and not just at the very beginning and very end of development, as is currently all too common. The developers need to be able to build, test and implement something that is predictable – essentially being able to complete the development process without breaking what is already in place.

An organisation-wide collaborative approach to application development can save organisations serious time and money in the long run, but it’s not always an easy path to take. It can be difficult to get end users to contribute or feedback on a software build, so with that in mind, the greater the level of automation that can be applied to the testing portion of the process, the better.

Lewis Gee: The operations side of IT often does not have the insight into the development and test cycle, how applications are developed, and why specific approaches are taken. Conversely, developers don’t often know or care what platform within the company’s data centre is going to be used to host the app in production. Having a combined approach to testing and asset management enables the software team to think about how the app will be deployed.

It comes down to getting a total picture. Achieving the best results will come through looking at the back-end infrastructure that exists and how it can be best utilised by the applications, as well as making sure that those applications are performing in the best way possible. The whole can provide a better solution than the sum of the parts.

This combination of development and operations, or dev-ops, is a focus point that I see increasing over the next couple of years within businesses. Testing has a big part to play in this going forward, as the role that testers play can cover both the infrastructure side and the development side.

Adding value
Q: How can the testing process add more value for businesses?

Colin Gray: The key to adding value is making the time it takes an organisation to complete the migration of a portfolio of applications from one environment or platform to another as short as possible. A speedy and efficient technology deployment can significantly improve a business’s competitive advantage. We’re going to be seeing a lot of migrations from XP SP3 to Windows 7 over the course of 2011 and, for those organisations, the ability to accurately predict what an application will do on the target platform can save months of manual testing work.

With automated compatibility testing and remediation, a portfolio of thousands of applications can be processed in a matter of days rather than months, accelerating the migration programme and enabling the team to efficiently reallocate resources to focus on specific pain-points.

Lewis Gee: Testing’s role is to demonstrate that the effort and assets put into a project are actually working, and working well. Testing should therefore have an overview of both the application and infrastructure sides of IT – it can help feed back on where performance is good, and more importantly where there are more efficiencies that can be found. This ability to see across multiple parts of the IT stack can have enormous value for an organisation.

Colin Gray: It’s also important to bear in mind that environments are constantly changing – making accuracy in testing particularly challenging. A manual approach to testing can provide only a moderately accurate snapshot of that specific point in time. The same test the following week may throw up an entirely different set of results. In contrast, with an automated testing process the same test can be run as many times as needed, within a matter of minutes. Essentially, the more of the testing and remediation process you are able to automate, the more likely the results will be both consistent and accurate.

Lewis Gee, VP of worldwide sales & marketing, Centrix Software – www.centrixsoftware.com

Colin Gray, Director of global sales, ChangeBASE – www.changebase.com

Facilita has created the Forecast™ product suite which is used across multiple business sectors to performance test applications, websites and IT infrastructures of all sizes and complexity. With this class-leading testing software and unbeatable support and services Facilita will help you ensure that your IT systems are reliable, scalable and tuned for optimal performance.

Forecast, the thinking tester's power tool
A sound investment: A good load testing tool is one of the most important IT investments that an organisation can make. The risks and costs associated with inadequate testing are enormous. Load testing is challenging and, without good tools and support, will consume expensive resources and waste a great deal of effort.

Forecast has been created to meet the challenges of load testing, now and in the future. The core of the product is tried and trusted and incorporates more than a decade of experience but is designed to evolve in step with advancing technology.

Realistic load testing: Forecast tests the reliability, performance and scalability of IT systems by realistically simulating from one to many thousands of users executing a mix of business processes using individually configurable data.

Comprehensive technology support: Forecast provides one of the widest ranges of protocol support of any load testing tool.

1. Forecast Web thoroughly tests web-based applications and web services, identifies system bottlenecks, improves application quality and optimises network and server infrastructures. Forecast Web supports a comprehensive and growing list of protocols, standards and data formats including HTTP/HTTPS, SOAP, XML, JSON and Ajax.

2. Forecast Java is a powerful and technically advanced solution for load testing Java applications. It targets any non-GUI client-side Java API with support for all Java remoting technologies including RMI, IIOP, CORBA and Web Services.

3. Forecast Citrix simulates multiple Citrix clients and validates the Citrix environment for scalability and reliability in addition to the performance of the hosted applications. This non-intrusive approach provides very accurate client performance measurements unlike server based solutions.

4. Forecast .NET simulates multiple concurrent users of applications with client-side .NET technology.

5. Forecast WinDriver is a unique solution for performance testing Windows applications that are impossible or uneconomic to test using other methods or where user experience timings are required. WinDriver automates the client user interface and can control from one to many hundreds of concurrent client instances or desktops.

6. Forecast can also target less mainstream technology such as proprietary messaging protocols and systems using the OSI protocol stack.

Powerful yet easy to use: Skilled testers love using Forecast because of the power and flexibility that it provides. Creating working tests is made easy with Forecast's script recording and generation features and the ability to compose complex test scenarios rapidly with a few mouse clicks. The powerful functionality of Forecast ensures that even the most challenging applications can be fully tested.

Supports waterfall and agile (and everything in between): Forecast has the features demanded by QA teams like automatic test script creation, test data management, real-time monitoring and comprehensive charting and reporting.

Forecast is successfully deployed in Agile ‘Test Driven Development’ (TDD) environments and integrates with automated test (continuous build) infrastructures. The functionality of Forecast is fully programmable and test scripts are written in standard languages (Java, C#, C++ etc). Forecast provides the flexibility of open source alternatives along with comprehensive technical support and the features of a high-end enterprise commercial tool.

Flexible licensing: Geographical freedom allows licenses to be moved within an organisation without additional costs. Temporary high concurrency licenses for ‘spike’ testing are available with a sensible pricing model. Licenses can be rented for short term projects with a ‘stop the clock’ agreement or purchased for perpetual use.

Our philosophy is to provide value and to avoid hidden costs. For example, server monitoring and the analysis of server metrics are not separately chargeable items and a license for Web testing includes all supported Web protocols.

Services
In addition to comprehensive support and training, Facilita offers mentoring where an experienced Facilita consultant will work closely with the test team either to ‘jump start’ a project or to cultivate advanced testing techniques. Even with Forecast’s outstanding script automation features, scripting is challenging for some applications. Facilita offers a direct scripting service to help clients overcome this problem.

We can advise on all aspects of performance testing and carry out testing either by providing expert consultants or fully managed testing services.

Facilita – Tel: +44 (0) 1260 298109 – Email: [email protected] – web: www.facilita.com


www.seapine.com – phone: +44 (0) 208-899-6775 – Email: [email protected]
United Kingdom, Ireland, and Benelux: Seapine Software Ltd, Building 3, Chiswick Park, 566 Chiswick High Road, Chiswick, London, W4 5YA, UK

Americas (corporate headquarters): Seapine Software, Inc, 5412 Courseview Drive, Suite 200, Mason, Ohio 45040, USA. Phone: 513-754-1655

With over 8,500 customers worldwide, Seapine Software Inc is a recognised, award-winning, leading provider of quality-centric application lifecycle management (ALM) solutions. With headquarters in Cincinnati, Ohio and offices in London, Melbourne, and Munich, Seapine is uniquely positioned to directly provide sales, support, and services around the world.

Built on flexible architectures using open standards, Seapine Software’s cross-platform ALM tools support industry best practices, integrate into all popular development environments, and run on Microsoft Windows, Linux, Sun Solaris, and Apple Macintosh platforms.

Seapine Software's integrated software development and testing tools streamline your development and QA processes – improving quality, and saving you significant time and money.

TestTrack RM
TestTrack RM centralises requirements management, enabling all stakeholders to stay informed of new requirements, participate in the review process, and understand the impact of changes on their deliverables. Easy to install, use, and maintain, TestTrack RM features comprehensive workflow and process automation, easy customisability, advanced filters and reports, and role-based security. Whether as a standalone tool or part of Seapine’s integrated ALM solution, TestTrack RM helps teams keep development projects on track by facilitating collaboration, automating traceability, and satisfying compliance needs.

TestTrack Pro
TestTrack Pro is a powerful, configurable, and easy to use issue management solution that tracks and manages defects, feature requests, change requests, and other work items. Its timesaving communication and reporting features keep team members informed and on schedule. TestTrack Pro supports MS SQL Server, Oracle, and other ODBC databases, and its open interface is easy to integrate into your development and customer support processes.

TestTrack TCM
TestTrack TCM, a highly scalable, cross-platform test case management solution, manages all areas of the software testing process including test case creation, scheduling, execution, measurement, and reporting. Easy to install, use, and maintain, TestTrack TCM features comprehensive workflow and process automation, easy customisability, advanced filters and reports, and role-based security. Reporting and graphing tools, along with user-definable data filters, allow you to easily measure the progress and quality of your testing effort.

QA Wizard Pro
QA Wizard Pro completely automates the functional and regression testing of Web, Windows, and Java applications, helping quality assurance teams increase test coverage. Featuring a next-generation scripting language, QA Wizard Pro includes advanced object searching, smart matching, a global application repository, data-driven testing support, validation checkpoints, and built-in debugging. QA Wizard Pro can be used to test popular languages and technologies like C#, VB.NET, C++, Win32, Qt, AJAX, ActiveX, JavaScript, HTML, Delphi, Java, and Infragistics Windows Forms controls.

Surround SCM
Surround SCM, Seapine’s cross-platform software configuration management solution, controls access to source files and other development assets, and tracks changes over time. All data is stored in industry-standard relational database management systems for greater security, scalability, data management, and reporting. Surround SCM’s change automation, caching proxy server, labels, and virtual branching tools streamline parallel development and provide complete control over the software change process.

Seapine Software™


The Green Hat difference
In one software suite, Green Hat automates the validation, visualisation and virtualisation of unit, functional, regression, system, simulation, performance and integration testing, as well as performance monitoring. Green Hat offers code-free and adaptable testing from the User Interface (UI) through to back-end services and databases. Reducing testing time from weeks to minutes, Green Hat customers enjoy rapid payback on their investment.

Green Hat’s testing suite supports quality assurance across the whole lifecycle, and different development methodologies including Agile and test-driven approaches. Industry vertical solutions using protocols like SWIFT, FIX, IATA or HL7 are all simply handled. Unique pre-built quality policies enable governance, and the re-use of test assets promotes high efficiency. Customers experience value quickly through the high usability of Green Hat’s software.

Focusing on minimising manual and repetitive activities, Green Hat works with other application lifecycle management (ALM) technologies to provide customers with value-add solutions that slot into their Agile testing, continuous testing, upgrade assurance, governance and policy compliance. Enterprises invested in HP and IBM Rational products can simply extend their test and change management processes to the complex test environments managed by Green Hat and get full integration.

Green Hat provides the broadest set of testing capabilities for enterprises with a strategic investment in legacy integration, SOA, BPM, cloud and other component-based environments, reducing the risk and cost associated with defects in processes and applications. The Green Hat difference includes:

• Purpose built end-to-end integration testing of complex events, business processes and composite applications. Organisations benefit by having UI testing combined with SOA, BPM and cloud testing in one integrated suite.

• Unrivalled insight into the side-effect impacts of changes made to composite applications and processes, enabling a comprehensive approach to testing that eliminates defects early in the lifecycle.

• Virtualisation for missing or incomplete components to enable system testing at all stages of development. Organisations benefit through being unhindered by unavailable systems or costly access to third party systems, licences or hardware. Green Hat pioneered ‘stubbing’, and organisations benefit by having virtualisation as an integrated function, rather than a separate product (a minimal, generic sketch of the stubbing idea follows this list).

• Scaling out these environments, test automations and virtualisations into the cloud, with seamless integration between Green Hat’s products and leading cloud providers, freeing you from the constraints of real hardware without the administrative overhead.

• ‘Out-of-the-box’ deep integration with all major SOA, enterprise service bus (ESB) platforms, BPM runtime environments, governance products, and application lifecycle management (ALM) products.

• ‘Out-of-the-box’ support for over 70 technologies and platforms, as well as transport protocols for industry vertical solutions. Also provided is an application programming interface (API) for testing custom protocols, and integration with UDDI registries/repositories.

• Helping organisations at an early stage of project or integration deployment to build an appropriate testing methodology as part of a wider SOA project methodology.
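For readers unfamiliar with the idea, the sketch below shows the general principle behind a service stub: a stand-in that answers with canned responses so dependent tests can keep running while the real component is missing, incomplete or too costly to access. It is a generic illustration in Python, not an example of Green Hat's product; the endpoint and payload are invented for the example.

```python
# Generic illustration of a service stub (not Green Hat's product).
# It answers a REST-style call with a canned payload so tests of a dependent
# application can run while the real back-end service is unavailable.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSES = {
    # Invented endpoint and payload, purely for illustration.
    "/payments/status": {"status": "AUTHORISED", "reference": "TEST-0001"},
}

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        payload = CANNED_RESPONSES.get(self.path, {"error": "unknown endpoint"})
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Point the application under test at localhost:8080 instead of the real service.
    HTTPServer(("localhost", 8080), StubHandler).serve_forever()
```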

Corporate overview
Since 1996, Green Hat has constantly delivered innovation in test automation. With offices that span North America, Europe and Asia/Pacific, Green Hat’s mission is to simplify the complexity associated with testing, and make processes more efficient. Green Hat delivers the market leading combined, integrated suite for automated, end-to-end testing of the legacy integration, Service Oriented Architecture (SOA), Business Process Management (BPM) and emerging cloud technologies that run Agile enterprises.

Green Hat partners with global technology companies including HP, IBM, Oracle, SAP, Software AG, and TIBCO to deliver unrivalled breadth and depth of platform support for highly integrated test automation. Green Hat also works closely with the horizontal and vertical practices of global system integrators including Accenture, Atos Origin, CapGemini, Cognizant, CSC, Fujitsu, Infosys, Logica, Sapient, Tata Consulting and Wipro, as well as a significant number of regional and country-specific specialists. Strong partner relationships help deliver on customer initiatives, including testing centres of excellence. Supporting the whole development lifecycle and enabling early and continuous testing, Green Hat’s unique test automation software increases organisational agility, improves process efficiency, assures quality, lowers costs and mitigates risk.

Helping enterprises globally
Green Hat is proud to have hundreds of global enterprises as customers, and this number does not include the consulting organisations who are party to many of these installations with their own staff or outsourcing arrangements. Green Hat customers enjoy global support and cite outstanding responsiveness to their current and future requirements. Green Hat’s customers span industry sectors including financial services, telecommunications, retail, transportation, healthcare, government, and energy.

Green Hat

[email protected] www.greenhat.com


For more information, please visit http://www.microfocus.com/cqa-uk/

Continuous Quality Assurance
Micro Focus Continuous Quality Assurance (CQA) ensures that quality assurance is embedded throughout the entire development lifecycle – from requirements definition to ‘go live’.

CQA puts the focus on identifying and eliminating defects at the beginning of the process, rather than removing them at the end of development. It provides capabilities across three key areas:

Requirements: Micro Focus uniquely combines requirements definition, visualisation, and management into a single ‘3-Dimensional’ solution. This gives managers, analysts and developers the right level of detail about how software should be engineered. Removing ambiguity means the direction of the development and QA teams is clear, dramatically reducing the risk of poor business outcomes.

Change: Development teams regain control in their constantly shifting world with a single ‘source of truth’ to prioritize and collaborate on defects, tasks, requirements, test plans, and other in-flux artefacts. Even when software is built by global teams with complex environments and methods, Micro Focus controls change and increases the quality of outputs.

Quality: Micro Focus automates the entire quality process from inception through to software delivery. Unlike solutions that emphasize ‘back end’ testing, Micro Focus ensures that tests are planned early and synchronised with business goals, even as requirements and realities change.

Bringing the business and end-users into the process early makes business requirements the priority from the outset as software under development and test is continually aligned with the needs of business users.

CQA provides an open framework which integrates diverse toolsets, teams and environments, giving managers continuous control and visibility over the development process to ensure that quality output is delivered on time.

By ensuring correct deliverables, automating test processes, and encouraging reuse and integration, Continuous Quality Assurance continually and efficiently validates enterprise critical software.

The cornerstones of Micro Focus Continuous Quality Assurance are:

• Requirements Definition and Management Solutions;

• Software Change and Configuration Management Solutions;

• Automated Software Quality and Load Testing Solutions.

Requirements
Caliber® is an enterprise software requirements definition and management suite that facilitates collaboration, impact analysis and communication, enabling software teams to deliver key project milestones with greater speed and accuracy.

• Streamlined requirements collaboration;
• End to end traceability of requirements;
• Fast and easy simulation to verify requirements;
• Secure, centralized requirements repository.

Change
StarTeam® is a fully integrated, cost-effective software change and configuration management tool. Designed for both centralized and geographically distributed software development environments, it delivers:

• A single source of key information for distributed teams;

• Streamlined collaboration through a unified view of code and change requests;

• Industry leading scalability combined with low total cost of ownership.

Quality
Silk is a comprehensive automated software quality management solution suite which:

• Ensures that developed applications are reliable and meet the needs of business users;

• Automates the testing process, providing higher quality applications at a lower cost;

• Prevents or discovers quality issues early in the development cycle, reducing rework and speeding delivery.

SilkTest enables users to rapidly create test automation, ensuring continuous validation of quality throughout the development lifecycle. Users can move away from manual-testing dominated software lifecycles, to ones where automated tests continually test software for quality and improve time to market.

Take testing to the cloud
Users can test and diagnose Internet-facing applications under immense global peak loads on the cloud without having to manage complex infrastructures.

Among other benefits, SilkPerformer® CloudBurst gives development and quality teams:

• Simulation of peak demand loads through onsite and cloud-based resources for scalable, powerful and cost effective peak load testing;

• Web 2.0 client emulation to test even today’s rich internet applications effectively.

Micro Focus Continuous Quality Assurance transforms ‘quality’ into a predictable managed path; moving from reactively accepting extra cost at the end of the process, to confronting waste head on and focusing on innovation.

Micro Focus, a member of the FTSE 250, provides innovative software that enables companies to dramatically improve the business value of their enterprise applications. Micro Focus Enterprise Application Modernization and Management software enables customers’ business applications to respond rapidly to market changes and embrace modern architectures with reduced cost and risk.

Micro Focus


Original Software

With a world class record of innovation, Original Software offers a solution focused completely on the goal of effective quality management. By embracing the full spectrum of Application Quality Management across a wide range of applications and environments, the company partners with customers and helps make quality a business imperative. Solutions include a quality management platform, manual testing, full test automation and test data management, all delivered with the control of business risk, cost, time and resources in mind.

Setting new standards for application quality
Today’s applications are becoming increasingly complex and are critical in providing competitive advantage to the business. Failures in these key applications result in loss of revenue, goodwill and user confidence, and create an unwelcome additional workload in an already stretched environment. Managers responsible for quality have to be able to implement processes and technology that will support these important business objectives in a pragmatic and achievable way, without negatively impacting current projects.

These core needs are what inspired Original Software to innovate and provide practical solutions for Application Quality Management (AQM) and Automated Software Quality (ASQ). The company has helped customers achieve real successes by implementing an effective ‘application quality eco-system’ that delivers greater business agility, faster time to market, reduced risk, decreased costs, increased productivity and an early return on investment.

These successes have been built on a solution that provides a dynamic approach to quality management and automation, empowering all stakeholders in the quality process, as well as uniquely addressing all layers of the application stack. Automation has been achieved without creating a dependency on specialised skills and by minimising ongoing maintenance burdens.

An innovative approach
Innovation is in the DNA at Original Software. Its intuitive solution suite directly tackles application quality issues and helps organisations achieve the ultimate goal of application excellence.

Empowering all stakeholders
The design of the solution helps customers build an ‘application quality eco-system’ that extends beyond just the QA team, reaching all the relevant stakeholders within the business. The technology enables everyone involved in the delivery of IT projects to participate in the quality process – from the business analyst to the business user and from the developer to the tester. Management executives are fully empowered by having instant visibility of projects underway.

Quality that is truly code-free
Original Software has observed the script maintenance and exclusivity problems caused by code-driven automation solutions and has built a solution suite that requires no programming skills. This empowers all users to define and execute their tests without the need to use any kind of code, freeing them from the automation specialist bottleneck. Not only is the technology easy to use, but quality processes are accelerated, allowing for faster delivery of business-critical projects.

Top to bottom quality
Quality needs to be addressed at all layers of the business application. Original Software gives organisations the ability to check every element of an application – from the visual layer, through to the underlying service processes and messages, as well as into the database.

Addressing test data issues
Data drives the quality process and as such cannot be ignored. Original Software enables the building and management of a compact test environment from production data quickly and in a data privacy compliant manner, avoiding legal and security risks. It also manages the state of that data so that it is synchronised with test scripts, enabling swift recovery and shortening test cycles.

A holistic approach to quality
Original Software’s integrated solution suite is uniquely positioned to address all the quality needs of an application, regardless of the development methodology used. Being methodology neutral, the company can help in Agile, Waterfall or any other project type. The company provides the ability to unite all aspects of the software quality lifecycle. It helps manage the requirements, design, build, test planning and control, test execution, test environment and deployment of business applications from one central point that gives everyone involved a unified view of project status and avoids the release of an application that is not ready for use.

Helping businesses around the world
Original Software’s innovative approach to solving real pain-points in the Application Quality Life Cycle has been recognised by leading multinational customers and industry analysts alike. In a 2010 report, Ovum stated: “While other companies have diversified into other test types and sometimes outside testing completely, Original has stuck more firmly to a value proposition almost solely around unsolved challenges in functional test automation. It has filled out some yawning gaps and attempted to make test automation more accessible to non-technical testers.”

More than 400 organisations operating in over 30 countries use Original Software solutions. The company is proud of its partnerships with the likes of Coca-Cola, Unilever, HSBC, FedEx, Pfizer, DHL, HMV and many others.

www.origsoft.com – [email protected] – Tel: +44 (0)1256 338 666 – Fax: +44 (0)1256 338 678
Grove House, Chineham Court, Basingstoke, Hampshire, RG24 8AG

Delivering quality through innovation



For over 20 years Parasoft has been studying how to efficiently create quality computer code. Our solutions leverage this research to deliver automated quality assurance as a continuous process throughout the SDLC. This promotes strong code foundations, solid functional components, and robust business processes. Whether you are delivering Service-Orientated Architectures (SOA), evolving legacy systems, or improving quality processes – draw on our expertise and award winning products to increase productivity and the quality of your business applications.

Parasoft's full-lifecycle quality platform ensures secure, reliable, compliant business processes. It was built from the ground up to prevent errors involving the integrated components – as well as reduce the complexity of testing in today's distributed, heterogeneous environments.

What we do
Parasoft's SOA solution allows you to discover and augment expectations around design/development policy and test case creation. These defined policies are automatically enforced, allowing your development team to prevent errors instead of finding and fixing them later in the cycle. This significantly increases team productivity and consistency.

End-to-end testing: Continuously validate all critical aspects of complex transactions which may extend through web interfaces, backend services, ESBs, databases, and everything in between.

Advanced web app testing: Guide the team in developing robust, noiseless regression tests for rich and highly-dynamic browser-based applications.

Application behavior virtualisation: Automatically emulate the behavior of services, then deploy them across multiple environments – streamlining collaborative development and testing activities. Services can be emulated from functional tests or actual runtime environment data.

Load/performance testing: Verify application performance and functionality under heavy load. Existing end-to-end functional tests are leveraged for load testing, removing the barrier to comprehensive and continuous performance monitoring.

Specialised platform support: Access and execute tests against a variety of platforms (AmberPoint, HP, IBM, Microsoft, Oracle/BEA, Progress Sonic, Software AG/webMethods, TIBCO).

Security testing: Prevent security vulnerabilities through penetration testing and execution of complex authentication, encryption, and access control test scenarios.

Trace code execution: Provide seamless integration between SOA layers by identifying, isolating, and replaying actions in a multi-layered system.

Continuous regression testing: Validate that business processes continuously meet expectations across multiple layers of heterogeneous systems. This reduces the risk of change and enables rapid and agile responses to business demands.

Multi-layer verification: Ensure that all aspects of the application meet uniform expectations around security, reliability, performance, and maintainability.

Policy enforcement: Provide governance and policy-validation for composite applications in BPM, SOA, and cloud environments to ensure interoperability and consistency across all SOA layers.

Please contact us to arrange either a one-to-one briefing session or a free evaluation.

web: www.parasoft.com Email: [email protected] Tel: +44 (0) 208 263 6005

Parasoft
Improving productivity by delivering quality as a continuous process


The TEST Focus Groups is a complimentary event specially designed and targeted at senior software testers, testing managers, QA & project managers, who wish to discuss and debate some of their most pressing challenges in a well thought out yet informal setting.

TEST Magazine, the TEST Focus Groups’ sister product, spends a lot of time speaking and listening to its customers and then seeking out innovative ways to meet their needs. It has become apparent that senior decision makers wish to discuss their current challenges in a meaningful and structured manner with a view to finding pragmatic and workable solutions to what are invariably complex issues. Suppliers, who are naturally keen to meet these professionals, want to gain a clearer understanding of these challenges and identify how, through meaningful dialogue, they can assist.

This logic, coupled with TEST Magazine’s consistent desire to drive the market forward, led us to launch the TEST Focus Groups for 2011!

Due to the demands put on modern managers, and the consequent limited opportunities to join together and voice opinions, the challenges consistently faced by today’s army of testers and testing management tend not to get resolved as quickly as enterprise would like. As a market-leading publisher and events business, the organiser understands there should be a format that empowers meaningful debate and assists managers and directors in overcoming their issues. The TEST Focus Groups therefore provide ten specially designed syndicate rooms, each containing a specialist subject for delegates to discuss and debate with a view to finding pragmatic and workable solutions.

With some of the industry’s leading minds on hand to help facilitate and steer each session, the TEST Focus Groups will quickly become a ‘must-attend’ event for anyone serious about software testing and QA. Add to this plenty of networking opportunities and a small exhibition, and each delegate has a fabulous opportunity to interact with their peers, source the latest products and services, and develop meaningful relationships in an informal yet professional setting.

Subjects to be debated are:

• People or Technology – who gets the cash?
• The Value of Testing Requirements
• Does the User Matter?
• Agile Testing
• Crowd Testing
• Outsourcing
• Qualifications, Accreditation & Exams
• Event Sponsors’ Subject
• Identifying Tester-Related Risks
• Tester Training

If you are interested in being a delegate at the TEST Focus Groups please visit: www.testfocusgroups.com/delegates.html

Or to register visit: www.testfocusgroups.com/register.html

If you are interested in sponsoring this event and hosting a session please visit: www.testfocusgroups.com/sponsor.html

Or to discuss any aspect of the event please contact Grant Farrell on +44 (0) 203 056 4598 or email: [email protected]

www.testfocusgroups.com +44 (0) 870 863 6930 [email protected]

TEST Focus Groups


Our testing expertise
We provide testing experts across the following disciplines:

Functional Testing: including System Testing, Integration Testing, Regression Testing and User Acceptance Testing;

Automated Software Testing: including Test Tool selection, evaluation & implementation, creation of automated test frameworks;

Performance Testing: including Stress Testing, Load Testing, Soak Testing and Scalability Testing;

Operational Acceptance Testing: including disaster recovery and failover;

Web Testing: including cross browser compatibility and usability;

Migration Testing: including data conversion and application migration;

Agile Testing;

Test Environments Management.

The testing talent we provide
• Test analysts;

• Test leads;

• Test programme managers;

• Automated test specialists;

• Test environment managers;

• Heads of testing;

• Performance testers;

• Operational acceptance testers.

Our expert knowledge of the testing market means you recruit the best possible professionals for your business. When a more flexible approach is required, we have developed a range of creative fixed price solutions that will ensure you receive a testing service tailored to your individual requirements.

Our specialist network
Working across a nationwide network of offices, we offer employers and jobseekers a highly specialised recruitment service. Whether you are looking for a permanent or contract position across a diverse range of skill sets, business sectors and levels of seniority, we can help you.

Tailored technical solutions
With over 5,000 contractors on assignment and thousands of candidates placed into very specialised permanent roles every year, we have fast become the pre-eminent technology expert. Our track record extends to all areas of IT and technical recruitment, from small-scale contingency through to large-scale campaign and recruitment management solutions.

Unique database of high calibre jobseekers
As we believe our clients should deal with true industry experts, we also deliver recruitment and workforce related solutions through the following niche practices:

• Digital;

• Defence;

• Development;

• ERP;

• Finance Technology;

• Infrastructure;

• Leadership;

• Public, voluntary and not-for-profit;

• Projects, change and interim management;

• Security;

• Technology Sales;

• Telecoms.

We build networks and maintain relationships with candidates across these areas, giving our clients access to high calibre jobseekers with specific skills sets.

To speak to a specialist testing consultant, please contact: Sarah Martin, senior consultant, Tel: +44 (0)1273 739272 Email: [email protected] web: hays.co.uk/it

web: hays.co.uk/it Email: [email protected] Tel: +44 (0)1273 739272

Hays
Experts in the delivery of testing resource

Setting the UK standard in testing recruitment
We believe that our clients should deal with industry experts when engaging with a supplier. Our testing practice provides a direct route straight to the heart of the testing community. By engaging with our specialists, clients gain instant access to a network of testing professionals who rely on us to keep them informed of the best and most exciting new roles as they become available.


Having this issue’s Last Word, testing training consultant Angelina Samaroo has got ‘fun’ on her agenda as she contemplates the serious business of testing and declares today to be Tester Happiness Day.

HappYness, not TestYness

As testers, we were fully occupied with dates a few years back at the turn of the Millennium, hunting down the Y2K bug. And seeing as technology now allows us to have all this fun with our computerised gadgets and gizmos, we must have done a good job, notwithstanding the odd glitch here and there.

The job of a tester is not about fun though, of course; it is a very serious business. So if I can depart from my remit to ‘keep it light’ just for a sentence or two, we carry the weight of responsibility, of hope that we left no bugs un-squished. We were the last to see the system before go-live, and we will be the first in the line of defence on go-dead. No buts.

So, back to the fun; have you heard the one about the man who was lost in some world without mobile phone masts, with only his smartphone for company? How did he survive? Well, he ate the dates from the calendar, and then on Sundays – you know the ending, as does your five year old...

And in the last year, we’ve had many tasty dates to behold. We started last year with 010110. We started this one with 010111. Last October we had 011010, then 101010; all these ones and zeros; yes or nos; true or falses. As a tester, this is easy to check. The happiest times at work for me were the times when I was just a tester – just as Dave Whalen has said in this very column in past issues*. The times as a tester when I understood exactly what the system was supposed to do. I had a properly written spec, a fully working test environment, test data under my control (it was my test after all), and a test I had spent days designing. I could turn the specification this way and that, figuring out how to wake up the sleepy spider so I could catch him mid-crawl, and then I would catch him. Bliss!

All this talk nowadays of budgets, timescales, writing this strategy and that plan are all necessary evils – to tell everyone that we know what we’re doing – but nothing is quite as satisfying for me as rolling my sleeves up and doing the job, or showing someone else how to do it. One poignant moment in my career springs to mind; I had completed a comprehensive one-to-one training programme that lasted three weeks. I did as I was told, and got through the tests. The next morning I was armed with all that knowledge and ready to prove my worth. The clue to my impending downfall was the smirk on my boss’s face, as I would recall later that day.

I started to run the tests, and then the equipment started bleeping at me, I had done something wrong, but what? In training, I was taught how to pass, not how to fail! I was unaware that they were running a book upstairs on how long it would be before the phone started ringing. Apparently they did this with every recruit – you have got to have a bit of fun after all, and fun they had! And what fun it was for me afterwards, when I was the trainer, well, after the panic attack, with my boss teaching me in the morning and me teaching my charge in the afternoon – nothing like flying (or sinking) by the seat of your pants.

So back to significant dates. On 3.14 (14th March) we now apparently celebrate Pi day. I confess, I don’t much like this one. Pi = 3.142 etc, etc, and the etceteras can go on and on into trillions of figures, into infinity. I can’t imagine there’s much fun to be had in finding out what the digits should be. In mathematics there is undoubtedly great beauty in the absolute proof, but the journey to get there is an arduous one and not much fun for us, not on Tester HappyNess Day. Join our petition now to have this declared a national holiday – we’re of age and they should know it.

* Dave Whalen returns next issue.

Angelina Samaroo, Managing Director, Pinta Education – www.pintaed.com
