The June-July 2010 issue of TEST Magazine
Page 1: TEST Magazine - June-July 2010

Inside: Small-scale testing | Reporting | Enhanced application testing

Phil Kirkham tackles technical debt

DEALING WITH DEBT

Visit T.E.S.T online at www.testmagazineonline.com

In Touch With Technology

The European Software Tester | Volume 2: Issue 2: June 2010

Inside:

16-page T.E.S.T Digest


Page 2: TEST Magazine - June-July 2010

www.testmagazineonline.com

Page 3: TEST Magazine - June-July 2010


A survey of testers and developers, conducted by Sogeti, found that the need for software testing is increasing as businesses face pressure to develop more sophisticated applications in shorter timeframes. The survey highlighted a need for more investment – I think this will be a familiar plea over the coming months and perhaps years – the big question is: can testing put its case well enough to buck the trend and secure that essential funding?

With the change in Government in the UK and the emphasis now firmly on cuts to reduce the budget deficit, the clamour for funding may well be all the more intense in the public sector. In this issue we speak to Andrew Griffiths at NHS Wales about their efforts to enhance quality assurance, reduce development time and create applications that are right first time – surely cutting costs in the process. Testers in both the public and private sector are going to have to shout that bit louder to get the message out that thorough and early testing saves cost in the long run.

Organisations are looking for ways to eliminate the risk of launching poorly-tested applications, yet over half of the IT professionals surveyed by Sogeti said their companies did not spend enough on testing. How many testers, I wonder, think that their companies and clients do spend

enough though? Budgets are rarely if ever sufficient in the eyes of those receiving them, even in good times and lest we forget, many businesses in the UK (and I suspect Europe and the wider world) were already operating as pretty lean organisations before the recent recession. But I think the point is well made, the emphasis is put on development while testing comes a poor second.

The survey also found that the majority of respondents (73 percent) consider there to be a shortage of software testing skills in the UK. The survey was taken at TestExpo, which I suspect attracts patronage from further afield than just the British Isles, so can we assume that the trend stretches at least to Europe and North America? As a humble journalist, I’m not sure, but there would appear to be no shortage of ‘offshored’ testing skills to take up any slack.

And when asked about their preference for offshoring or onshoring, responses, as you might expect, were mixed. Over a third (34 percent) said that they currently had some software testing conducted offshore, nine percent said they planned to offshore, and 52 percent said they didn’t plan to offshore any software testing in the future.

Will financial expediency drive more people offshore? That remains to be seen and there are arguments for and against. I suspect that – as in so much of the business world – it is a case of horses for courses.

One final thing; check out the Digest section at the back of the magazine. I hope this will become an annual summer fixture in T.E.S.T where vendors can air some of their theories and solutions.

Until next time...

Matt Bailey, Editor

Leader | 1

Shouting that bit louder


Editor: Matthew Bailey, [email protected], Tel: +44 (0)203 056 4599

To advertise contact: Grant, [email protected], Tel: +44 (0)203 056 4598

Production & Design: Dean Cook, [email protected]; Barrington, [email protected]

Editorial & Advertising Enquiries: 31 Media Limited, Media House, 16 Rippolson Road, London, SE18 1NS. Tel: +44 (0) 870 863 6930, Fax: +44 (0) 870 085 8837, Email: [email protected], Web: www.testmagazineonline.com

Printed by Pensord, Tram Road, Pontllanfraith, Blackwood. NP12 2YA

© 2010 31 Media Limited. All rights reserved.

T.E.S.T Magazine is edited, designed, and published by

31 Media Limited. No part of T.E.S.T Magazine may be

reproduced, transmitted, stored electronically, distributed,

or copied, in whole or part without the prior written

consent of the publisher. A reprint service is available.

Opinions expressed in this journal do not necessarily reflect

those of the editor or T.E.S.T Magazine or its publisher,

31 Media Limited.

ISSN 2040-0160

Published by: 31 Media Limited


Page 4: TEST Magazine - June-July 2010

SUBSCRIBE TO T.E.S.T.

The European Software Tester | In Touch With Technology

Simply visit www.testmagazine.co.uk/subscribe or email [email protected]

Published by 31 Media Ltd, www.31media.co.uk
Telephone: +44 (0) 870 863 6930
Facsimile: +44 (0) 870 085 8837
Email: [email protected]
Website: www.31media.co.uk

*Please note that subscription rates vary depending on geographical location

[Full-page advert showing covers of previous issues: Volume 1, Issue 3 (September 2009), "Bridging the gap", James Christie takes the agile approach; Volume 1, Issue 4 (December 2009), "I hate Agile!", Dave Whalen takes on the cult of Agile; Volume 2, Issue 1 (March 2010), "High heels in high tech", Devon Smith with a woman's view of a hi-tec industry; Volume 2, Issue 2 (June 2010), "Dealing with debt", Phil Kirkham tackles technical debt.]

Page 5: TEST Magazine - June-July 2010


Contents | 3

1 Leader column A recent survey says there’s a shortage of testers in the UK and that belts are tightening.

4 Cover story – Dealing with debt Phil Kirkham looks at the thorny issue of technical debt and assesses the pros and cons

of being technically ‘in the red’.

8 The benefits of networking Independent testing consultants Lynn McKee and Nancy Kelln make the case for taking

every networking opportunity offered at industry events, and offer some tactics for

learning directly from the thought leaders in the testing industry.

10 Constant change In a changing world, tester extraordinaire Angelina Samaroo attempts to get to grips

with what current events mean for the testing community.

14 The importance of reporting Quality reports are crucial if you want to know how you are doing. Inés Smith, QA

director at a major bank, assesses the benefits of some reporting tools and what they

can deliver in the software arena.

18 Too small to test? Dennis Gurock asks if test management for small software shops is overkill or a crucial

part of the development process.

22 Enhancing application testing in the NHS NHS Wales introduces an application testing framework to improve patient safety,

enhance quality assurance, reduce development time and create applications that are

right first time. Andrew Griffiths, chief operating officer at Informing Healthcare reports.

T.E.S.T Digest

A 16-page round up of state-of-the-art products, processes and

opinion from the vendor sector.

42 T.E.S.T Directory

48 The Last Word – Dave Whalen There is a school of thought in software testing that debunks the value of positive

testing. This school basically states that any test that does not produce a defect is

not a good test. Dave Whalen respectfully disagrees.

CONTENTS | JUNE 2010


SUBSCRIBE TO T.E.S.T.

Page 6: TEST Magazine - June-July 2010

4 | T.E.S.T cover story

Phil Kirkham, test consultant at Acutest, looks at the thorny issue of technical debt and assesses the pros and cons of being technically in the red.

Dealing with debt


Page 7: TEST Magazine - June-July 2010


T.E.S.T cover story | 5

“If this works I’ll eat my hat.” No, that’s not what I say under my breath when I’m about to test the latest

release. It goes back to my days as a programmer, assigned to take over a project where I found this comment in the code. Not the most encouraging start but luckily for the original writer of the code it seemed to work. So there was no demonstration of chapeau consumption!

As the requests for enhancements came in, and as bugs were found, the code got messier and more tangled. Many times we thought about a complete re-write but there were no unit tests covering this code. In fact there were no automated tests at all, so we were too scared to change it. Enhancement requests had to be turned down and the salesmen must have done some smooth talking, as I’m sure they wouldn’t tell the customer that the code was too complex to change. Ironically the program was called ‘FutureProof’.

This messy code had several knock-on effects. One was on my morale and I dreaded having to dive into the code. Another was on the release schedule of the company. If the release involved this dreaded code then it would take longer than expected. Another side-effect was that it meant less of my time was available for work on new products. The problem with code such as this is that it suffers from technical debt.

Technical debt

Steve McConnell (author of Code Complete, Rapid Development) says: “One of the important implications of technical debt is that it must be serviced, ie, once you incur a debt there will be interest charges. If the debt grows large enough, eventually the company will spend more on servicing its debt than it invests in increasing the value of its other assets. A common example is a legacy code base in which so much work goes into keeping a production system running (ie, servicing the debt) that there is little time left over to add new capabilities to the system. With financial debt, analysts talk about the ‘debt ratio’, which is equal to total debt divided by total assets. Higher debt ratios are seen as more risky, which seems true for technical debt, too.”
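McConnell’s ‘debt ratio’ can be made concrete with a back-of-the-envelope calculation. The sketch below is purely illustrative: expressing technical debt as estimated remediation effort is an assumption made for the example, and the figures are invented, not taken from McConnell or from any real project.

# Illustrative sketch of the 'debt ratio' analogy applied to a code base.
# The inputs are hypothetical planning estimates, not measurements.

def technical_debt_ratio(remediation_effort_days, total_investment_days):
    # Debt ratio = total debt / total assets; here, the estimated effort to fix
    # known design and code problems divided by the effort invested so far.
    return remediation_effort_days / total_investment_days

ratio = technical_debt_ratio(remediation_effort_days=120, total_investment_days=400)
print("Technical debt ratio: {:.0%}".format(ratio))  # 30 percent of the investment is owed back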

If code were subject to quality ratings a la Standard and Poor then surely this program would be classed as junk. Everyone who has worked in the IT industry knows about the problems of legacy code – but all legacy code started off as shiny new development. A study found that new graduates at Microsoft spent most of their first year reading code rather than writing it. More time is spent maintaining code rather than writing new code. Why not make this process easier and more efficient by having well written and well designed code from the start?

The path to the dark side

Fast forward several years and I had moved to the ‘Dark Side’, earned my chops as a tester and I am now working as a test consultant. A quick look at the website of the consultancy I work for, Acutest (www.acutest.co.uk), shows the emphasis we put upon the change process – reducing the cost of change, increasing the speed of change and improving the governance of change.

All of these can be related to technical debt. If the debt is low then changes can be made without fear of impact, changes can be made fast and the effect of these changes is not a leap into the unknown. This doesn’t come free though.

Back to the comment in the code that started this article: “If this works I’ll eat my hat.” How could the person have had the confidence that his code did work? A recent blog post by Jason Gorman explains how: “when I write code I routinely achieve very high test assurance. We're not talking 70 percent of the code or even 80 percent. No, we're looking at a truly obscene 95 percent or higher. And I'm not just talking about code coverage here. No, I graduated from that years ago. This is the hard stuff. I actually test my tests. I always run them to make sure they fail when they're supposed to, and I often use mutation testing to seek out any gaps in my tests, and once found, I just can't help plugging them.”
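What ‘testing your tests’ looks like in practice can be shown with a toy example. This is not Jason’s code; it is a minimal sketch, in Python, of the two habits he describes: checking that a test fails when the behaviour is broken, and introducing a deliberate mutation to hunt for gaps in the tests.

# Toy illustration of 'testing the tests' with a hand-made mutant.

def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def test_is_leap_year():
    assert is_leap_year(2000)        # divisible by 400
    assert not is_leap_year(1900)    # divisible by 100 but not 400
    assert is_leap_year(2008)
    assert not is_leap_year(2009)

def mutated_is_leap_year(year):
    # Deliberate 'mutant': drops the century rule. A good test suite must fail here.
    return year % 4 == 0

if __name__ == "__main__":
    test_is_leap_year()                               # should pass against the real code
    globals()["is_leap_year"] = mutated_is_leap_year  # swap in the mutant
    try:
        test_is_leap_year()
        print("Gap found: the mutant survived, so add a test for years like 1900.")
    except AssertionError:
        print("Mutant killed: the tests detect the broken century rule.")

Real mutation testing tools generate such mutants automatically; the point here is only the discipline of asking whether the tests would notice.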

Jason ran a Software Craftsmanship conference in London last year and some recently published books such as Clean Code and Growing Object-Oriented Software, Guided by Tests try to tackle this debt problem. The authors of Growing Object-Oriented Software make distinction between external and internal quality: “Everyone can understand the point of external quality. Internal quality is what lets us cope with continual and unanticipated change which is a fact of working with software. The point of maintaining internal quality is to allow us to modify the system’s behaviour safely and predictably, because it minimises the risk that a change will force major rework.”


Page 8: TEST Magazine - June-July 2010


6 | T.E.S.T cover story

See how this lines-up with the change approach mentioned earlier? Well designed, maintainable code enables changes to be made quickly and with confidence. From a testing point of view, maintainability, use of good design principles and testability are usually the qualities that are measured least. This could be partly due to ignorance of how software is written and also because it’s one of the hard things to measure.

Going into technical debt

Sometimes it makes sense to go into technical debt. Martin Fowler has a technical debt quadrant (see sidebar) which explains the differences between deliberate and inadvertent, reckless and prudent debt.

One of the ways that prudent developers are reducing technical debt is with techniques such as Test Driven Development (TDD). The traditional model of defect prevention shows the cost of finding a defect rises the further along the development lifecycle it is found. With TDD it is possible to find the defects really fast as the first acceptance tests are written before any code. Good news for testers, who complain such tests are used too late in the lifecycle. Now is the chance for them to be used correctly from the beginning. As testers we want more code like this and motivated developers like Jason. Instead of finding those easy boundary bugs and finding that new releases of code have gone backwards we can concentrate on the hard to find bugs, the usability and user experience.
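For readers who have not seen the test-first rhythm, here is a deliberately small, made-up illustration (the function and the figures are invented for this article, not taken from any project): the test exists, and fails, before the code that satisfies it is written.

# Minimal test-first example: the tests below were written and run red first,
# then just enough implementation was added to make them pass.
import unittest

def net_price(gross, discount_percent):
    if not 0 <= discount_percent <= 100:
        raise ValueError("discount must be between 0 and 100")
    return round(gross * (1 - discount_percent / 100), 2)

class NetPriceTest(unittest.TestCase):
    def test_ten_percent_discount(self):
        self.assertEqual(net_price(200.0, 10), 180.0)

    def test_invalid_discount_rejected(self):
        with self.assertRaises(ValueError):
            net_price(200.0, 150)

if __name__ == "__main__":
    unittest.main()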

I mentioned that technical debt can be hard to measure – TDD can help with this. I also mentioned that reducing technical debt does not come free. The Growing Object-Oriented Software book mentions a case where a developer was faced with unreadable tests, up to 1,000 lines long test classes, and refactoring leading to massive changes in test code.

Test-driven development can be unforgiving. Poor quality tests can slow the pace of development to a crawl, and poor internal quality of the system being tested will result in poor quality tests. By being alert to the internal quality feedback we get from writing tests, we can nip this problem in the bud, long before our unit tests approach 1,000 lines of code, and end up with tests we can live with, ie, if you are having trouble writing tests for your code then it’s a sign that there could be problems in the design.

Debt and automated testing

Testers writing automated tests should also take technical debt into account – are their tests maintainable and well designed? Is it hard to write tests – another sign of poor design?

“Only potential survivor, the fabulous Fab... standing in the middle of all these complex, highly leveraged, exotic trades he created without necessarily understanding all of the implication of those monstrosities.” Email from Fabrice Tourre of Goldman Sachs.

Trying to deal with messy code is like auditors trying to unravel the infamous CDOs that precipitated the current financial crisis. The long term results of those have been a disaster. Make sure you are aware of the debt that the project is taking on.


Phil Kirkham, test consultant, Acutest, www.acutest.co.uk

Page 9: TEST Magazine - June-July 2010

T.E.S.T cover story | 7

[Advertisement: Site Confidence, an NCC Group company. For web site monitoring & load testing: call 08445 380 127, email [email protected], www.siteconfidence.com. Over half of the UK's top 50 e-commerce sites* test with Site Confidence. Shouldn't you? *Source: comScore Media Metrix (Dec 09)]

Technical debt

The term technical debt was coined by Ward Cunningham to describe the obligation that a software organisation incurs when it chooses a design or construction approach that's expedient in the short term but that increases complexity and is more costly in the long term. Martin Fowler expanded on this to come up with a Technical Debt Quadrant.

“The debt metaphor reminds us about the choices we can make with design flaws. The prudent debt to reach a release may not be worth paying down if the interest payments are sufficiently small – such as if it were in a rarely touched part of the code-base. So the useful distinction isn't between debt or non-debt, but between prudent and reckless debt.

“Not only is there a difference between prudent and reckless debt, there's also a difference between deliberate and inadvertent debt. The prudent debt example is deliberate because the team knows they are taking on a debt, and thus puts some thought as to whether the payoff for an earlier release is greater than the costs of paying it off. A team ignorant of design practices is taking on its reckless debt without even realising how much hock it's getting into.

“Reckless debt may not be inadvertent. A team may know about good design practices, even be capable of practicing them, but decide to go ‘quick and dirty’ because they think they can't afford the time required to write clean code.”

Ward Cunningham explains technical debt:

http://c2.com/cgi/wiki?WardExplainsDebtMetaphor

Martin Fowler on technical debt:

http://martinfowler.com/bliki/TechnicalDebt.html

Fowler's Technical Debt Quadrant:

Deliberate and reckless: "We don't have time for design."
Deliberate and prudent: "We must ship now and deal with consequences."
Inadvertent and reckless: "What's layering?"
Inadvertent and prudent: "Now we know how we should have done it."

Links:

Jason Gorman’s Dirty Secret: http://parlezuml.com/blog/?postid=880

Steve McConnell on technical debt: http://blogs.construx.com/blogs/stevemcc/archive/2007/11/01/technical-debt-2.aspx

Matthew Heusser (lead organiser of the agile-alliance sponsored 2008 technical debt workshop) technical debt series: http://blogs.stpcollaborative.com/matt/category/technical-debt/

Bibliography:

Code Complete 2: http://www.amazon.co.uk/Code-Complete-Practical-Handbook-Construction/dp/0735619670

Clean Code: http://www.amazon.co.uk/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882

Growing Object-Oriented Software Guided by Tests: http://www.growing-object-oriented-software.com/

Page 10: TEST Magazine - June-July 2010


8 | Testing Events

Independent testing consultants Lynn McKee and Nancy Kelln make the case for taking every networking opportunity offered at industry events, and offer some tactics for rubbing shoulders with your peers and learning directly from the thought leaders in the testing industry.

The benefits of networking

Software testing conferences are intended to be highly valuable, offering numerous sessions, workshops,

keynote presentations, etc on new trends, tools, and techniques. There is also great potential for networking with peers and rubbing elbows with some of the industry’s thought leaders. How many of us are taking advantage of this networking opportunity? For many, the short breaks between sessions, lunch hour meal and evening socialising events are our primary opportunities to check on emails, make important calls, or just take a mental break. The concept of ‘networking’ sounds great but just doesn’t manage to happen.

For some, the thought of socialising can be intimidating especially for those who are not comfortable with meeting new folks and generating conversation. Although it may take you out of your comfort zone, the chance to network should be considered a top priority as the lessons that can be learned are just as valuable as the material covered during the conference sessions.

What are the benefits of networking? Aside from the chance to expand your ever growing list of connections on LinkedIn, there are some real tangible benefits that could shift or dramatically change your approach to software testing. We have found some of our great takeaways and “ah-ha” moments

have come from thought-provoking conversations with conference peers and presenters. Many times we have walked away thinking “...hmmm I never thought of it that way...” and find ourselves mulling it over then finding a great opportunity to apply the learning in our own professional day to day life. Conversations outside of actual presentations tend to allow for more extensive questions and comments around “Well how would that work when..?”, “We tried that and then this happened...”, “That worked so well and here is where we are at now...” segues.

Conference networking can happen in a variety of ways. We have found there are three main areas conference

Page 11: TEST Magazine - June-July 2010


Testing Events | 9

networking tends to focus around: session specific, presenter specific and opportunistic.

Session specific networking

Session specific networking is conversation inspired by the content of a specific conference track or workshop. Here are some ideas to generate conversation around session specific content:
• Make mental notes of the attendees of your session in order to follow up with them afterwards.
• Make notes around the session content. What worked for you? What didn’t work for you? Find someone to discuss these with after the session.
• Hang around after the session to see what pockets of conversation generate and take the opportunity to listen in and participate where possible.
• During full- or half-day workshop sessions get involved in the interactive parts of the session. Use this time not only to contribute your thoughts, but also to start building relationships with others in the session.

Presenter specific networking

Presenter specific networking is conversation inspired by the individual delivering the session or workshop. Here are some ideas to generate conversation with presenters:
• Prepare for the conference by reviewing the list of presenters and the topics they are covering to determine who may be of special interest for you to speak with.
• Identify these individuals and seek them out during breaks, lunches, etc. Note some presenters may not be around for the entire conference so it is best to follow up with them as soon as possible, especially while your questions are fresh in your mind. Share your interest in discussing the topic further and most presenters are quite willing to sit down at some point during the remainder of the conference to chat with you.
• Identify other attendees who are also interested in the presenter or attendees who may already know the presenter and seek them out for conversation.
• Get the presenter’s email address or contact information. Some presenters will have this on their first or last page of the presentation, some may hand out business cards, or conference materials may also include contact information. Many presenters enjoy discussing their ideas with conference attendees even long after the conference has ended. This is great support for when you have tried some of the ideas and have run into roadblocks or raised further questions that you would like to discuss.

Opportunistic networking

Opportunistic networking is, as its name implies, simply opportunistic. However, to maximise on this type of networking you need to actively seek out opportunities. Here are some ideas to help you in finding these opportunities:
• At breaks or lunches, free up time from your laptop or blackberry to join in conversations. You may have to start some of these discussions yourself.
• Mill around where the topics or individuals seem to be generating interest for you and even eavesdrop! Don’t be shy. If you see a group of individuals gathering and appearing to discuss topics, join the group and listen. These groups form very informally after sessions and welcome additional listeners or contributors to the group.
• Some conferences offer methods to generate topic-specific networking such as table cards with topics labelled such as “Agile Testing”, setting the tone for the conversation over lunch. Seek out such opportunities to participate in conversations that interest you. Again, don’t be afraid to start these discussions with your group.
• Head out for some socialising at the end of the day. This is a great way to meet other people in the industry. Although there may not be a lot of conference-specific talk at these, it is a great way to add to your LinkedIn Connections list and build a list of people to reach out to in the future.

At this year’s conferences we encourage you to incorporate at least one networking suggestion from above. If you try you may surprise yourself with the results. The industry relationships that can be built at these kinds of events can be invaluable for future discussions or support.

Nancy Kelln, independent consultant, Unimagined Testing, www.unimaginedtesting.ca

Lynn McKee, independent consultant, Quality Perspectives, www.qualityperspectives.ca


Page 12: TEST Magazine - June-July 2010


10 | Testing times

So, I ended my last piece for the T.E.S.T review of the year in December 2009 with these fateful words: “bring it on”. Oh, and they were in capitals too! Well, they were in my version, Matt Bailey clearly saw the risk and tried to mitigate it (‘house style’ is a strict mistress – Ed), but he was too late. In my defence, I was talking to earthly beings, not the Gods! – I didn’t think they took time out to read draft articles to magazines, let alone respond to the challenge!

Seriously though, one prayer we probably all had was for things to get back to normal – snow in winter, showers in April, summer at the seaside, glorious colour in the autumn. Another reasonable expectation would be that you work hard, you play hard, you save some, and it should be all right in the end. But as I haven’t (luckily) reached my end yet, I don’t know if that’s how it will pan out.

On the weather at least, I think the Gods may have listened; they started by giving us back our seasons – well,

winter at least. It was winter as we apparently knew it from those long ago, and somehow happier, days, although that may be the rose tinted glasses. From my point of view, those requirements really do need tightening up – a snow bear appearing overnight on my street as wide as the car and as tall as the house was a work of art to marvel at, but at what cost? Couldn’t get back from that arduous holiday in the sun in time for proper work? Marvellous clearly costs, so do we really want it?

Cloudy outlook

That other expectation of a journey to some happy end is clearly going to be fraught. Just when we thought we understood computing in the cloud, the cloud changed. Iceland spoke once again to the world, this time, not through its earthly banks, but yes, those Gods again. Eyjafjallajökull spoke and we huddled.

My next door neighbour was cleaning his car early one Saturday morning as I made my way to my weekly trashing

Since her perspectives on the industry in our inaugural issue, and even since her update in the 2009 yearly review last December, much has happened. Angelina Samaroo, tester extraordinaire and managing director of Pinta Education, watches as the world changes in front of her face...

Constant change

Page 13: TEST Magazine - June-July 2010


Testing times | 11

at badminton. I noticed on returning an hour later that it seemed no cleaner – his sense of humour was luckily intact – that cloud was as evident on his car as it was on mine – it has a yearly bath, and looked as good as his! My car goes by the name Yaris, Toyota Yaris that is, but Toyota’s quality problems are so ‘last quarter news’ now!

Then crowd sourcing became crowd competing. Competing for a flight, a bus, a taxi, anything to begin the journey home. Not everyone suffered though, a hapless mobile phone user decided to while away the time by listening to his favourite radio programme back home – seven days, 2GB later and a bill of £7,000, ouch! I have to guess he was not a tester, because we all know that we should review everything, including roaming charges – right?

And you must have heard about the taxi fares – in the thousands of euros. That would have left a rather bitter taste in the mouth, and perhaps forced a home vacation this summer – long live Butlin’s? Nothing to review here – requirements clear; if you want a ride home it’s going to cost. It’s all your choice, but don’t hang around, that crowd, isn’t there to help you test, you’re in their way! Not unlike the view of some developers of yesteryear, and today – why do those testers keep putting up those (quality) gates!?

It’s the economy, stupid!

Back to those requirements – in the March issue Hammad Khan talked about putting the user at the centre

of things – this is worth a whole campaign for this new decade. In this W3C world, perhaps we can be indulged and alter its meaning, just for a breath or two. In today’s modern world, we need three things – Communication, Collaboration and Courtesy. Perhaps we can add a fourth – Common Sense. OK, indulge me here, I’m a tester, I know this doesn’t quite work, but is it good enough?

The economy, not willing to relinquish its top spot on our agenda, competed once again for our attentions. This time, we learned not of quantitative easing and sub-prime lending, but the Eurozone. Greece hit the headlines, not for reminding us that London 2012 owes them, but they now owe lots. Thus begins the scramble to keep our A, AA, AA+, AAA ratings, or at the very least not have us mentioned in the same sentence as junk rating – that I suppose is really not good, but I’m nowhere near expert or even knowledgeable in this, so will take my clues from the meaning: junk isn’t what we usually aspire to!

While trying to grasp the ramifications of all of this (should we charge in £, $, € or Yuan, should we do our bit for ourselves as testers and conduct a Fagan Inspection on our travel insurance policy before booking the taxi to the airport – after all, who pays for time lost at work), the planet, not wishing to be outdone in headline grabbing, finds another medium of communication – BP. This time it finds a loud, sustained voice – in print, on big screen, on smart phone, at the White House.


Page 14: TEST Magazine - June-July 2010


Those now familiar superlatives enter mainstream again – unprecedented, biggest, most expensive – these could be us talking about the economy all over again. They pumped trillions into the economy; the planet pumps (albeit not entirely of its own volition) barrels and barrels and barrels into the sea – the local economy, the wildlife, the shareholders, the pension funds – they will suffer from that display of force for a while to come. By the way, if anyone knows how you can pump trillions into an economy in year 1 and in year 1.5 be hugely in debt, please would you write in to Matt and explain! What’s this got to do with testing? Assuming we all have lives, lots sadly.

Back to basics

We then had another first: televised debates in the UK election race. Entertaining though they were, a key point of frustration was when the question of manufacturing output was raised. The answer from all three leaders made some weak mention of IT – firstly, is this now manufacturing, and secondly, where is the leadership from the governments on this? Should they not be guiding schools, universities and businesses as to what the world needs, so that we can decide if our country’s resources can be put to the task? Take up this branch of IT now, and in x years time you will have a skill that will be required, in your country, in your language, in your culture, in your currency.

As we try to navigate our way through this new decade, we should perhaps go back to basics. As testers, remember that when you sign something off, you’re signing to say

that ‘testing shows this’ – not that it will necessarily work in production. There are too many variables in production to have tested them all. As the BP high command shows up in the courts, I wonder whose signatures will be held up to the light.

The vendor community will be playing its bit in the dash to your money. My broadband supplier has been relentless – weekly calls, despite the protest ‘please, leave me alone!’, I can’t ask them to not contact me, they’re my chosen supplier! Then to make matters worse, yesterday’s phone call was trying to offer me a deal, worse than the one I’ve just signed up to! An interesting question was asked though – “can you confirm that you rang to buy the product, not request information about the product?” “No,” I said, very firmly. This decision has been made on the basis of information that you have just supplied. Why? Because distance selling has rules and I don’t want to say anything that might deny me my seven day cooling-off period.

As a tester, when you buy something, use your skills, check what you’re buying, read whatever needs reading, however small the print. In this period of austerity, despite the consumer laws for our protection, let us be reminded, in some cases the old Latin doctrine still applies – caveat emptor – let the buyer beware. And, given the time of this issue, the last words must go to South Africa. We wish you a safe, happy, glorious month of ups, downs, new stars and new starts after the crowds have left. For us, we have just one wish – that football comes home.

12 | Testing times


Angelina Samaroo, managing director, Pinta Education, www.pintaed.com

Page 15: TEST Magazine - June-July 2010
Page 16: TEST Magazine - June-July 2010


14 | Testing reports

Quality reports are crucial if you want to know how you are doing. Inés Smith, QA director at a major bank, assesses the benefits of some reporting tools and what they can deliver in the software arena.

The importance of reporting

Reporting has always been a standard and perhaps the most salient requirement for any business. In recent years such requirements have been led by legal necessities such as the Sarbanes–Oxley (SOx) and PCI legislation, or audit, where the impacts of non-compliance range from fines to jail terms and include the harsh reality that failure to comply will ultimately impact the organisation’s public image.

It has been statistically proven that 75 percent of organisations must comply with two or more regulations and corresponding audits and more than 40 percent must comply with

three or more regulations. However, this was not the sole reason to increase the profile of the reporting; put simply, the increased growth of information technology resulted in the increase in demand for unified reports which join different views in one place. This allows the corporations, the divisions and the departments within them to take a pro-active rather than re-active approach based on the simple assumption that the earlier a problem is known the sooner it can be addressed.

Risk mitigation and management

In the early years, IT reporting stemmed not only from a desire

Page 17: TEST Magazine - June-July 2010


Testing reports | 15


for companies to take a proactive approach, but from the dire need to demonstrate the applied due diligence in order to avert or mitigate the impact of a potential legal action. In simple terms if a company was not checking their product or service quality they could be found guilty of negligence under the Tort law. Reporting mechanisms were created to quality check the product before its release and the industry/art of ‘Quality Assurance’ (QA) was born. The results were remarkable - reports were supplemented by the industry standards stipulating how the information must be displayed. The detail of the report was driven by the law, legal precedents and a QA standards body driven by IEEE.

The benefits of reporting

The obvious question from management is “How are we doing?” and the obvious area for reporting is in IT, aka Management Information Systems (MIS). These systems hold all the information which is the main ingredient for any report. The data is specifically held in databases which are designed to hold and retrieve information effectively and efficiently. It did not take long for generic IT reporting applications to start appearing on the market.

One of the market leaders today is a product called Business Objects. This product connects to existing databases and then provides for reports to be configured / developed to retrieve, calculate and manipulate data into whatever report is required. As expected, every product comes with its limitations. The most common generic reporting limitation is the complexity of the data ‘as it is stored’. Some database storage schemas are just not friendly to standard requests to retrieve the data in a simple manner.
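To make the ‘retrieve, calculate and manipulate’ point concrete, here is a minimal sketch of the kind of query a reporting layer ends up running. The table, columns and data are invented for illustration, and this is plain SQL via Python’s sqlite3 module, not the Business Objects product itself.

# Hypothetical report query against a made-up defect-tracking schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE defects (id INTEGER, project TEXT, severity TEXT, status TEXT)")
conn.executemany("INSERT INTO defects VALUES (?, ?, ?, ?)", [
    (1, "FutureProof", "high", "open"),
    (2, "FutureProof", "low", "closed"),
    (3, "Payments", "high", "open"),
])

# The 'report': open defects per project and severity.
rows = conn.execute("""
    SELECT project, severity, COUNT(*) AS open_defects
    FROM defects
    WHERE status = 'open'
    GROUP BY project, severity
    ORDER BY project, severity
""").fetchall()

for project, severity, count in rows:
    print(project, severity, count)

The limitation described above shows up as soon as the real schema is less obliging than this one: the joins and translations needed to answer a simple question are where the reporting effort goes.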

From text to graphics

There was a time when only text reports were available, report images were not normal and people read books. Today things have changed. Most people prefer picture presentations with a small amount of detail. Presentations in some areas have even moved over to videos. Certainly the media web sites now have movies on all the important news events.

Studies have shown that people mostly read headings, captions above or below a picture and bullet points.

Details in the text are often skimmed over or not read at all.

Reporting has also followed this trend. The requirement is for graphical presentations of information or even interactive graphical presentations to cover a point with a small amount of text. Management prefer dashboards which are colour coded to show up only on warnings and not bother them if things are running ‘business as usual’.
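Those colour-coded dashboards usually boil down to a handful of thresholds. A minimal sketch follows; the pass-rate metric and the thresholds are purely illustrative and would be agreed per project.

# Simple red/amber/green flag for a dashboard tile (thresholds are arbitrary).

def rag_status(tests_passed, tests_run, green_at=0.95, amber_at=0.85):
    if tests_run == 0:
        return "GREY"    # nothing executed yet; surface that rather than hide it
    pass_rate = tests_passed / tests_run
    if pass_rate >= green_at:
        return "GREEN"
    if pass_rate >= amber_at:
        return "AMBER"
    return "RED"         # only amber and red need to bother management

print(rag_status(192, 200))  # GREEN, 96 percent passing
print(rag_status(170, 200))  # AMBER, 85 percent passing
print(rag_status(150, 200))  # RED, 75 percent passing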

In 1989 a testing company saw the market need for a testing and audit tool. The product was the only one available on the market of its kind and remains so today. The test suite became the industry standard; the company was Mercury Interactive. Their test suite product range has dominated the market for many years. The company started with the goal to fill the market need for a test tool. At that stage there was no need or legal requirement for the reporting. The reporting functionality of the test tool suite left much to be desired.

New era for IT product quality

As time went on from the 1989 start for Mercury, new legislation was introduced into the market which drove the need for companies to have better reports and audits. The legislation was first and foremost for the finance industry but as most companies use financial transactions the legislation also applied to them. Unfortunately QC (Quality Center) did not respond to this need specifically; however, the product did change to allow for self-made and customised reports, though these came with the need for highly technical skills to develop reports with in-depth knowledge of the QC database. In spite of this the reports were still created and constantly maintained. Resources were set aside to continue to extract and maintain the reports. Extraction or compiling of the latest up-to-date report was usually the full time job of a few people. This is an expensive overhead of niche market skill for any company.

Industry standards

Many of the large, and often rich, companies started using the tool somewhere in their IT division. In the normal operations of IT, its new and evolving software was tested for quality. Reports were generated off that test information to meet the finance legislation and also their own management. It became the standard also for testers in that they were

Page 18: TEST Magazine - June-July 2010


familiar with the Mercury test suite and no other test tools. Hiring was then easier: those who knew the tool would not need training. The market recognised the potential of Mercury Interactive and HP responded with a buyout in July 2006. The need for QA was clear.

As QA on IT product delivery evolved so did its reporting needs. The industry currently seems obsessed with the desire to have ‘product and business requirements’ traced through to the ‘delivered product’. Simply put: “Can the reports we have show that what we produced is what we wanted and meets our business requirements?” Sadly, the answer is no. There was nothing out there to show the trace, in a report, to the delivered product. Many hours of manual tracing were done. The tool of choice was Microsoft Excel.
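What those hours in Excel amount to is a traceability matrix. The sketch below shows the idea with invented requirement and test identifiers; in practice the matrix would be generated from the test management repository rather than typed in by hand.

# Toy requirements-to-tests traceability check (all identifiers are invented).
requirements = ["REQ-001", "REQ-002", "REQ-003"]

# Which tests claim to cover which requirement, and their latest result.
coverage = {
    "REQ-001": [("TC-101", "passed"), ("TC-102", "passed")],
    "REQ-002": [("TC-201", "failed")],
    # REQ-003 has no tests at all; exactly the gap a trace report should expose.
}

for req in requirements:
    tests = coverage.get(req, [])
    if not tests:
        print(req, "NOT TRACED: no test case covers this requirement")
    elif all(result == "passed" for _, result in tests):
        print(req, "covered and passing by", len(tests), "test(s)")
    else:
        failing = [tc for tc, result in tests if result != "passed"]
        print(req, "covered but failing:", ", ".join(failing))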

A new approach

In the years that followed a new report product was created for the market. The product works exclusively with HP Quality Center. It removed the need for companies to have expensive resources with niche market skills to develop and maintain reports, and interactive graphs, for Quality Center. The QCReporting product carries no maintenance overhead and allows all employees to see the details of the project.

Now companies have visibility of exactly what is happening on their quality front, whenever they want, without having to ask anyone; and if there is another report they want, they do not need to pay someone or assign time to deliver it. Now there are no ‘altered’ reports and no way to hide the project’s dirty laundry.

A huge saving appeared on some budgets as they were able to take evasive action from a bad situation at

the very first sign of it instead of being told that “everything is ok” until the last minute.

The offshoring trend

There have been trends with companies to move operations offshore. This may have started with a few call centres but has increased to every market sector over the years. Some companies have moved operations back to their country as they have perhaps not seen the savings they had accounted for.

There seems to be a division in response to offshoring. There are the companies who know exactly what transpires away from home and those who don’t. Logically, if management has very clear visibility then they are better able to manage. In essence visibility means better reporting; both accurate and up-to-date. Some companies don’t believe they will get the reports they need for proper management so have chosen an approach of better QA of the delivered product and better reports on the results of the QA outcome. Ironically the savings made offshore seem to be offset by the increase in QA onshore. The best approach is better reporting on both sides.

Meeting market demands

In the current market there are few products which can fulfil all corporate needs for reports. Some have tried and made a good common denominator approach, such as Business Objects. However, if it is IT QA reporting and the company takes their product delivery seriously, then testing products such as HP QC and QCReporting would be the best match for an overall solution. Currently these two products, working together, meet most of the market demands.

16 | Testing reports


Inés Smith, QA director

Page 19: TEST Magazine - June-July 2010

An independent UK study revealed that although the need for business agility is driving the growth in spending for Application Quality Management (AQM) solutions, 84% of QA Managers and Application Development Managers are not having their needs met by their current solutions.

At Original Software, we have listened to market frustrations and want you to share in our visionary approach for managing the quality of your applications. We understand that the need to respond faster to changing business requirements means you have to adapt the way you work when delivering business-critical applications.

Qualify is our new solution that aids business agility and provides an integrated approach to solving your software delivery process and management challenges.

Don’t let application quality management limitations hamper your business agility…

Find out more by visiting: www.origsoft.com/business_agility


Page 20: TEST Magazine - June-July 2010


18 | Testing tools

Before I started my own company I once worked as a programmer for a small shop that built software for the

health care industry. Back then, there were just a handful of programmers and a handful of testers who worked on the upcoming flagship product the company was going to sell. All team members were highly motivated and enjoyed the project they were working on.

And why wouldn’t they? The work was interesting enough and the managers made sure that everyone got the tools they needed to accomplish their work. The developers were allowed to play with and use the latest technology when appropriate (developers love learning new stuff, it's a great motivator but you don't want them rewriting your software every six months). Testers could manage their own time and were given the freedom to test the software as they saw fit.

Despite all the highly motivated team members and the interesting work, the resulting software was a disaster. It regularly crashed,

developers ‘forgot’ to implement certain requirements and the testing team didn’t catch obvious defects customers found the first day they used it.

And so the blame game began. The developers were quick to point out that the quality assurance team didn’t test the software properly. “They're just playing around with the software and only test the most obvious functionalities” they said. The testers blamed the developers, saying they were rushing out new code to meet the project deadline without adhering to quality standards. “We are also never informed about new functionality and changes that are introduced in new builds of the software” they further complained, referring to the changelogs that the development team failed to produce.

So was the project cancelled? Absolutely not; in fact, the project turned out to be one of the most successful products the company ever produced. But how did the team turn the ship around and fix the project?

Turning the ship around

The developers and testers had all but completely stopped communicating. If the testers had to get some information about a new software build or needed some details about a project change, they asked the project lead to speak with the developers. Similarly, if a software developer needed more information about a software defect that was reported by a tester, the developer would nag the project lead about this.

Clearly, things couldn’t continue this way or the project would be cancelled, and soon! Having a critical project cancelled for a small software company that had already invested a lot of resources and money would obviously have been a disaster. No one was interested in looking for a new job, so something had to change.

The team got together to discuss the future of the project. Everyone wanted to get the project fixed and make it a success. After all, people really enjoyed working on the product and believed it could become successful with

Dennis Gurock, co-founder of Gurock Software, asks if test management for small software shops is overkill or a crucial part of the development process.

Too small to test?

Page 21: TEST Magazine - June-July 2010


Testing tools | 19


customers if the damned thing worked and didn’t crash every five minutes.

Instead of blaming each other, the team tried something new. Everyone agreed that something needed to change, so we identified the main challenges we faced and brainstormed ideas on how to fix them. After discussing, brainstorming, meeting and playing ping pong for the better part of a week (ping pong helped us be more creative, I swear!), we came up with the following points.

Lack of leadership

Everyone on the project team was well aware of the fact that there wasn’t the leadership to make the project a success. While no one was interested

in being micro-managed, there wasn’t a unified vision for the project nor was there a real project leader who made sure everything was going smoothly.

The lack of leadership was especially apparent for the testing team. The person in charge of the testing efforts just wasn’t the kind of person you wanted leading a team and motivating team members. He was definitely a great tester, but he wasn’t comfortable with his new role when he was promoted at the beginning of the project. It showed.

To improve this situation, the existing project lead was given more authority to make decisions by himself without having to consult with the executive team for every adjustment to the project plan. Additionally, a new test manager was appointed and was given the task of reorganising the testing efforts and refocusing the quality assurance strategy. The existing test lead was reassigned to the role of a senior tester. He was clearly happy and relieved about this change.

No processes

I'm not a fan of heavyweight processes. Micro-managing people is a great way to kill creativity and productivity, and to make sure that people don't enjoy their work. But having no processes at all is no alternative. For this particular project, we had been struggling with a few key issues that could be improved by introducing a few processes and methods aligning the goals of everyone on the team.

One of the key challenges the test team faced was ensuring that all major project parts and application modules were properly tested and that all requirements were verified. Another thing the quality assurance team struggled with was the fact that they didn’t know the progress of their tests, couldn’t track their own activities and, of course, never had a very good understanding of the overall quality of the project (except for the number of defects they reported each week; a metric that is rather useless without context).

The first change the new test manager introduced was implementing a test management system and reorganising the testing process. From now on all testing activities were going to be tracked, there would be documented test cases for all major requirements, and weekly stand-up meetings would be held to ensure that everyone was on the same page.

Figure 1: Organising testing efforts with a test management tool (in this case Gurock Software's TestRail).

Figure 2: Test cases are used to document all necessary steps to verify functionality and requirements.

Figure 3: Measuring and tracking software tests can be vital for a project's success.
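To make 'tracked' concrete, here is a minimal sketch – my illustration with hypothetical field names, not Gurock Software's implementation – of the kind of record and roll-up a test management system keeps for a team.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """A documented test case and the outcome of its latest run."""
    title: str
    requirement: str           # the requirement this case verifies
    status: str = "untested"   # "untested", "passed" or "failed"

def progress(cases):
    """Roll the individual statuses up into simple progress figures."""
    executed = sum(1 for c in cases if c.status != "untested")
    passed = sum(1 for c in cases if c.status == "passed")
    return {"executed %": 100 * executed / len(cases),
            "passed %": 100 * passed / len(cases)}

suite = [
    TestCase("Create order", "REQ-12", "passed"),
    TestCase("Cancel order", "REQ-13", "failed"),
    TestCase("Order history", "REQ-14"),
]
print(progress(suite))  # {'executed %': 66.66..., 'passed %': 33.33...}
```

Even a roll-up this simple answers the two questions the team could not answer before: how much of the planned testing has actually been run, and how much of it passed.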

Wrong culture

Last but not least, all team members agreed that the culture needed to change in order to make the project a success. Up to that point, software quality had been treated as an afterthought. The developers would do their thing and the testing team's task was to ensure that the software 'had no major issues'. Of course it never works that way.

If software quality isn’t treated as an explicit project goal that all team members recognise and adhere to, a high standard of quality is difficult to achieve. To make sure that the team was able to achieve its new goal of improving the quality of the software, the project schedule and roadmap were altered to grant developers more time to write solid code. At the same time, testers were given more resources to test new code and execute their newly laid out test plan and strategy.

Slowly but steadily things started to improve and new versions of the software were released. Customers were starting to use the software successfully in production environments and the number of customer-reported issues went down considerably. Each release of the software improved the overall quality of the product and introduced critical missing features that customers had demanded.

If you noticed the title of this feature, you might expect me to say that simply organising the testing efforts and introducing the right tools fixed all of the team's problems. But that was of course not the only reason the team managed to get the project on track again. It was the combination of improving the leadership, introducing lightweight processes and fixing the culture that made all the difference.

From the standpoint of the testing team, introducing a test management system and reorganising the testing efforts had the biggest impact. Before the tool was introduced, the team couldn't track their testing efforts or ensure that all functionality had been properly tested. Now that the new system was in place, all this was a breeze. The testers had more time to concentrate on the actual testing, learning more about the inner workings of the software and, of course, beating the programmers at ping pong.

This and similar stories convinced me and my business partner to start working on test management software to help software teams improve their testing efforts. We are well aware that most current test management tools, with their heavyweight processes, clumsy user interfaces and irritating application structures, are not a great help in most cases. So we designed and built a product with a modern and fresh approach to test management. The result is a web-based test management tool called TestRail that promotes lightweight processes, comes with a fast and friendly user interface and allows teams to work the way they are most comfortable.

Can small software shops benefit from a test management system? Absolutely. I have seen this first hand with many customers who managed to improve the quality of their software and optimise their testing efforts by introducing such a tool. I always find it fascinating how teams find new ways to benefit from test management systems after they are introduced. For example, some teams use it to document the routine for setting up and maintaining new test environments. Other teams use it to document the steps needed to release new software versions, and so on.

Whether your team should introduce a test management system depends on the software you build and on the methodologies you use. But if you want to improve the organisation of your software tests and make your tests easier to track and measure, I would definitely say give it a try.

Dennis Gurock Co-founder & director Gurock Software GmbH www.gurock.com


A Global Virtual Learning Environment for the Software Testing industry. Positioned to guide you to success in testing education and certification.

■ Options to fulfil your requirements – from start to finish

■ Access to online and classroom courses, virtual classrooms, ebooks and a range of other testing-related content

■ Sample exam questions, papers, revision sessions and live exam service

■ Scalable to support individuals to the largest corporations

■ Supported 24/7 with a global network of accredited training providers

For details of your local Learntesting provider, visit www.learntesting.com and begin your journey to success with Learntesting

www.learntesting.com


Enhancing application testing in the NHS

NHS Wales introduces an application testing framework to improve patient safety, enhance quality assurance, reduce development time and create applications that are right first time. Andrew Griffiths, chief operating officer at Informing Healthcare (part of NHS Wales Informatics Services), reports.

Informing Healthcare (now part of NHS Wales Informatics Services) was set up in 2003. It is one of the key enablers for 'Designed for Life', the national strategy to deliver world class health and social care for Wales. Informing Healthcare's aim is to improve health services in Wales by introducing new ways of accessing, using and storing information. More specifically, its aims are to make relevant information about patients' health available to the doctors and nurses who treat them, wherever they receive care; to provide doctors, nurses and health professionals with the tools, knowledge and services they need to follow best practice; to give patients better information about their health and healthcare services; and to ensure NHS Wales is supported by modern technology and systems.

Challenge

Quality assurance work at Informing Healthcare is embedded in the development of all products and services for NHS Wales – whether developed in-house or externally by third-party suppliers. In January 2009, Informing Healthcare took the decision to enhance its existing quality assurance work with the introduction of a software application test framework, to be used with all Informing Healthcare software application development projects across Wales. This meant a test framework would be in place to provide consistency in the approach to application testing across all national systems and would be adaptable in its approach so that it could be applied to in-house application development or buying an application – bespoke or off the shelf – or any combination of the three.


Ultimately, Informing Healthcare wanted to ensure, first, that it was producing applications that meet the needs of patients and clinicians; second, that the design, development and testing of all Informing Healthcare applications achieve the maximum level of patient safety; and third, that applications have undergone a rigorous testing process.

Solution

In late 2009, working with its partner Simpl, Informing Healthcare went live with a software application development Test Framework. Established in 2007 as a wholly-owned subsidiary of New Zealand-based Simpl Group, the company says it is a trusted IT advisor and integrator for public and private healthcare organisations, primarily in England, Wales and Scotland. It uses its healthcare IT knowledge to operate at a strategic level, working in partnership with customers to help them understand and clarify IT-based problems, design solutions, create project delivery roadmaps, identify existing or required staff resources, and deliver and test proofs of concept. It can also own the tactical outcomes and delivery of a project to recognised industry standards.

Simpl was responsible for all aspects of the Test Framework. At a strategic level, it supported Informing Healthcare to identify, understand and define the need for a test framework. At a tactical level, it was responsible for the actual creation of the Test Framework. It also provided staff resources to fill the role of test manager for certain application development projects and helped Informing Healthcare identify and recruit additional staff resources required to enforce the Test Framework on other specific projects. Lastly, Simpl supported Informing Healthcare with the roll-out of the Framework to all project and programme managers.

Aligned to NHS Wales' Clinical Safety Strategy and approved by Informing Healthcare's Board of Directors, the Test Framework is a comprehensive set of documents which provides guidance and checklists on how to test an application, the appropriate test environment, and a range of testing techniques. It also explains Informing Healthcare's test strategy and provides sample test plans, traceability matrices and test summary reports.

Benefits

The Test Framework is now in use by a six-strong in-house team based at the Informing Healthcare Research Laboratories at Swansea University and the NHS Wales Informatics Service's office in Cardiff. The research labs, a virtual healthcare organisation, represent a range of healthcare settings, such as the patient's home, a GP practice, the out-of-hours doctor service and a hospital outpatients' department. The research labs enable Informing Healthcare to try out and test applications in a risk-free environment before purchasing and/or introducing them into the live NHS Wales environment.

Test managers at Informing Healthcare have already identified a number of in-house application development projects against which the Test Framework will be applied. Several of these projects are already underway and using the Framework.

Referring to the project, Paul Malcolm, EMEA regional director at Simpl commented: “Organisations should not get too carried away with the glamour of application development and the benefits that new applications can bring to healthcare organisations. It is important of course, but it is also essential that only top quality applications reach the doctors and nurses on the front line and that can only happen through testing in a consistent, proven way.”

With the inclusion of the testing strategy in the framework, stakeholders can understand the testing process better than before. Firstly, in-house product managers and external suppliers of applications are required to comply with the testing procedures laid down in the framework. Secondly, the framework helps external suppliers understand why Informing Healthcare is committed to such rigorous testing, which in short is because clinical safety is Informing Healthcare's highest priority, along with the 'Fit For Purpose' factor of applications.

Results

The Test Framework is refining and improving Informing Healthcare's previous application testing procedures and defines a consistent approach to in-house application testing across the organisation. It also provides clear testing guidance and sets expectations for external, third-party suppliers of applications.

The Framework enhances quality assurance procedures, reduces application development testing time and helps create applications that are right first time for doctors and nurses across Wales. The Framework, aligned with Informing Healthcare’s process for ensuring patient safety, also helps ensure that any IT system implemented improves rather than jeopardises patient safety.

Andrew Griffiths Chief operating officer Informing Healthcare www.wales.nhs.uk/IHC


T.E.S.T DIGEST

A vendor perspective of current software testing processes


It is a fine old tradition in publishing circles to offer the reader something special for the summer holiday period – something you can take in your hand-luggage to sunnier climes, so you don't lose your business focus while you sip exotic cocktails and bask under foreign skies (if extravagant holidays are still on the agenda in these straitened times; perhaps a leaky caravan in Rhyl with a can of Carling is more appropriate these days!). With this in mind, here is the T.E.S.T Digest, a round-up of the state of the art in testing products and services.

In the Digest we discuss the pros and cons of offshored outsourcing; the challenges of testing SOA projects; and open communication in the QA department. And who knew that Yoda had so much to offer the tester? Perhaps some ‘Jedis’ in the testing world may have suspected that there are benefits to be drawn from immersion in the ‘light side’, but Original Software is here to lead you from the dark side on page 30.

Have a great summer

Matt Bailey, Editor

CONTENTS

Introduction

Outsourced test services – a worrying trend?
With more businesses tempted to outsource software testing to offshore locations, Julian Holmes discusses the risk and potentially false economy of outsourced testing.

What Yoda can teach us about Application Quality Management
It's no Jedi mind trick – Original presents its five black holes to avoid for successful application delivery.

Motivating investment in testing
We all know how expensive software errors can be. Tomas Schweigert and Dr Kai-Uwe Gawlik offer some advice for those wishing to generate more investment in their testing projects.

Complete test solution
Gael Quality was looking for a solution that could handle more than 15,000 test cases, multiple resources and the ability to fully integrate results from external sources. TechExcel had the solution.

Message layer testing tool considerations
Andrew Thompson looks at some of the challenges of testing current day, more advanced SOA projects.

Talk is cheap
With business growth returning at the same time as zero growth in IT budgets and staff numbers, it has never been more important for the QA department to keep the lines of communication open. Julian Dobbins says if talk is cheap, that's great news for software quality.


Outsourced test services – a worrying trend?

With more businesses tempted to outsource software testing to offshore locations with promises of double the testing effort for the same cost, Julian Holmes, co-founder of UPMentors, discusses the risk and potentially false economy of outsourced testing and clarifies the benefits and dangers for those taking this route.

I recently came across a worrying set of market figures from industry analyst Nelson Hall that estimated the software testing services market was worth $29bn in 2007, of which $6bn was spent on 'independent testing', where testing activity is outsourced and typically off-shored. These estimates then predicted that the market would increase to $37.6bn by 2012, with the market share for independent testing growing to approximately 30 percent ($12bn).

I was immediately very concerned by this growth and began to wonder why businesses were choosing to invest a greater proportion of their test effort independently from the projects producing the software. Surely this will only result in software issues being discovered later on in the project, when it is exponentially more expensive to fix them?

The pitfalls of outsourced testing

More businesses are moving towards outsourcing as they are primarily driven by the cost-savings offered by these services. For example, most test services providers are known to offer double the testing resources in India for the same price as independent testing performed in an on-shore location.

Particularly in the current economic climate, I completely understand the desire to reduce costs, but this sounds like a false economy. The real value from adopting an outsourced test approach has to come from a measure of the total cost of a focus on independent testing, as opposed to one of investment in embedded testing, where test activity occurs within the time-frame of other software development activities in the project.

Recent thinking within the software industry, and common sense, suggest that removing test activity from the original development site and team never makes the delivery of an application more efficient, of higher quality, or lower cost. In fact I believe it's quite the opposite. Failing to have test specialists as an integral part of a team, who understand the latest team position and test regularly and often, typically results in an overall decrease in quality and a larger set of defects that need to be addressed by the project team late in the project lifecycle, when it is far more expensive to resolve them.

Hence, while it may appear cheaper to separate and ‘industrialise’ all test activity into a reactive test factory that processes the outputs of a development team after the application has been developed, the total cost of delivery success will increase substantially.

The temptation to adopt this outsourced testing approach also encourages a more linear, often referred to as "waterfall", approach to software development. This approach is widely recognised as a very poor delivery model, typically resulting in an increased risk of project failure, delayed delivery with an associated cost over-run, and a system that is unlikely to reflect the changing needs of the business.

However, moving the development approach toward a more iterative or agile style again requires greater integration of test activity into the development team.

Why is outsourcing an attractive prospect?

It is worth noting that the idea of outsourced testing services isn't all bad, and it's easy to see why businesses opt for the outsourced testing model. Yes, you can establish a highly efficient, low cost, leading practice, independent and flexible test resource to provide a high level of quality assurance prior to a system being released into a live environment. There is a lot of value in this service, and it is important to have a 'last line of defence' in place for your business operations, but it should not be confused with the need for testing as part of the development lifecycle.

Testing as a discipline within the software industry is changing dramatically. Gone are the days when testers were seen as the team brought into projects to give a final assessment of the quality (or more typically the poor quality) of a system, prior to its deployment. Leaders in the industry now embed test activity at every stage of development, ensuring that test expertise is placed within a collaborative development team, and that tests are performed on any aspect of the solution the moment it is produced.

Testing as a profession is asking test practitioners to step out of their 'safe' test communities and become part of a wider software development team. This calls for a wider range of knowledge and skills, but will typically be more rewarding as they become part of the success delivered by the team.

The pros and cons of outsourced testing

Pros
• An outsourced testing facility could be perceived as the 'last line of defence', highlighting potential flaws of a software project before it is released into a live environment;
• Independent testers promise double the testing for the same cost;
• Testers not involved in the original project could provide a fresh eye on the solution, discovering fatal flaws that may have been missed by the project team.

Cons
• Testing specialists not involved in the project right from the start will not be familiar with the main end goal of the project and how it developed from an original vision;
• Using an outsourced testing facility to test software at the end of the project could highlight costly flaws in the solution when it's too expensive to remedy them;
• Time differences between the business and the outsourced test facility may make communications about the testing procedure difficult and cause delays in resolving any issues;
• Outsourced testing is a false economy if not considered as part of the total cost of delivery;
• Failing to test regularly throughout the project lifecycle and leaving it to the end can often lead to producing low quality software with a high number of defects.




Therefore, those who decide to use an outsourced testing service need to recognise the mix of testing solutions required to make project teams a success. The “one size fits all” approach of isolating test activity to a remote and reactive team is only part of the solution.

However, in an economic environment where many organisations want ‘more for less’, the offering of outsourced test services to replace expensive full-time test resources will always appeal to those who don’t appreciate the big picture and have been given the hard-sell. When that market segment is also seen to be growing it will quickly become a target for those offering these services, even if the end result for the client may not quite be as they expected.

So, while I expect some agreement from outsourced test services suppliers around the 'ethical' nature of encouraging greater investment by clients in their embedded test activity, the financial opportunity for these suppliers may be so great that I also expect my complaints to fall on deaf ears. I suspect that this could be a case of market forces winning over ethical and sensible practices, but if the above considerations are taken into account by any organisation looking at outsourced test services, then hopefully a balance of approaches will be reached as part of a wider test strategy.

Julian Holmes Co-founder UPMentors www.upmentors.com


What Yoda can teach us about Application Quality Management

It's no Jedi mind trick – Original presents its five black holes to avoid for successful application delivery.

A long time ago, in a galaxy not too far away, the very first CHAOS Report published by the Standish Group generated worldwide attention by its claim that 40 percent of IT projects failed and that these failings were costing the US economy $140 billion each year. Ten years later, matters had improved somewhat, with only half as many projects failing, but worryingly 53 percent were late, over budget or not meeting their objectives. Now, within a mere five years, the number of failed projects is back on the rise; the 2009 Standish Group CHAOS report indicates that nearly 25 percent of projects are doomed.

The quality of application delivery is at the heart of many of the challenges faced in IT projects, and this article reviews some of the most common pitfalls and pain points that often beset development projects. With the help of Yoda, Obi Wan and others from the Star Wars cast, we will learn how best to avoid these challenges and deliver your projects on time, on budget and most importantly with quality.

Black hole No. 1: Walking before you crawl

Obi-Wan: How long will it take before you can make the jump to light speed?

Han Solo: Travelling through hyperspace ain't like dusting crops, old man! Without precise calculations we could fly right through a star, or bounce too close to a supernova and that'd end your trip real quick, wouldn't it?

It is natural to focus on the eventual goal; the application that will be built and that will deliver the projected business benefits. However, it is equally important to focus on the quality of that deliverable, right from the project’s inception. Fail in this and you will face abandoned projects, missed deadlines and an application that may be implemented but will forever after be associated with instability and high maintenance costs.

The first essential step is to recognise this fact and to put application quality and its management at the heart of all your development efforts. If you do not believe this or do not believe you can, then failure is much more likely than success.

Black hole No. 2: QA as a silo

Obi-Wan: The force is what gives a Jedi his power. It is an energy field created by all living things. It surrounds us and penetrates us. It binds the galaxy together.

The same could be said for quality management. It should be an energy field, created and sustained by all involved in the development process, linking all living parts of the lifecycle – the requirements, the code, the build, the test steps, the defects, the regression pack, everything – binding all aspects together and giving us the power of visibility and foresight throughout every stage of the development.

More commonly though, test teams seem to exist in serene isolation: isolated not only from other parts of the development and delivery effort, but also from each other. Frequent status meetings are normal, with the focus on the gathering of historical data rather than forward planning. Similarly, communication with other key teams is often dysfunctional. Defects are reported with a ‘fire and forget’ mentality. This is fine if you are trying to shoot down an enemy star ship but not so clever when building an IT application, as development is a key partner in application delivery.

To date, test management products have reinforced rather than broken down the potentially dangerous isolation of QA teams. They have taken a narrow view of QA, with a focus on requirements, tasks and defects, when what is needed is a solution that can embrace QA across the project disciplines and integrate with essential infrastructure tools such as change management.

Black hole No. 3: Lack of visibility and out-of-date information

The Emperor: You've paid the price for your lack of vision. If you will not be turned, you will be destroyed.

Understanding the current project status, the trends in the progress and the implications for resources, target dates and costs are vital to making the correct decisions. The information also needs to be available instantly. If gathering status information across all the project disciplines takes a week, the number of hours potentially burnt in the wrong activities becomes alarming. This could destroy your chances of keeping the project on-track, making your doomed project just another statistic on next year’s CHAOS report!

To help you to keep your finger on the pulse of your development lifecycle, you need instant access to key information on the most appropriate and powerful device. Printed reports should be at the bottom of the pile, given that they are out of date the moment they are created; PCs and web access are better; and personal devices such as smart phones or Apple iPads are at the top of the heap.

Black hole No. 4: Unnecessary re-work

General Madine: Is your strike team assembled?

It might be the same team that has been used on previous projects, but with many application quality management solutions, users have to be set up over and over again for each project. All too often, although you have your team assembled and ready to go, there is still a frustrating amount of work to do in setting up users, permissions, calendars and so on. When evaluating AQM solutions, take into account simple time-saving factors and, where possible, choose an option where users need only be set up once and can then be assigned to multiple projects.

Black hole No. 5: Not supporting all types of working practice

Yoda: Decide you must, how to serve them best.

Enterprise Agile versus waterfall and the challenges of heterogeneous environments are increasingly becoming hot talking points and many organisations are working with a variety of platforms and methodologies. So how do you successfully bring together teams that work in different ways and on multiple projects?

With more complexity in IT projects and a need to respond faster to changing markets, development teams have had to adapt the way they work, often utilising different methodologies on different projects in order to support the dynamic nature of their businesses. If the quality management solution does not support the way that they work, you will encourage maverick teams working outside of the 'Alliance'. Make sure that your AQM solution empowers your teams and allows the flexibility to aid and not impede them.

Conclusion

Yoda: No more training do you require. Already know you, that which you need.

In this article, we have looked at five galactic black holes that projects can get sucked into, turning them off-course. Application Quality Management is not some mysterious Jedi art – most of this is just common sense. You already know what is required in application delivery, you just need to ‘use the Force’ and remember these black holes when selecting technology to assist you in achieving your destiny.

‘Size matters not’ says Yoda, but what’s important is effective project planning and organisation, addressing complexity and empowering different working practices, ensuring good collaboration and communication and maintaining control and visibility throughout. Quality cannot be simply bolted on at the end of a development. It must be embraced from the start and be part of the entire development ethos and infrastructure.

Unfortunately current market-dominating products do not meet these fundamental requirements and only support and reinforce the approach of test management in a silo. Don't become one of the negative statistics in the CHAOS report. In order to deliver a successful solution that meets all the demands of the business, we need to take a holistic view of quality in the application delivery process.

In the words of Yoda: ‘Mind what you have learnt. Save you it can’.

This is a cut down version of a longer Original Insight. To view the full paper and read about all ten black holes, visit: http://www.origsoft.com/products/qualify/docs/what_yoda_can_teach_us_about_aqm.pdf

Quotes from Star Wars film characters are accredited to Lucas Films. All other trademarks are properties of their respective owners.


Motivating investment in testing

We all know how expensive software errors can be. Tomas Schweigert, principal consultant at SQS, and Dr Kai-Uwe Gawlik, head of application intelligence at SQS, offer some advice for those wishing to generate more investment in this area.


Quality is still an issue in the software business. According to the Standish Group's CHAOS report 2009, only 32 percent of all projects are delivered on time and within budget. Quality issues can be contributory factors in projects that run disastrously over budget, such as the Ariane 5 project, which cost $500m and where a software error caused a mission disaster; Fiscus in Germany, which cost $900m; and the Sabre Reservation System, at $195m.

A major study, “The Economic Impacts of Inadequate Infrastructure for Software Testing” carried out in 2002 by the US National Institute of Standards and Technology concluded that the US economy faces a huge annual loss that could be reduced by better testing and we believe that this conclusion still holds today. It reported that inadequate software testing costs the US economy $59.5bn every year and that testing infrastructure improvements could reduce that cost by $22.2bn. The study also identified a number of improvements that could be achieved, including removing more bugs before the software is released, detecting bugs earlier in the software development process and locating the source of bugs faster and with more precision.

The study sets out some of the issues resulting from poor testing, such as increased failures due to poor quality, increased software development costs and longer time to market. Looking at this study, we would expect there to be a huge amount of investment in effective software testing. Yet since the writing of the report, many organisations have still resisted spending money on improvements in testing.

Cost and time

Typically cost and time are very important drivers for IT organisations. Many perform intuitive tests in time boxes. The justification is that the testers find lots of errors without spending time and money on writing systematic test cases.

However, practical experience shows that this approach does not support sophisticated quality goals and that current methods, while efficient, are not effective. As a result, many commercial organisations are finding 30-60 percent of all errors, including critical errors, in production.

Unfortunately this issue can't be resolved easily by doing more of the same. It might seem that if the quality of software is poor, it is enough to increase the time spent testing and/or the number of testers. The problem is that, due to reduced error density, detecting subsequent errors costs a little bit more than detecting the initial errors. So the testing cost per detected error is not linear but exponential, and the method that was so efficient in finding the first 30 percent of errors turns out to be very inefficient when used to find the remaining 70 percent. However, a change in approach to systematic and automated tests requires a huge investment that is hard to achieve in the present economic climate.
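To see why the marginal cost climbs so steeply, here is a toy calculation – my illustration, not data from SQS or from the NIST study – which simply assumes that the effort to find the next error is inversely proportional to how many errors are still left in the product.

```python
# Toy model: effort to find the next error is inversely proportional to the
# number of errors still left in the product. All numbers are illustrative.
INITIAL_ERRORS = 100

def cost_of_next_error(already_found):
    remaining = INITIAL_ERRORS - already_found
    return INITIAL_ERRORS / remaining   # 1 unit of effort while all 100 remain, rising steeply

first_30 = sum(cost_of_next_error(i) for i in range(30))
next_40 = sum(cost_of_next_error(i) for i in range(30, 70))
print(round(first_30), round(next_40))  # roughly 35 vs 84: the next 40 cost far more than the first 30
```

Under even this crude assumption, the 'cheap' intuitive approach that found the first third of the defects becomes the expensive way to find the rest.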

The business case

A standard method to convince management of the need to invest in testing is to develop and calculate a business case. Yet users, management and testers in many organisations are in a no-win situation. There is often no data available to justify investment in improving quality by better testing.

To calculate a business case for improved testing, various data are needed, including the cost of error detection – comprising testing, reviews, audits and assessments – and the cost of error correction.

The cost of error detection and error correction can be measured and calculated fairly easily. It is also straightforward to calculate the cost of improved test effectiveness. Costs resulting from error occurrence are harder to calculate.

In the commercial sector there are two ways to justify more budget for testing: first, by carrying out an evaluation of error costs and, second, by evaluating the damage to intangible assets.

In an SQS internal survey, 1,500 consultants were asked if their clients measured the cost of errors. The survey found that only 10 percent of all organisations do measure error costs. As a result, management is in the uncomfortable position of making decisions and spending money based on intangibles.

The cost of errors

The current economic slowdown reduces the probability that organisations will spend money to improve the effectiveness of software testing. Providing data to persuade management of the need for a bigger testing budget and an improvement in error descriptions is necessary.

When using current data in an ERP system such as SAP®, for instance, it is hard to calculate the cost of errors. There is not usually a cost centre for error costs and in many organisations the costs of an error cannot be accounted for. More often than not it is impossible to evaluate the costs of errors in production.

Error management tools are available but are not normally used to provide financial data. The impact of the error is described in technical terms such as 'database breakdown' and linked to non-financial data such as '200 employees affected by the reduced speed of the application'.

Book-keeping and error management systems need to be improved to provide meaningful financial data to management. A pragmatic approach would be to improve the error management data found in error management tools.

Financial damage caused by the error can be calculated by the user who reports the error. To support the delivery of commercial data it is necessary to improve the definition of severity and/or priority within the error management tool.

Severity should be redefined, based on financial data, so for example a critical error may cause $100,000 worth of damage. Priority too should be redefined, based on financial data, for example assigning costs of $10,000 a day for workarounds. It is possible to customise error management tools to store this financial data.

We would also recommend setting up a policy which ensures that an error is downgraded if there is no financial cost attached to it. Should a database crash, this is traditionally recorded as a severe error but if no financial damage is reported the error should be down-graded to low severity.
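A minimal sketch of such a policy follows; the thresholds and field names are illustrative assumptions based on the examples above, not a description of any particular error management tool.

```python
def severity_from_cost(estimated_damage: float, workaround_cost_per_day: float = 0.0) -> str:
    """Derive severity from the financial data attached to an error report.

    Follows the policy described above: an error with no financial cost
    attached is downgraded to low severity, however dramatic it looks.
    The thresholds are illustrative assumptions.
    """
    if estimated_damage <= 0 and workaround_cost_per_day <= 0:
        return "low"          # e.g. a database crash that nobody can attach a cost to
    if estimated_damage >= 100_000:
        return "critical"
    if estimated_damage >= 10_000 or workaround_cost_per_day >= 10_000:
        return "high"
    return "medium"

print(severity_from_cost(0))                                        # low – downgraded
print(severity_from_cost(250_000))                                  # critical
print(severity_from_cost(2_000, workaround_cost_per_day=12_000))    # high
```

The point is not the specific thresholds but that severity becomes a function of recorded financial impact, which is exactly the data a business case needs.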

As a result there will be a top-level view of error management that shows not only the number of errors but also the financial impact. Once this data is in place, it is possible to estimate or calculate business cases for improved test effectiveness.

The responsibility of gathering the data for a business case and budget decisions lies with the quality team, which defines, collects, stores and analyses data to provide valid information.

A clear business case based on hard facts instead of intangibles should encourage management to evaluate the need for better software testing and prioritise and budget for more effective testing in the future.

Dr Kai-Uwe Gawlik Head of Application Intelligence Software Quality Systems AG www.sqs-group.com

Tomas Schweigert Principal consultant Software Quality Systems AG www.sqs-group.com


Complete test solution

Gael Quality was looking for a solution that could handle more than 15,000 test cases, multiple resources and the ability to fully integrate results from external sources. TechExcel had the solution.

Gael Quality, a trading division of Gael Ltd, was established in Scotland in 1995. Since its inception, the philosophy of Gael has always been to offer software solutions to facilitate both personal and organisational improvement.

As UK market leader in the design, development and delivery of compliance management solutions, Gael designed their flagship product Q-Pulse to enable organisations to achieve value from demonstrating compliance. Q-Pulse has helped companies across multiple industries transform the necessary compliance activities from a costly overhead into a business benefit with competitive and commercial advantage.

Gael Quality was using Microsoft Excel to document tests and record results. For defect tracking they used their own in-house Microsoft Access application. However, at this point Gael had outgrown Excel, and they had far too many tests to manage using a spreadsheet.

New test solution

The new test solution had to be able to manage thousands of test cases across the teams and integrate the results of both manual and automated tests. In addition, it had to handle multiple resources and have the ability to fully integrate results from external sources such as TestComplete.

Looking at the defect tracking solution, it had to be able to do at least what their existing bespoke solution did. DevTrack covers much more than their initial requirements for a defect tracking tool. Initially, they did not have a workflow management system, but this has now been implemented as part of DevTrack. Furthermore, they also wanted to have a solution that covered traceability back to the initial test requirements.

Gael Quality had a very short implementation cycle and, with only three days' consultancy from TechExcel, they had the system up and running.

“Implementing DevTest was very easy and we were up and running almost right away,” said Ian Blair at Gael Quality. “Getting existing test cases into DevTest only took between three and four weeks of full-time effort.”

Gael Quality has now been using DevTest for two years; has DevTest lived up to their expectations? "I would say yes, it has lived up to expectations. We have defined 14,495 master test cases in our main project and several hundred more in smaller projects. We have executed more than 117,000 test cases. Over 19,000 of these have been executed by TestComplete and the results fed back to DevTest by our custom execution framework," Blair concluded. "We could never have efficiently managed the volume of tests we have now with the old solution."

TechExcel DevTest Studio

DevTest is a complete solution for test management and includes test case creation, planning and execution through defect submission and resolution. DevTest Studio tracks and manages the complete quality lifecycle.

DevTest Studio combines the award winning test management features of DevTest, the market-leading defect tracking features of DevTrack and the automated test functionality of TestLink into one specially priced, integrated solution.

Key benefits
• Gain control over product quality with real-time test results reporting, tracking and analysis that lets you know what has been tested and what still needs to be done.
• Improve test standardisation, re-use and revision control using a centralised test library.
• Increase your team's productivity with reduced data entry, definable test interfaces, and process automation.

Feature overview

Defect tracking:
• Track each issue through a definable workflow;
• SCM integration – track fixes against their source code deliverables;
• Deploy a resolution across multiple releases, versions and products;
• Reporting and metrics to illustrate the entire defect lifecycle.

Test management and execution:
• Create a central repository for your test cases, knowledge items and automation scripts;
• Schedule releases and test cycles using a wizard-driven interface;
• Execute test assignments and submit defects from the same interface;
• Track results with real-time dashboards and reports.

Test automation:
• Out-of-the-box integration with TestComplete;
• Add automated tests to the DevTest test library;
• Schedule automated tests along with manual tests;
• Launch automated tests from the DevTest interface;
• Track automation results with real-time dashboards and reports.

TechExcel provides free evaluation copies of DevTest that can be downloaded from the Internet at http://www.techexcel.com/resources/

www.techexcel.com


Message layer testing tool considerations

Parasoft managing director Andrew Thompson looks at some of the challenges of testing current day, more advanced SOA projects, not necessarily those found on the original simple services developed at the beginning of the SOA revolution.

Let's begin by setting the background. Your company is developing an integration platform that will provide an interface for customers and other companies to access your legacy applications. Access to these services may be via Ajax/RIA web pages, or via POX (plain old XML). It's an ordering system, so customers can order via the web and check their order status. Suppliers will interact at the POX level to process orders on your behalf for items you do not keep in stock. Warehouse staff will also have a GUI interface from which they will collate orders for dispatch.

So what are the challenges that will be faced? Standards compliance; testing in parallel to development; no application GUIs available for immediate testing; and interactions with databases, ESBs and even humans that may need to be simulated or checked.

Oh, one last thing, there is a fixed deadline by which this application must go live (as always), so we need to start the testing process right now, as soon as the developers start writing code.

This article is written to look at some of the challenges of testing current day, more advanced SOA projects, not necessarily those found on the original simple services developed at the beginning of the SOA revolution.

The first deliverable you are likely to be given in a Service Orientated Architecture (SOA) project (aside from project documentation) is the WSDL file (Web Services Description Language). This is what defines the message layer interface to the main application for both the POX and web page interfaces. All interaction with the application will be via the operations defined in this WSDL file. No lines of code have been written yet, so can you start testing?

Parallel testing streams

This is going to surprise you, because what I am going to suggest goes along a different route to a lot of practice. You see, the following tasks can all be run more or less in parallel:
• Back end and middleware development;
• GUI development;
• WSDL validation;
• Functional/scenario testing;
• Penetration testing;
• Performance testing.

Back end and middleware development

Should a developer be writing his actual application, or a test harness with which to test it? Even once built, a test harness requires maintenance – an on-going human cost. Providing a ready built, adaptable test harness to a developer is a key step in improving productivity. These 'pre-built' test harnesses are available through such tools as the free SoapUI, or more sophisticated tools such as Parasoft SOAtest, which can parse a WSDL file and create a GUI or stubbed server from it.

GUI development

Similarly, the team tasked with developing the GUI needs to be able to send and receive messages so that they can quickly (and repeatedly) test their GUI. For productive development this means abstracting this ability away from the backend side of the project. You need to provide a test harness that will provide intelligent, repeatable responses to a set of test messages. This is the art of service virtualisation, or stubbing, and again is in the realms of the more sophisticated SOA testing tools.
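By way of illustration only – this is not how soapUI or SOAtest implement it – a stub can be as small as an HTTP listener that answers every order-status request with a canned response. The endpoint and payload below are assumptions based on the ordering system described earlier.

```python
# A minimal message-layer stub: it answers every POST with a canned XML
# response so GUI work can proceed before the real back end exists.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSE = (b"<orderStatusResponse>"
                   b"<orderId>1042</orderId>"
                   b"<status>DISPATCHED</status>"
                   b"</orderStatusResponse>")

class StubHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        self.rfile.read(int(self.headers.get("Content-Length", 0)))  # consume the request body
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(CANNED_RESPONSE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), StubHandler).serve_forever()
```

A real virtualised service would, of course, vary its responses intelligently according to the incoming message, which is where dedicated tooling earns its keep.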

WSDL validation

If your application is going to be easily maintained in the future, or perhaps be made available to third parties, then it should be compliant with the WS-I interoperability standards. Compliance with these standards can be checked as soon as the WSDL file is defined – a simple, but important, check. Another similar test is to ensure your WSDL file complies with its own XML schema definitions. Again this is a simple test with the right tool.
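Scripting the schema half of that check is straightforward; here is a hedged sketch using the lxml library, with hypothetical file names standing in for the WSDL's type definitions and a sample message.

```python
# Validate a sample request against the XML Schema that the WSDL's types
# section refers to. File names are hypothetical.
from lxml import etree

schema = etree.XMLSchema(etree.parse("order_service_types.xsd"))
document = etree.parse("sample_order_request.xml")

if schema.validate(document):
    print("document conforms to the schema")
else:
    for error in schema.error_log:
        print(error.line, error.message)
```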

Functional tests

So here we are at the beginning of the project, with no GUI and no backend – what is the tester going to be doing? Well, each service operation is going to be defined in the WSDL file (or something similar). This is enough for the tester to start formulating simple, functional regression tests via the right tool, without a line of code having been written. By developing the message request tests, and the virtual service to respond to them, the testers are already providing support to the developers, as these can be shared assets.
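Using the stubbed service from the earlier sketch (the endpoint, operation and field names remain assumptions rather than anything defined by a real WSDL), a first message-layer regression test needs only the standard library.

```python
# A first functional regression test written purely at the message layer,
# pointed at either the stub or the real service once it exists.
import urllib.request
import xml.etree.ElementTree as ET

REQUEST = b"<orderStatusRequest><orderId>1042</orderId></orderStatusRequest>"

def test_order_status():
    req = urllib.request.Request(
        "http://localhost:8080/orders",
        data=REQUEST,
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req) as response:
        body = ET.fromstring(response.read())
    assert body.findtext("status") == "DISPATCHED", "unexpected order status"

test_order_status()
print("order status operation behaves as expected")
```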

Scenario tests

As the application matures, the individual functional tests can begin to be chained into more useful scenario tests, fulfilling the use cases designed to test each flow. Individual tests can either use the virtual server or the developed application, depending on what is available, and pass the results of one test into the outgoing fields of the next request message.





The important thing to note though is that you should not need the entire application to be available to start doing scenario testing. As the services become available, they simply replace the stubbed elements.

One more thought to be considered is whether any human interaction will be required to facilitate the conclusion of a test scenario. Perhaps we are testing a loan approval process, and if the loan request is greater than a specified amount then the loan must be approved manually before the testing process can complete. So now we are looking for a testing tool that can be easily configured to check the loan amount, and then run a set of web page actions to approve the loan before the rest of the message layer tests complete, all being done integrally to the same test flow.

Performance testing

Is there any reason that performance testing should be left right to the end of a project? A simple performance test should be set up for each SOA operation. This should then be run on a daily basis with the results being logged. As the application matures, these results can be compared and the effect of any changes to the application noted. Quality of service metrics should be assigned, and when they are broken the developers can be immediately notified that, left as it is, the application will not meet the specified response times under load. It is entirely possible to take the functional tests already developed and run them in a load scenario; this therefore takes little time to set up.
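A hedged sketch of that daily check, reusing the functional request above: the endpoint, load level and response-time target are illustrative assumptions, not recommended figures.

```python
# Fire the existing functional request under modest concurrency and compare
# the slowest response against a simple quality-of-service target.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/orders"
REQUEST = b"<orderStatusRequest><orderId>1042</orderId></orderStatusRequest>"
TARGET_SECONDS = 0.5

def timed_call(_):
    start = time.perf_counter()
    req = urllib.request.Request(URL, data=REQUEST, headers={"Content-Type": "text/xml"})
    urllib.request.urlopen(req).read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=10) as pool:
    timings = list(pool.map(timed_call, range(100)))

print(f"slowest of {len(timings)} calls: {max(timings):.3f}s")
if max(timings) > TARGET_SECONDS:
    print("quality-of-service target breached – notify the developers")
```

Logging these figures each day is what turns a one-off measurement into the trend the article describes.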

Penetration testing

Again, this is an item that should not be left to the end of the project. Security must be designed in, not tested in. Penetration tests should be set up right from the word go. Run on a daily basis, these tests will give developers the heads up immediately that their application is not meeting security requirements. Catch bugs early and you reduce the development time needed to fix them – it is a proven fact. Penetration testing is provided as standard within SOAtest.
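Full penetration testing needs dedicated tooling, but even a crude negative test run daily catches the most embarrassing failures early. The endpoint and payloads below are assumptions, and this is only a first sanity check, not a substitute for proper security testing.

```python
# Send deliberately malformed and oversized payloads: a well-behaved service
# should answer with a 4xx fault, not fall over with a 5xx. Payloads are examples.
import urllib.error
import urllib.request

URL = "http://localhost:8080/orders"
hostile_payloads = [
    b"<orderStatusRequest><orderId>1042",  # truncated XML
    b"<orderStatusRequest><orderId>" + b"A" * 100_000 + b"</orderId></orderStatusRequest>",
]

for payload in hostile_payloads:
    req = urllib.request.Request(URL, data=payload, headers={"Content-Type": "text/xml"})
    try:
        status = urllib.request.urlopen(req).status
    except urllib.error.HTTPError as err:
        status = err.code
    print(payload[:32], status, "ok" if status < 500 else "FAIL: server error")
```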

Other issues to consider

Applications that rely on passing messages from one place to another, whether that be client-server style or business-to-business communication, all rely on having good data. To this end, it is common practice to use existing customer data during testing. After all, what could be better than that? Well, aside from the obvious data protection issues, there is the fact that you may not cover all required paths. Integration between test data tools, such as that provided by Grid Tools' 'SOA Data Pro' and Parasoft's SOAtest, provides an excellent alternative. Test data can be created to your specific requirements, and then injected directly into the functional or scenario tests that have already been set up.

While we are on the topic of data, consider whether you need to validate your response data against the contents of a database. For instance, an account enquiry returns balance 'A' in the response, but is this the true balance as depicted in the database? Adding a database query to the set of functional tests will be useful.
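As a rough sketch of that cross-check, using sqlite3 as a stand-in database (the endpoint, table, column and account details are all assumptions):

```python
# Cross-check a value returned at the message layer against the database row
# it should have come from.
import sqlite3
import urllib.request
import xml.etree.ElementTree as ET

req = urllib.request.Request(
    "http://localhost:8080/accounts",
    data=b"<balanceRequest><accountId>77</accountId></balanceRequest>",
    headers={"Content-Type": "text/xml"},
)
with urllib.request.urlopen(req) as response:
    reported = ET.fromstring(response.read()).findtext("balance")

with sqlite3.connect("orders.db") as conn:
    (actual,) = conn.execute(
        "SELECT balance FROM accounts WHERE account_id = ?", (77,)
    ).fetchone()

assert float(reported) == float(actual), "response balance does not match the database"
print("balance in the response matches the database")
```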

In summary

There are many great message layer testing tools on the market. Some are free, some are not. Some are great at service virtualisation, and some concentrate on web pages. There are horses for courses, as they say, and one particular tool will not be a panacea for all testing requirements. Check your requirements carefully, and ensure your choice will be a good fit for the majority of your testing requirements.

SOAtest covers all of the requirements discussed here, and is compatible with a range of transport protocols such as HTTP, JMS, MQ and .NET WCF, to name a few. Free evaluations of Parasoft SOAtest can be requested at www.parasoft.com.

Andrew Thompson Managing director Parasoft UK www.parasoft.com

SOA Testing Tool Requirements
• WS-I interoperability checks
• Schema checks
• Service virtualisation
• Test data injection
• Database interaction
• Web page interaction
• Penetration testing
• Performance testing


Talk is cheap

With business growth returning at the same time as zero growth in IT budgets and staff numbers, it has never been more important for the QA department to keep the lines of communication open. Julian Dobbins, head of analyst relations at Micro Focus, says if talk is cheap, that's great news for software quality.

Growth will return to the IT industry in 2010. Most industry commentators are in agreement on this, and so, while this is very welcome news, it is not actually tremendously newsworthy. Perhaps of more interest is the fact that such recovery is expected to take place alongside zero growth in IT budgets or employee numbers, placing yet greater pressure on IT to deliver what the business needs. So, if 2009 was about cutting costs and stripping out non-essential projects, 2010 has its emphasis placed firmly on the 'more' part of 'doing more with less'.

To make matters worse, IT's ability to deliver successful projects currently has its "highest failure rate in over a decade", according to the Standish Group. And while Gartner figures suggest something less bleak, there continue to be enough high-profile examples of failure and waste in IT to suggest that many will struggle to meet the needs of companies focussed on growth and profitability through the course of the next twelve months. In January 2010, one UK national newspaper reported that a "series of botched [Government] IT projects has left taxpayers with a bill of more than £26bn for computer systems that have suffered severe delays, run millions of pounds over budget or have been cancelled altogether."

Failure to communicate
In many cases, such as the UK's £12.7bn National Programme for IT, users were simply never consulted on "what they wanted the new system to achieve" or kept informed as projects rolled on and challenges arose. This is surely completely unacceptable and, what's more, totally inexplicable.


Especially when 70 percent of production defects are considered to have been created during requirements and design – not to mention that the costs associated with fixing these issues rise exponentially the longer they remain undiscovered. In a March 2009 paper on software quality, Gartner analyst Thomas Murphy quite understandably (and with some degree of understatement) observes that this is a cause of "IT versus business friction."

It is little wonder, therefore, that the tight linkages established in 2009 between IT spending and business performance metrics are considered to be here to stay.

With scrutiny and concern the order of the day, many of the industry analyst predictions that have appeared at the start of this new decade are focused on what IT should be doing to improve its success in delivering quality software on time, on budget and, perhaps more importantly, in line with the needs of the user.

A beautiful friendship
Despite being cash-strapped and under-resourced, there are several factors that suggest IT has reasons to be optimistic in 2010. Necessity has shown itself to be the mother, if not of invention, then of adoption. Organisations of all sizes are embracing new ways of doing things as they seek to break through yet another glass ceiling of productivity and cost-efficiency.

One view, from Forrester Research, is that IT organisations need to start thinking like the "underfunded start-up that is always in the throes of a one-company recession." It is not about working even harder than they do already. It is about working smarter. It is about working efficiently on the projects that matter, the ones that drive revenue and growth. And staying close enough to the business along the way to know when priorities change and new goals arise.

It's also about knowing when enough is enough, so that software no longer arrives bloated with redundant features that seemed like a good idea on the drawing board. Many companies find, as requirements are delivered incrementally and in line with business priority, that projects not only complete sooner, but that the final 20 percent is often perceived as non-critical for deployment – and often never asked for again.

Such collaboration between all members of the software delivery team yields benefits for everyone. It creates what Thomas Murphy refers to as a "hive" mindset instead of an "adversarial" one, in which the different disciplines (such as business analysts, testers and developers) bring their respective strengths and viewpoints to bear on the problems being solved, helping to deliver software earlier in the project lifecycle and reducing rework.


You had me at hello
In Gartner's December 2009 research note, 'Predicts 2010: Agile and Cloud Impact Application Development Directions', the analyst firm draws attention to a number of ways in which IT can raise its game and deliver yet greater value to the business, with closer collaboration and the need for a broader definition of software quality very much at the heart of this.

Perhaps one of the most fundamental statements the research note makes is that “software quality can’t be tested at the end”. The IT horror stories mentioned earlier are living proof of this. Companies must look to drive quality throughout the development lifecycle and make use of facilities and processes that support this.

Agile development methods are already helping. They are driving (if not demanding) greater levels of collaboration. And the fact that these methods are now starting to take root in mainstream development shops is great news for lovers of quality software. Gartner believes that by 2012, "agile development methods will be utilized in 80 percent of all software development projects", and, furthermore, that companies that embrace agile, and introduce the cultural and behavioural changes to support it, are already seeing "four times the improvement in overall productivity".

For IT to succeed in 2010, vendors have a responsibility to provide not only the tooling, but also the process support to help companies drive quality from start to finish on a project, including helping them shore up weak requirements practices. By moving 'quality' upstream within the development process and linking it more closely with the beginning rather than the end of the lifecycle (for example, testing against user stories rather than function points), business and IT will understand each other more fully and improve their chances of sharing a common goal. Only then will the growth that everyone is predicting, the growth that everyone needs, come to the industry on the back of fundamental, grass-roots improvement, rather than through increasing the stress levels of an already stretched group of people. As an industry, it is time to grow up once and for all. It is time to talk.


Julian Dobbins
Head of analyst relations
Micro Focus
www.microfocus.com


Held at: Park Inn Hotel, Heathrow

For more information contact Grant Farrell on +44 (0) 203 056 4598

Email: [email protected]

Website: www.testfocusgroups.com

21st June 2011 ● One Day Event ● 80 Decision Makers ● 10 Thought-Leading Debate Sessions

Peer-to-Peer Networking ● Exhibition ● Cutting Edge Content

T.E.S.T Focus Groups – Helping you overcome obstacles


The Information Systems Examinations Board (ISEB) is part of BCS, The Chartered Institute for IT, and is an international examination body created to raise the standard of competence and performance of people working in IT. We're leading the way in qualifications for IT professionals – delivering more than 380,000 exams in over 200 countries.

Our qualifications are internationally recognised and cover eight major subject areas: Software Testing, ITIL/IT Service Management, IT Assets and Infrastructure, Systems Development, Business Analysis, Project Management, IT Governance and Information Security, plus our new qualification, Sustainable IT.

These are available at Foundation, Practitioner and Higher Level to suit each individual candidate. ISEB Professional Level is also available.

For more information visit www.iseb-exams.com.

These qualifications are delivered via a network of high quality accredited training and examination providers. The breadth and depth of our portfolio is one of its key strengths as it encourages knowledge, understanding and application in specific business and IT areas.

Candidates develop their competence, ability and aptitude – and therefore their professional potential – giving employers the edge they're looking for.

BCS

BCS, The Chartered Institute for IT, promotes wider social and economic progress through the advancement of information technology science and practice.

We bring together industry, academics, practitioners and government to share knowledge, promote new thinking, inform the design of new curricula, shape public policy and inform the public. As the professional membership and accreditation body for IT, we serve over 70,000 members including practitioners, academics and students, in the UK and internationally. A leading IT qualification body, we also offer a range of widely recognised professional and end-user qualifications.

BCS membership for software testers

BCS membership gives you an important edge; it shows you are serious about your career in IT and are committed to your own professional development, confirming your status as an IT practitioner of the highest integrity.

Our growing range of services and benefits is designed to be directly relevant at every stage of your career.

Industry recognition

Post-nominals – AMBCS, MBCS, FBCS & CITP – are recognised worldwide, giving you industry status and setting you apart from your peers.

BCS received its Royal Charter in 1984 and is currently the only awarding body for Chartered IT Professional (CITP) status, also offering a route to related Chartered registrations, CEng and CSci.

Membership grades

Professional membership (MBCS) is our main professional entry grade and the route to Chartered (CITP) status.

Professional membership is for competent IT practitioners who typically have five or more years of IT work experience. Relevant qualifications, eg a computing-related degree, reduce this requirement to two or three years of experience. Associate membership (AMBCS) is available for those just beginning their career in IT, requiring just one year’s experience.

Joining is straightforward – for more information visit www.bcs.org/membership, where you can apply online or download an application form.

Best practice

By signing up to our Code of Conduct and Code of Good Practice, you declare your concern for public interest and your commitment to keeping pace with the increasing expectations and requirements of your profession.

Networking opportunities

Our 44 branches, 16 international sections and over 40 specialist groups including Software Testing (SIGIST) and Methods & Tools, provide access to a wealth of experience and expertise. These unrivalled networking opportunities help you to keep abreast of current developments, discuss topical issues and make useful contacts.

Specialist Group in Software Testing (SIGIST)

With over 2,500 members SIGIST is the largest specialist group in the BCS. Objectives of the group include promoting the importance of software testing, developing the awareness of the industry’s best practice and promoting and developing high standards and professionalism in software testing. For more information please visit: www.sigist.org.uk.

Information services

The BCS online library is another invaluable resource for IT professionals, comprising over 200 e-books plus Forrester reports and EBSCO databases. BCS members also receive a 20 percent discount on all BCS book publications. These include Software Testing, the ISEB Foundation and Intermediate textbook, which, as well as explaining the basic steps of the testing process and how to perform effective tests, provides an overview of different techniques, both dynamic and static, and how to apply them.

Career development

A host of career development tools are available through BCS including full access to SFIA (the Skills Framework for the Information Age) which details the necessary skills and training required to progress your career.

ISEB

BCS, First Floor, Block D, North Star House, North Star Avenue, Swindon, SN2 1FA, United Kingdom. Tel: +44 (0) 1793 417655 Fax: +44 (0) 1793 417559 Email: [email protected] Web: www.iseb-exams.com


Parasoft
Improving productivity by delivering quality as a continuous process

For over 20 years Parasoft has been studying how to efficiently create quality computer code. Our solutions leverage this research to deliver automated quality assurance as a continuous process throughout the SDLC. This promotes strong code foundations, solid functional components, and robust business processes. Whether you are delivering Service-Orientated Architectures (SOA), evolving legacy systems, or improving quality processes – draw on our expertise and award-winning products to increase productivity and the quality of your business applications.

Parasoft's full-lifecycle quality platform ensures secure, reliable, compliant business processes. It was built from the ground up to prevent errors involving the integrated components, as well as to reduce the complexity of testing in today's distributed, heterogeneous environments.

What we do
Parasoft's SOA solution allows you to discover and augment expectations around design/development policy and test case creation. These defined policies are automatically enforced, allowing your development team to prevent errors instead of finding and fixing them later in the cycle. This significantly increases team productivity and consistency.

End-to-end testing: Continuously validate all critical aspects of complex transactions, which may extend through web interfaces, backend services, ESBs, databases, and everything in between.

Advanced web app testing: Guide the team in developing robust, noiseless regression tests for rich and highly dynamic browser-based applications.

Application behaviour virtualisation: Automatically emulate the behaviour of services, then deploy them across multiple environments – streamlining collaborative development and testing activities. Services can be emulated from functional tests or actual runtime environment data.

Load/performance testing: Verify application performance and functionality under heavy load. Existing end-to-end functional tests are leveraged for load testing, removing the barrier to comprehensive and continuous performance monitoring.

Specialised platform support: Access and execute tests against a variety of platforms (AmberPoint, HP, IBM, Microsoft, Oracle/BEA, Progress Sonic, Software AG/webMethods, TIBCO).

Security testing: Prevent security vulnerabilities through penetration testing and execution of complex authentication, encryption, and access control test scenarios.

Trace code execution: Provide seamless integration between SOA layers by identifying, isolating, and replaying actions in a multi-layered system.

Continuous regression testing: Validate that business processes continuously meet expectations across multiple layers of heterogeneous systems. This reduces the risk of change and enables rapid and agile responses to business demands.

Multi-layer verification: Ensure that all aspects of the application meet uniform expectations around security, reliability, performance, and maintainability.

Policy enforcement: Provide governance and policy validation for composite applications in BPM, SOA, and cloud environments to ensure interoperability and consistency across all SOA layers.

Please contact us to arrange either a one-to-one briefing session or a free evaluation.

Web: www.parasoft.com Email: [email protected] Tel: +44 (0) 208 263 6005


Learntesting

There are many reasons for individuals and businesses to increase knowledge, skills and gain industry-recognised certification. However, with growing financial and time constraints, more flexible solutions are needed to realise the benefits.

Learntesting delivers a flexible, innovative online learning service designed to put you in control of all your software testing learning and certification needs. It delivers this unique service via a global network of expert training providers, a range of high-quality testing courses and content, powered by learning technology used by some of the largest businesses in the world.

The Learntesting 'Virtual Learning Environment' (VLE) guides and supports your testing education on an ongoing basis with:
• High quality online learning with a variety of content to suit different learning styles and budgets, including fully accredited ISTQB and ISEB courses;
• 'Live Virtual Classrooms' led by experienced tutors for exercise revision sessions and exam preparation;
• A private library of 50 testing and testing-related ebooks, 24x7;
• Global support with 24x7 access to accredited tutors worldwide;
• Online exam-style question papers and answers with detailed explanations;
• Self-registration system with access to a range of free valuable content in the Learntesting 'Testers Treasure Chests';
• Exams available globally.

At Learntesting, we recognize that everyone is different and so provide something to suit everyone according to:
• Personal goals;
• Existing knowledge and experience;
• Budget;
• Study availability;
• Preferred learning styles.

We can provide this because we have invested in an 'industrial strength' solution – scalable from the individual to the largest corporations in the world.

Learntesting provides support for the complementary ISTQB and ISEB software testing certification schemes and also other aspects of software testing:

Certification
• ISTQB Certified Tester Foundation Level (CTFL);
• ISTQB Certified Tester Advanced Level (CTAL):
  - Advanced Test Analyst
  - Advanced Test Manager
  - Advanced Technical Test Analyst
• ISEB Intermediate Certificate in Software Testing.

General Testing
• Agile Testing
• Test Techniques
• A Private Library of Testing Books, Templates & Information

As an independent recognition of achievement, after only six months of operation Learntesting was selected as a finalist for the prestigious 'Learning Technologies Solution of the Year' award from the Institute of IT Training in February 2010.

Visit www.learntesting.com for your nearest Learntesting provider and self-register for free. For more information, please contact [email protected]

www.learntesting.com [email protected] Learntesting Ltd, 5th Floor, 117-119 Houndsditch, London EC3A 7BT.


TechExcel

TechExcel is the leader in unified Application Lifecycle Management as well as Support and Service solutions that bridge the divide between product development and service/support. This unification enables enterprises to focus on the strategic goals of product design, project planning, development and testing, while enabling transparent visibility with all customer-facing initiatives. TechExcel has over 1,500 customers in 45 countries and maintains offices in the UK, US, China and Japan.

Application Lifecycle Management
DevSuite is built around the best-practices insight that knowledge is central to any product development initiative. By eliminating the silos of knowledge that exist between different teams and in different locales, DevSuite helps enterprises transform their development processes, increasing efficiency and overall quality.

DevSpec: DevSpec is an integrated requirements management solution that is specifically designed to provide visibility, traceability and validation of your product or project requirements. DevSpec provides a framework to create new requirements, specifications and features that can be linked to development and testing implementation projects.

DevPlan: DevPlan is a project, resource, and task management tool. It allows users to plan high-level areas of work, assign team members to work in these areas, and then track the tasks needed to complete the activities.

DevTrack: DevTrack is the leading project issue and defect tracking tool, used by development teams of all sizes around the globe. Its configurable workflows allow DevTrack to meet the needs of any organisation's development processes.

DevTest: From test case creation, planning and execution through defect submission and resolution, DevTest tracks and manages the complete quality lifecycle. DevTest combines the test management features of DevTest, DevTrack and TestLink for test automation into one integrated solution.

KnowledgeWise: KnowledgeWise is the knowledge management solution at the core of the entire suite. It is the centralised knowledge base for all company documents including contracts, processes, planning information and other important records, as well as customer-facing articles, FAQs, technical manuals and installation guides.

More information at: www.techexcel.com/products/devsuite.

Service and Support Management
Service and Support Management solutions provide enterprises with total visibility and actionable intelligence for all service desk, asset management and CRM business processes.

ServiceWise: ServiceWise is a customisable and comprehensive internal helpdesk, ITSM- and ITIL-compliant solution. Automate and streamline services and helpdesk activities with configurable workflows, process management, email notifications and a searchable knowledge base. The self-service portal includes online incident submission, status updates, online conversations and a knowledge base. ServiceWise includes modules such as incident management, problem escalation and analysis, change management and asset management.

CustomerWise: CustomerWise is an integrated CRM solution focused on customer service throughout the entire customer lifecycle. CustomerWise allows you to refine sales, customer service and support processes to increase cross-team communication and efficiency while reducing your overall costs. Combine sophisticated process automation, knowledge-base management, workflow, and customer self-service to improve business processes that translate into better customer relationships.

AssetWise: AssetWise aids the process of monitoring, controlling and accounting for assets throughout their lifecycle. A single, centralised location enables businesses to monitor all assets, including company IT assets, managing asset inventories, and tracking customer-owned assets.

FormWise: FormWise is a web-based form management solution for ServiceWise and CustomerWise. Create fully customised online forms and integrate them directly with your workflow processes. Forms can even be routed automatically to the appropriate individuals for completion, approval, and processing, improving your team's efficiency. Web-based forms may be integrated into existing websites to improve customer interactions including customer profiling, surveys, product registration, feedback, and more.

DownloadPlus: DownloadPlus is an easy-to-use website management application for monitoring file downloads and analysing website download activities. DownloadPlus does not require any programming or HTML. It provides controlled download management for all downloadable files, from software products and documentation to marketing materials and multimedia files. More information at: www.techexcel.com/products/itsm/

Training
Further your investment with TechExcel: effective training is essential to getting the most from an organisation's investment in products and people. We deliver professional instructor-led training courses on every aspect of implementation and use of all TechExcel's software solutions, as well as both service management and industry training. We are also a Service Desk Institute accredited training partner and deliver their certification courses. More information at: www.techexcel.com/support/techexceluniversity/servicetraining.html

For more information, visit www.techexcel.com or call 0207 470 5650.


Original Software
Delivering quality through innovation

With a world class record of innovation, Original Software offers a solution focused completely on the goal of effective quality management. By embracing the full spectrum of Application Quality Management across a wide range of applications and environments, the company partners with customers and helps make quality a business imperative. Solutions include a quality management platform, manual testing, full test automation and test data management, all delivered with the control of business risk, cost, time and resources in mind.

Setting new standards for application quality
Today's applications are becoming increasingly complex and are critical in providing competitive advantage to the business. Failures in these key applications result in loss of revenue, goodwill and user confidence, and create an unwelcome additional workload in an already stretched environment. Managers responsible for quality have to be able to implement processes and technology that will support these important business objectives in a pragmatic and achievable way, without negatively impacting current projects.

These core needs are what inspired Original Software to innovate and provide practical solutions for Application Quality Management (AQM) and Automated Software Quality (ASQ). The company has helped customers achieve real successes by implementing an effective 'application quality eco-system' that delivers greater business agility, faster time to market, reduced risk, decreased costs, increased productivity and an early return on investment.

These successes have been built on a solution that provides a dynamic approach to quality management and automation, empowering all stakeholders in the quality process, as well as uniquely addressing all layers of the application stack. Automation has been achieved without creating a dependency on specialised skills and by minimising ongoing maintenance burdens.

An innovative approach
Innovation is in the DNA at Original Software. Its intuitive solution suite directly tackles application quality issues and helps organisations achieve the ultimate goal of application excellence.

Empowering all stakeholders
The design of the solution helps customers build an 'application quality eco-system' that extends beyond just the QA team, reaching all the relevant stakeholders within the business. The technology enables everyone involved in the delivery of IT projects to participate in the quality process – from the business analyst to the business user and from the developer to the tester. Management executives are fully empowered by having instant visibility of projects underway.

Quality that is truly code-free
Original Software has observed the script maintenance and exclusivity problems caused by code-driven automation solutions and has built a solution suite that requires no programming skills. This empowers all users to define and execute their tests without the need to use any kind of code, freeing them from the automation specialist bottleneck. Not only is the technology easy to use, but quality processes are accelerated, allowing for faster delivery of business-critical projects.

Top to bottom quality
Quality needs to be addressed at all layers of the business application. Original Software gives organisations the ability to check every element of an application – from the visual layer, through to the underlying service processes and messages, as well as into the database.

Addressing test data issues
Data drives the quality process and as such cannot be ignored. Original Software enables the building and management of a compact test environment from production data quickly and in a data privacy compliant manner, avoiding legal and security risks. It also manages the state of that data so that it is synchronised with test scripts, enabling swift recovery and shortening test cycles.

A holistic approach to quality
Original Software's integrated solution suite is uniquely positioned to address all the quality needs of an application, regardless of the development methodology used. Being methodology neutral, the company can help in Agile, Waterfall or any other project type. The company provides the ability to unite all aspects of the software quality lifecycle. It helps manage the requirements, design, build, test planning and control, test execution, test environment and deployment of business applications from one central point that gives everyone involved a unified view of project status and avoids the release of an application that is not ready for use.

Helping businesses around the world
Original Software's innovative approach to solving real pain-points in the Application Quality Life Cycle has been recognised by leading multinational customers and industry analysts alike. In a 2010 report, Ovum stated: "While other companies have diversified, into other test types and sometimes outside testing completely, Original has stuck more firmly to a value proposition almost solely around unsolved challenges in functional test automation. It has filled out some yawning gaps and attempted to make test automation more accessible to non-technical testers."

More than 400 organisations operating in over 30 countries use Original Software solutions. The company is proud of its partnerships with the likes of Coca-Cola, Unilever, HSBC, FedEx, Pfizer, DHL, HMV and many others.

www.origsoft.com [email protected] Tel: +44 (0)1256 338 666 Fax: +44 (0)1256 338 678
Grove House, Chineham Court, Basingstoke, Hampshire, RG24 8AG


iTrinegy
Network emulation & application testing tools

iTrinegy is Europe's leading producer of network emulator technology, which enables testers and QA specialists to conduct realistic pre-deployment testing in order to confirm that an application is going to behave satisfactorily when placed in the final production network.

Delivering more realistic testing
Increasingly, applications are being delivered over wide area networks (WANs), wireless LANs (WLAN), GPRS, 3G, satellite networks etc, where network characteristics such as bandwidth, latency, jitter and packet error or loss can have a big impact on their performance. So, there is a growing need to test software in these environments. iTrinegy Network Emulators enable you to quickly and easily recreate a wide range of network environments for testing applications, including VoIP, in the test lab or even at your desktop.

Ease of use
Our network emulators have been developed for ease of use:
• No need to be a network expert in order to use them
• Pre-supplied with an extensive range of predefined test network scenarios to get you started
• Easy to create your own custom test scenarios
• All test scenarios can be saved for subsequent reuse
• Automated changes in network conditions can be applied to reflect the real world
• Work seamlessly with load generation and performance tools to further enhance software testing.

A comprehensive range to suit your needs
iTrinegy's comprehensive range of network emulators is designed to suit your needs and budget. It includes:
• Software for installation on your own desktop or laptop (trial copies available)
• Small, portable inline emulators that sit silently on the desktop and can be shared amongst the test team
• Larger portable units capable of easily recreating complex multi-path, multi-site, multi-user networks for full enterprise testing
• High performance rack-mount units designed to be installed in dedicated test labs
• Very high performance units capable of replicating high-speed, high-volume networks, making them ideal for testing applications in converged environments.

If you would like more information on how our technology can help you ensure the software you are testing is 'WAN-ready' and going to work in the field, please contact iTrinegy using the details below:

Email: [email protected] Tel: +44 (0)1799 543 345 Web: www.itrinegy.com/testmagazine

The T.E.S.T Focus Groups is a complimentary event specially designed for senior software testers, testing managers, and QA and project managers who wish to discuss and debate some of their most pressing challenges in a well-thought-out yet informal setting.

T.E.S.T Magazine, the T.E.S.T Focus Groups' sister product, spends a lot of time speaking and listening to its customers and then seeking out innovative ways to meet their needs. It has become apparent that senior decision makers wish to discuss their current challenges in a meaningful and structured manner with a view to finding pragmatic and workable solutions to what are invariably complex issues. Suppliers, who are naturally keen to meet these professionals, want to gain a clearer understanding of these challenges and identify how, through meaningful dialogue, they can assist.

This logic, coupled with T.E.S.T Magazine's consistent desire to drive the market forward, led us to launch the T.E.S.T Focus Groups for 2011.

Due to the demands put on modern managers, and the consequently limited opportunities available to join together and voice opinions, the challenges consistently faced by today's army of testers and testing managers tend not to get resolved as quickly as enterprise would like. As a market-leading publisher and events business, the organiser understands there should be a format that empowers meaningful debate to help managers and directors overcome their issues. The T.E.S.T Focus Groups therefore provides ten specially designed syndicate rooms, each containing a specialist subject for delegates to discuss and debate with a view to finding pragmatic and workable solutions.

With some of the industry's leading minds on hand to help facilitate and steer each session, the T.E.S.T Focus Groups will quickly become a 'must-attend' event for anyone serious about software testing and QA. Add to this plenty of networking opportunities and a small exhibition, and each delegate is provided a fabulous opportunity to interact with their peers, source the latest products and services, and develop meaningful relationships in an informal yet professional setting.

Subjects to be debated are:

People or Technology – Who Gets the Cash?

The Value of Testing Requirements

Does The User Matter?

Agile Testing

Crowd Testing

Outsourcing

Qualifications, Accreditation, & Exams


Identifying Tester Related Risks

Tester Training

If you are interested in being a delegate at the T.E.S.T Focus Groups please visit: www.testfocusgroups.com/delegates.html or to register visit: www.testfocusgroups.com/register.html

If you are interested in sponsoring this event and hosting a session please visit: www.testfocusgroups.com/sponsor.html

Or to discuss any aspect of the event please contact Grant Farrell on +44 (0) 203 056 4598 or email: [email protected]

www.testfocusgroups.com +44 (0) 870 863 6930 [email protected]

T.E.S.T Focus Groups


There is a school of thought in software testing that debunks the value of positive testing. This school basically states that any test that does not produce a defect is not a good test. Dave Whalen respectfully disagrees.

The value of positive testing

Software tests can be divided into two categories: positive tests and negative tests. A positive test is used primarily, if not solely, to validate that a given system, function or operation works as designed when a user enters the right data, in the right place, at the right time, and clicks the right buttons. Negative tests purposely try to break the system, to verify that it responds as expected and fails gracefully when it gets the wrong data, in the wrong place, at the wrong time. It's with negative tests that we really earn our stripes as testers. A negative test should cause an error. It's expected to cause an error. If it causes an error, and the error is handled correctly, the test passes. A positive test should not cause an error. Does that make it an invalid test? Absolutely not.
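As a hedged illustration of the distinction, the sketch below uses an invented withdraw() function: the positive test confirms the happy path works with valid input, while the negative test confirms the system fails gracefully on invalid input. The function, values and names are assumptions for the example.

```python
# A minimal positive/negative test pair around a hypothetical withdraw() function.
import pytest  # third-party: pip install pytest

def withdraw(balance: float, amount: float) -> float:
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal amount")
    return balance - amount

def test_withdraw_valid_amount():          # positive test: should NOT raise an error
    assert withdraw(100.0, 40.0) == 60.0

def test_withdraw_more_than_balance():     # negative test: SHOULD raise, and be handled
    with pytest.raises(ValueError):
        withdraw(100.0, 140.0)
```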

The ultimate goal
The ultimate goals of positive and negative tests are completely different. We need to verify that something works correctly before we try to break it. If we don't know that it works correctly, then how can we know when it doesn't? Positive tests answer that question. If the system doesn't work as it's supposed to when everything is correct, all other tests, especially negative tests, are really irrelevant.

Let me state right up front – I'm not a professional software developer – not even close. I never claimed to be one. From my limited experience, I like to make sure something works first, and then I focus on what happens when it doesn't. It's also much easier to build software that way.

Imagine my surprise when I was recently told that some negative functionality would be delivered before the positive functionality. I was perplexed! The results were predictable. I couldn't create and save a basic record, but the error handling was really nice. In a nutshell, the system could not do what it was designed to do, but it looked really good.

Car trouble
It reminds me of a car I bought in high school. It looked really nice in the driveway. It had to sit in the driveway – it rarely ran. I remember a friend telling me that the car was over-rated and didn't perform the way it was rumoured to. I just wanted it to start and take me to the store. Apparently it also got really bad gas mileage. Of course, sitting in my driveway it got great gas mileage!

Automotive magazines would advertise all kinds of devices to boost miles per gallon. My ‘friends’ would encourage me to purchase these gadgets. “They will pay for themselves after a couple of tanks”. I just wanted the stupid thing to run, and then I’d worry about how much gas it used.

I view positive tests in much the same way. Show me it works like it’s supposed to and then we’ll worry about what happens when it doesn’t or how well it performs (or doesn’t).

Fun time
I always write and run positive tests first. Once the system can pass the positive tests, the fun starts. Now I get to be creative and break it. Equivalence class testing, boundary value testing and the like are all great test techniques, but they are effectively useless if the system isn't functioning correctly to begin with. If the system is failing with valid data, it doesn't really make sense to test with invalid data – yet. Unless, of course, the system accepts the invalid data – that would be bad. But that's why we need to test both positive and negative scenarios. Test the positive first to make sure the system responds correctly to good data, correct sequences of operation, correct user actions and so on. Then we can validate what happens when entering invalid data, incorrect sequences, or incorrect user actions.

An additional benefit of positive testing – smoke tests! When you receive a new code drop or build, what better way to validate the core system functionality than to run through your suite of positive tests? Positive tests are my first automation candidates. They are typically quick and easy to run. My smoke tests will usually consist of the entire library of positive tests, or a large subset of them: the critical ones.

I like to target no more than 30 minutes to run a valid, end-to-end smoke-test. With a good test automation tool you can achieve a lot in 30 minutes. I like to run an automated smoke test with every new build, on every environment. If we’re doing daily builds, I run a daily smoke test. When the smoke test passes I can be reasonably sure I have a good system to begin more in-depth testing. I can accept the build, and start my test clock. If it fails – I can kick it back.
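A minimal sketch of that idea, with invented test names: the library of positive tests is run as the build smoke test and the 30-minute budget is enforced. In practice a framework's own tagging and selection (pytest markers, named suites) would do the choosing.

```python
# Run the critical positive tests as a smoke suite and enforce a 30-minute budget.
# The test functions are placeholders; real suites would call the application under test.
import time

SMOKE_BUDGET_SECONDS = 30 * 60

def smoke_create_record(): ...      # placeholders for the critical positive tests
def smoke_save_record(): ...
def smoke_query_record(): ...

POSITIVE_SMOKE_TESTS = [smoke_create_record, smoke_save_record, smoke_query_record]

def run_smoke_suite() -> bool:
    start = time.monotonic()
    for test in POSITIVE_SMOKE_TESTS:
        test()                                   # any exception here fails the build
    elapsed = time.monotonic() - start
    if elapsed > SMOKE_BUDGET_SECONDS:
        print(f"Smoke suite took {elapsed:.0f}s - trim it back under 30 minutes")
    return elapsed <= SMOKE_BUDGET_SECONDS       # accept the build only on a timely pass

if __name__ == "__main__":
    raise SystemExit(0 if run_smoke_suite() else 1)
```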

For a bit of extra incentive – consider the doughnut factor. If the smoke test passes, I buy doughnuts for the team. If it fails, the development team buys the doughnuts. I hear bagels work too.

Dave Whalen
President and senior software entomologist
Whalen Technologies
http://softwareentomologist.wordpress.com



For exclusive news, features, opinion, comment, directory, digital archive and much more visit

www.testmagazineonline.com

The European Software Tester

The Whole Story

www.31media.co.uk

Print Digital Online


Software testing qualifications
Raise your professional profile

Software testing qualifications from BCS, The Chartered Institute for IT, provide you with global industry recognition of your skills and experience. They offer a professional foundation for your career development in the growing software testing profession.

By gaining an ISEB qualification, you also become eligible for BCS Professional membership (MBCS), keeping you ahead of the competition at every stage in your career.

Find out more at: www.iseb-exams.com/test or call us on 01793 417 655
