Page 1
NZTester The Quarterly Magazine for the New Zealand Software Testing Community and Supporters
COMPLIMENTARY ISSUE 9 JUL 2015 - OCT 2015
CONFERENCE ISSUE!
Interview with Dan Minkin, Planit Auckland
Testing@Serko Online
A Picture Is Worth a Thousand Words
Why Mobile Testing Is So Important
Test is Dead, Long Live Test!
100 Days of Testing
Consider the Humble Test Manager + Coffee with Viswa, Testing Events
+ all the Conference details & more….
Page 2
NZTester Magazine
Editor: Geoff Horne [email protected] [email protected] ph. 021 634 900 P O Box 48-018 Blockhouse Bay Auckland 0600 New Zealand www.nztester.co.nz
Disclaimer:
Articles and advertisements contained in NZTester Magazine are published in good faith and although provided by people who are experts in
their fields, NZTester Magazine makes no guarantees or representations of any kind concerning the accuracy or suitability of the information
contained within or the suitability of products and services advertised for any and all specific applications and uses. All such information is
provided “as is” and with specific disclaimer of any warranties of merchantability, fitness for purpose, title and/or non-infringement. The
opinions and writings of all authors and contributors to NZTester Magazine are merely an expression of the author’s own thoughts, knowledge
or information that they have gathered for publication. NZTester Magazine does not endorse such authors, necessarily agree with opinions
and views expressed nor represents that writings are accurate or suitable for any purpose whatsoever. As a reader of this magazine you
disclaim and hold NZTester Magazine, its employees and agents and Geoff Horne, its owner, editor and publisher, harmless of all content
contained in this magazine as to its warranty of merchantability, fitness for purpose, title and/or non-infringement.
No part of this magazine may be reproduced in whole or in part without the express written permission of the publisher.
© Copyright 2015 - NZ Tester Magazine, all rights reserved.
Advertising Enquiries: [email protected]
Page 3
IN THIS ISSUE…
Click on title...
4 NZTester Magazine Conference
7 Interview with Dan Minkin Director of Testing, Planit Auckland
10 A Picture Is Worth a Thousand Words David Rodriguez, Fujitsu NZ
15 Testing @ Serko Online
18 100 Days of Testing Chloe Holt, Assurity Consulting
20 Consider the Humble Test Manager NZTester Magazine, Staff Writer
22 Why Mobile Testing Is So Important Andy Parish, Planit
24 Test Is Dead - Long Live Test! Matt Mansell, IntegrationQA
REGULAR FEATURES
17 Testing Events
27 Coffee with Viswa
31 NZTester and OZTester Magazine
Back Issues
The Journal For New Zealand Test Professionals
Well, it’s been another slow drag getting this latest
issue out the door, with lotsa excuses from me!
Truth is that I’ve been endeavouring to get the
Conference proceedings under way and it never
seems to be the right time to get announcements into
these pages while so many things are up in the air.
But anyway, for better or for worse, it’s here.
Talking of the Conference, we’re a go for 10-12
August in Wellington, see page 4 for details and
we’re pretty chuffed to have both Julie Gardiner and
Isabel Evans joining us from the UK. Both are well
known in international testing circles and very
experienced testing professionals so we are
certainly privileged to have these two first ladies of
UK testing on our programme.
In addition we have Matt Mansell and Bryce Day
with us again, along with David Rodriguez who,
like Matt, has provided an article for this
issue of NZTester Magazine; see page 10. Stay tuned
for more presenters and promotions as we
confirm them!
We’re also happy to advise that we’ll be keeping the
fees the same as last year. Registrations are open,
our earlybird discount runs until 20 July and as
always we’re providing bulk discounts. We also
expect to release a number of promotions along the
way, just to make things easier for some folk to get to
Conference who might not otherwise have the
opportunity.
In this issue, along with the Conference details we
also have articles from first-timers David Rodriguez,
Andy Parish and Chloe Holt plus seasoned
campaigner Matt Mansell along with an offering
from our own staff writer.
We also have a few of our regular features; this
issue interview is with Dan Minkin of Planit
Auckland and our Coffee with Viswa column gets an
expanded treatment.
Anyway, I trust you enjoy the issue and we certainly
hope to see you at the Conference in August.
Keep well and stay in touch.
Page 4
Crossing the Great Divide
Page 5
SCHEDULE Wed 12 August 2015
08:45 Official Conference Opening & Welcome - Geoff Horne, NZTester
Magazine
09:00 KEYNOTE The 2015 Survival Guide - Lessons for Testing in the
Wild - Julie Gardiner, Hitachi Consulting
10:00 Coffee
10:15 The Abolition of Testing? - Matt Mansell, IntegrationQA
11:15 Let the Picture Tell the Story - David Rodriguez, Fujitsu NZ
12:15 Lunch
01:15 Dealing With Professional Manipulation - Geoff Horne, NZTester
Magazine
02:15 Deploying Test Tools Within Government Agencies - Bryce Day,
Catch Software
03:15 Coffee
03:30 KEYNOTE Restore To Factory Settings - Isabel Evans,
Independent Consultant
04:30 Panel Discussion - The Floor Is Yours! - All presenters
05:30 Conference Close - Geoff Horne, NZTester Magazine
05:45 Reception
07:00 Conference Dinner
Page 6
TUTORIALS - 10/11 August 2015
Heuristics, Bias & Critical Thinking for Testers – Matt Mansell (NZ) – wanting to get into exploratory testing? Then this full day Tutorial is for you as Matt takes us through the ins and outs of investigative test approaches and techniques.
Test Attacks For Testing Mobile Apps – Julie Gardiner (UK) – this is for anyone who wants to be more effective at finding issues and defects in mobile applications. Given there are so many platforms & combinations thereof, where do we start?
Communications Skills for Testers – Julie Gardiner (UK) – why is it we can find bugs so well yet sometimes cannot describe the details, or we know the story but struggle to articulate it? Julie is a master of both the written and spoken word; learn her techniques as she takes you through the whys and wherefores.
Testing the Data Warehouse – Geoff Horne (NZ) - found yourself with one of these to test? Not sure how to go about it? Then this session is for you as Geoff provides the basics on how to approach this quite unique testing challenge.
Testing with #8 Wire – Isabel Evans (UK) – it’s often been said that us kiwis would test with a piece of #8 wire if we could. Well now we can as Isabel shows us how to get the best out of every testing effort regardless of the resources that may or may not be available.
Rainmaking for Test Managers – Julie Gardiner (UK) – a rainmaker is described as someone whose influence and character just seem to make customers want to follow them. Learn from Julie as she shows you how to make the rain in the test management space!
Applying Emotional Intelligence To Testing – Julie Gardiner (UK) – No matter how intelligent we might be, if we do not learn to bridle the human emotions that run alongside it, we can find ourselves very frustrated indeed. Pick Julie’s IQ & EQ as she shows how to mature emotions and focus them in a positive manner within the context of the testing lifecycle.
12 Tips for Developing Practical, Future-Proofed Test Automation Suites – Geoff Horne (NZ) – automation has been around for years now yet there are still a high number of misapplications out there. Geoff will take you through the top twelve keys to making automation work for you, no matter what your technology or environment might hold.
Page 7
Our interview this issue is with Dan Minkin of Planit.
Dan has been around the testing traps in both the UK
and New Zealand and brings his unique experiences to
our shores.
NZTester: Can you please describe Planit
Planit is the largest Testing Services organisation
across Australia and New Zealand, currently in
release version 2.0! V1.0 was founded by Chris
Carter in Sydney in 1997, as the Australian branch
of a UK testing consultancy, ImagoQA. In 1999, Chris
effected a management buy-out and Planit was born.
Since then there has been significant organic
geographic expansion into Melbourne, Wellington,
Auckland, Perth, Hamilton and Christchurch.
V2.0 was released in January when we entered into
a partnership with leading mid-market Private
Equity firm Archer. This has allowed us to release
significant capital to invest in the next stage which
includes focus on entry into the UK market and
further European, Asian and Australasian services
and geographical expansion.
NZTester: What products and services does
Planit offer?
We offer a comprehensive range of testing services,
employing around 700 functional and non-functional
testing specialists. We provide consultancy and
advice, testing delivery and training and our testers
can be individually deployed or engaged via a
managed service model.
Right now, Service Virtualisation is one of the
most exciting additions to our offering. It is used
to mitigate risks in development while increasing
quality and speed of delivery by simulating costly
or constrained systems.
In a world first, we have recently partnered with
IBM to launch a Service Virtualisation as a Service
(SVaaS) solution. Delivered via the Planit Cloud,
we are providing SV via a highly affordable and
accessible pay-as-you-go model, removing the
barriers to entry for this game-changing technology.
Another key area of interest at the moment is Digital.
This is quite a broad topic but two areas of focus for
us include the devices market, where we are running
mobile-first projects across new technologies such
as wearables. The other area is the delivery
methodology; in a market where speed and time to
market is critical, we have started to focus on strong
Agile and continuous delivery practices such as test
automation and continuous integration.
NZTester: What do you believe makes Planit
different?
If you asked our staff, our clients and our partners
what word would describe Planit, it would be ‘quality’. We
achieve this by being different in so many ways.
The fact that all of our consultants are permanent
and therefore are committed testers wanting a
career makes us unique in the global marketplace
as an independent specialist. This model enables
us to operate a bench to respond to client needs
immediately. That, together with the ability to call
on the skills and manpower of all eight business
units and the fact that all of our directors and
principals have come through test or technical
management roles, means that as a package we do
offer a unique proposition.
This issue’s interview is:
Dan Minkin Testing Director
Planit Testing Services
Auckland
Page 8
Our recruitment practices are another key
differentiator, as we are ever scouring the globe for
new talent, ensuring each new recruit meets our high
standards through a strict recruiting process. This
includes formally assessing candidates to ensure
quality of work, communication skills and cultural fit.
Even the way we truly partner with our clients
makes us different from the pack; we offer so many
services and commercial offerings that are unique in
my experience, I could keep going for hours!
NZTester: What do you think makes a Test
Manager or Analyst come to work for
Planit?
Many of our differentiators and our strong growth
path make Planit an employer of choice for many.
But in addition to that, we spend a lot of effort
nurturing a really positive atmosphere in our office.
We describe ourselves as a close-knit community.
In the Auckland office, we have office-based get-
togethers every fortnight. We also invest a lot in
providing support for individuals and their families,
many of whom have moved across the world.
During interviews, I have found two factors to be
most significant among candidates. These are the
desire to experience variety of work and the desire
for training and career development. Both of these
are at the heart of any role at Planit, as we upskill
our consultants with our world class training and
provide them with opportunities to work on some of
the most interesting projects on offer in the region.
NZTester: Where do you believe NZ’s approach to
testing is going well?
Firstly, keeping things onshore works for New
Zealand. Even the biggest internationals have such
a sense of local responsibility that they ensure that
the local market remains the key growth area, only
dipping into the international scene once the local
market has been fully considered.
Secondly, and possibly linked with this, there is a
mature attitude in NZ: looking to see what the wider
global market and, more locally, the Australians have
done, waiting to see what gets shaken out, and
learning the lessons from brother and sister
companies over the ditch.
Finally, the speedy acceptance of Agile practices and
Agile testing methods has been a tremendous push in
the right direction and one I think that has helped
testing not be regarded as a separate and disposable
discipline.
NZTester: Where do you believe the challenges
for NZ companies lay?
If you had asked me two or three years ago,
I would have said acceptance of testing as a fully
mature discipline, given the same credibility as
Business Analysis or Development.
But this seems to have moved on.
I cannot talk for other companies, but the immediate
and ongoing challenge for us now is recruitment to
maintain the quality standards we have set ourselves
over 15 years and the last 6 specifically in New
Zealand. For six years now, growth has been
restricted by the number of quality people we have
been able to find. Finding professional testing staff
with the right communication, self-management,
customer service skills and drive is an ongoing
process. Quite simply we haven’t closed the doors in
6 years. Other than that, an on-going challenge is
keeping the staff we have up-to-date with the latest
skills, be that Agile, BDD, Automation or Digital
skills. I’m sure every testing company finds that a
challenge; good thing for us that we specialise
in training!
NZTester: Where do you believe NZ’s approach
could improve?
Operational Acceptance Testing is a much more
integrated and accepted set of disciplines in the
United Kingdom, where I learnt my trade. In much
the way that Security and Performance Testing
services have become much more in demand,
I suspect it will take a large infrastructure failure or
disaster scenario to move OAT into the mainstream.
NZTester: Do you believe that overall the
standard of testing in NZ is improving?
Yes. Both the standard of testing and the belief in
testing have steadily improved in NZ. Scanning
across the market over the past three years,
companies who embraced testing back then are now
operating at a noticeably higher standard than those
who did not believe in testing and are only now,
more recently, beginning to embrace it.
Page 9
Editor’s comments: Thanks for writing for us
Dan. Had a quiet snigger to myself over the
2004-2008 period as I think I had a similar
experience around the same time; just about
everything I worked on got canned. Perhaps it was
just the times - Ed.
NZTester: Where do you believe the next
initiatives in testing lay? What’s coming next?
In NZ? Internationally?
Predicting the future in technology is frankly
impossible. There will be something that will
surprise 99% of us coming in the next five years I’m
sure … but I don’t know what it is! I tend to work on
1-3 year horizons. At this distance we have a fair
chance of identifying a current trend in its early stage
and keeping up with it.
Some initiatives and trends which will continue to
grow over the coming years include the use of open
source tools, continued change in the skills required
of testers (more technical, more adaptable), shift-left
initiatives in their various guises (DevOps,
Behavioural Driven Development, Service
Virtualisation). Business Intelligence and Digital
projects (and the consequent testing tools and
processes that are needed) cannot be ignored.
NZTester: Do you have a testing horror story to
share?
There was a time I deleted 50,000 customer records
while clearing down customer records for a test, and
pointed the query at production (these things were
possible in the ’90s).
Or the time I rang the help desk because the program
I was testing correctly generated the “Please ring the
help desk” error message. The kicker being I had only
programmed in the error message text earlier in the
week.
Finally, not so much a testing horror story rather
a personal project horror story. Between 2004 and
2008, each and every project I worked on was cut
short, de-scoped or cancelled. For four years I
couldn’t point to a single dollar (or pound) of business
value that I had helped deliver. However the value of
testing is often that it allows the business to make
such difficult decisions, so there was value of sorts.
Thankfully at Planit, those are fading memories!
Page 10
How many times have you read through lengthy, detailed, wordy Test
Plans and thought “What are we trying to achieve here?” As Testers
our job is to understand complexity, and be able to convey that
understanding to others. Diagrams aid understanding. So why not let
diagrams tell the story?
In this article we will look at the benefits of using diagrams in testing.
Starting with a classic example of a diagram – the London Tube map,
we will then look at how informal sketches and then formal diagrams
of different types can be used.
Test planning involves reviewing various types of documents, including
Business Requirements, Solution Architecture and Network Designs.
These will often be very detailed – with complex diagrams. It is well
worth developing skills in producing diagrams at the right level for
inclusion in Test Plans. Such diagrams can then be used in discussions
with Project and Programme Managers and Business Sponsors.
Diagrams are not a silver bullet. They can help in gaining understanding
of the systems under test and then be used to explain clearly the
approaches to be taken to test them.
—————————————————————————————-
1 Mr Beck’s Underground Map – Ken Garland 1994, Pages 16 and 17 http://www.theguardian.com/artanddesign/2009/nov/26/london-tube-map-design#img-1 http://www.theguardian.com/arts/pictures/0,8542,1406256,00.html
A Picture Is Worth a Thousand Words— Using Diagrams In Testing
by David Rodriguez, Fujitsu NZ
A Classic Example of a diagram – the London Underground Map:
Page 11
In 2005 I attended an exhibition at the Design
Museum in London called “You are here – The
Design of Information”. A key exhibit was the London
Underground map drawn by Harry Beck in the
1930s. It was fascinating to see how Beck had
managed to simplify the complexity of the London
Underground network into a one page diagram.
He was an engineering draughtsman and his original
idea was based on an electrical wiring diagram. The
main features were: simplification of route lines to
verticals, horizontals or diagonals; the expansion of
the central area; the elimination of all surface detail
except for the line of the River Thames, itself
presented in the same stylised form as the
route lines¹.
How can diagrams help in testing?
As Test Analysts and Test Managers we need to
understand complexity to work out ways to test
infrastructure and applications. Projects will often
involve client server and web-based applications,
web servers, database servers, application servers,
virtualised environments, local and wide area
networks, storage area networks, data centres – in
multiple locations and countries. To test applications
we need to understand information flows and
processes within and between applications. Test
Plans will need to contain schedules identifying
interdependencies with other projects.
Testing is often an important part of the risk
mitigation strategy of a project. Test Plans need to
articulate clearly the ways in which this will be done.
Diagrams can contribute to expressing ideas
concisely.
Attention to detail is a key skill for Testers, but I have
found that focussing too much on the detail can mean
losing sight of what we are trying to achieve. I like
the concept of describing the project on one sheet of
paper, with an explanation of the main aspects of
testing. I have worked on projects for which this was
very difficult to do, but in most cases it is possible.
We will look at using diagrams informally – such as
in white-board sessions. These informal diagrams
can then be translated into more extensive pictures
in Test Plans. These pictures summarise the testing
approach to all parties involved in the project, from
the Business Sponsor, to Project Managers,
Developers, Business Analysts, Network Engineers
and of course Testers. The key idea is “How can
I make things easier for my readers?”
Figuring it out – informal diagrams
When creating a painting you need to understand
your subject and understand the background.
The first step is to draw a sketch. This helps to
identify the main focal point and its relationship
with the other features of the picture.
The same principle can be applied to projects.
You need to be clear about the main purpose
of the project and its background, but also to
understand the various components and how
they fit together to make it work. The first steps
usually involve white-board sessions to get the
thinking going. Drawing on the board allows
freedom to get things down, but also to rub them
out and change them as understanding grows.
White-boarding your test strategy with your
Project Manager helps to get communication
going – with the opportunity for early feedback.
Often Test Planning will occur concurrently with
Design activities. Diagrams drawn to aid
understanding for testing can also contribute and
be useful in other areas.
I have included an example here of a white-board
diagram for a Subscription Service application.
This formed the basis of the formal Context
diagram shown in the next section. Such
diagrams can be invaluable – and are easily
captured with the camera on your mobile phone.
They can then be translated into more formal
diagrams and used in project documentation.
Page 12
Formalising Diagrams using a Tool
The standard tool for drawing diagrams is Microsoft
Visio, which has been used to draw the following
sample diagrams. It is possible to use features in other
tools such as Excel, and the presentation software tools
PowerPoint and Prezi. Obviously every project is
different and it is a matter of choosing the right type
of diagram, with the appropriate level of detail, which
best expresses the testing challenges of the project.
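Beyond drawing tools like Visio, one lightweight alternative (our suggestion here, not a tool the article uses) is to express diagrams as text, e.g. in the Graphviz DOT language, so they can live in version control alongside Test Plans. The sketch below builds DOT text for a simple context diagram in plain Python; the component names are illustrative stand-ins for the Subscription Service example, not taken from the article’s actual diagram.

```python
# Minimal sketch: generate Graphviz DOT text for a context diagram.
# Feeding the output to the `dot` tool would render it as an image;
# here we only construct the text.

def context_diagram(system, externals):
    """Return DOT text: one central system box linked to each external party."""
    lines = ["digraph context {", "    rankdir=LR;", f'    "{system}" [shape=box];']
    for ext in externals:
        lines.append(f'    "{ext}" -> "{system}";')
    lines.append("}")
    return "\n".join(lines)

dot = context_diagram(
    "Subscription Service",
    ["Subscriber Portal", "Payment Gateway", "Legacy Subscriber Data"],
)
print(dot)
```

Because the diagram is just text, a reviewer can see exactly what changed between versions of a Test Plan, which is harder with a binary Visio file.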
Context diagrams
I find that Context diagrams frequently prove to be the
most useful in the early stages of the project. They help
to identify how the main components of the solution fit
together, and how the delivery of the solution may be
broken down.
The diagram above captures the main business
functions and software components of the solution.
Down the left hand side the main components of the
releases are shown. In the bottom right corner the
need for migration of subscriber data from the old to
the new application is captured.
Solution Architecture
Often diagrams showing hardware configurations can
be borrowed from Solution Architecture documents.
On some occasions I have found it beneficial to create a
simplified version for discussion of testing activities.
The above diagram represents an application
with a web portal and thin client
configuration. The internet gateway is located
within the demilitarised zone (DMZ). The
application and database servers are located
in the Main Office.
Information flow diagrams
This diagram captures the interfaces
and information flows of a multi-location
project. Data is retrieved from multiple
sources and put through a calculation engine.
Page 13
Scheduling
This diagram represents the testing phases and main
testing activities of three interrelated projects. It helped
to keep a clear focus on project responsibilities, what
was due when and the various interdependencies. Visio
includes templates for schedules.
Conclusion - What are the benefits of
using diagrams?
Drawing a diagram takes time, but the benefits do repay
the investment.
Rough diagrams on a white-board are useful in
developing early understanding of a project. These
diagrams can then be formalised and used within
testing documents.
I have often found that a Context diagram has proved
to be a useful trigger for discussions with Project
Managers, project team members and in particular
representatives from the business. Rather like
conventional drawing and painting – skills improve
with practice.
A customer project sponsor recently said to me that a
context diagram in a proposal document had given him
confidence that his project had been properly
understood.
So I would encourage you to get the creative juices
flowing, use the white-board to get the concepts
mapped out and then translate them into formal
diagrams. Instead of writing another 1000 words in
your Test Plan, create a diagram instead. Spend the
time to save time in the long run.
[Figure: “Projects 1, 2 and 3 Testing Phases and Responsibilities Summary” – a schedule diagram (September through March) showing each project’s testing phases, from proof of concept through system integration, functional, performance, failover and UAT testing to production verification testing, together with equipment-delivery milestones, go-live dates and cross-project interdependencies.]
Key Points about Diagrams
- Diagrams aid understanding
- Need to be created with the right level of detail
- Turn white-board sessions into formal diagrams
- Use diagrams to describe scope and boundary of testing
- Use diagrams to tell the story
David Rodriguez is a Test Manager based in
Wellington at Fujitsu New Zealand. Just to
prove that Testers are human here is a picture
of him working on a watercolour painting. He
can be contacted at
[email protected]
Page 14
TestAnalytics = TestIntelligence

Hidden in the depths of your test repositories and not even accessible by your
management tools is testing gold!
What gold you may well ask? Well, for example, your project might only be
wanting to know that all the test cases have passed and all the defects are fixed,
in other words, whether the completion criteria have been met. However what if
on the Friday before your Monday go-live, the team found and fixed 10 Severity
1 defects? Completion criteria would still be met however what would your
confidence be like and how would you communicate it? Better still, how could
you pre-empt this situation?
The gold is information about your product-under-test beyond the garden-
variety and vanilla-flavoured. And it’s almost certainly sitting in your test
repository or database yet your management tools won’t know it’s there or
even what it is, because they only count things.
Want to get it out and use it? We can help.
We’ve developed a set of tools and services to get this gold out from its hidden
recesses, onto paper and into the hands and before the eyes of those needing it.
Click below to find out more!
NZTesterMagazine
TestAnalytics
Find the gold and convert to cash!
Page 15
For this instalment of the Testing@ series, I visited
Serko Online Ltd, another of New Zealand’s software
success stories, at its head office in Parnell,
Auckland. I was hosted by Rob Hawker, CIO, and John
Kubiak, who describes himself as the Serko Tester’s
Guild Lead and Coach; after spending an hour in
their company, I found out why.
More on that in a moment….
Serko Online has been around some 15 years,
initially as Interactive Technologies then as part of
the Gulliver’s Travel group. It became Serko Online
in 2007 after a management buy-back effected by
the current owners Bob Shaw & Darren Grafton.
Developing corporate travel booking solutions, the
company enjoys a sizable client base primarily
across New Zealand and Australia but also further
afield. There is even an office in Xi’an, China, made
famous by the discovery there of the Terracotta
Army in 1974.
The Serko product is browser-based and I detected
a certain amount of pride in Rob’s voice when he
mentioned that a mobile offering is about to be
launched on Android and iOS platforms.
With 150 personnel across the globe, Serko has
grown primarily courtesy of its self-confessed
customer focus and enviable track record in
delivering exactly what the customer requires,
even if it means wholesale modifications to the
application. Consequently it has been ‘encouraged’
to move on from its previous Waterfall-based
approach to a more ‘agile’ focus, although John was
quick to point out that this is definitely agile with
a small ‘a’!
Using a Kanban workflow approach, development
teams of between 2 and 8 team members are
assigned to particular application modules - apart
from one which is focused upon Serko’s largest
client. Teams comprise a lead, business analysts,
developers and testers; however, there is a high
degree of job cross-pollination, especially in the
smaller teams where a handful of people are
required to cover multiple tasks. In true agile style,
the whole team is responsible for delivery and either
reaps the reward for a job well done or the kick in
the seat if not!
Rob outlined that a great deal of emphasis is placed
on software design and the practice of developing
acceptance criteria as part of the initial process with
all team members expected to contribute regardless
of job designation. Over time the whole team
collaboration focus has vastly improved the quality
of the final product and as an organisation, the
company now prides itself on churning out as close
to clean code as it can.
Serko deploys a total of 15 testers (or rather people
who do testing) across New Zealand and China and
25 developers (or rather people who write code).
A great deal of effort has been expended ensuring
the developers are able to spend the lion’s share of
their time writing code and are divested of some of
the more traditional overheads developers have to
carry such as documentation, which is handled by
other team members.
All testing is heuristics-based following exploratory
testing principles and John indicated that there were
no formal test plans as such. While testers are
considered gatekeepers, Serko runs a no-blame
culture, i.e. if a defect is found post-release, the tester
(thankfully, I’m sure) doesn’t have their feet held to the
fire. Indeed, one of the primary modi operandi is
the concept of experimentation and investigation –
mistakes are viewed as opportunities to continually
learn and improve upon. Team members are
encouraged to try out new methods and approaches,
discuss differing contexts and use evidence-based
decision-making to bring the team to agreement on
moving forward.
Testing @ Serko by NZTester Magazine Staff Writer & John Kubiak, Serko Online
Page 16
Wanna Get Published?
Our formula for selecting articles for
publishing:
Good + Relevant = We’ll Print It (well,
digitally-speaking anyway)
Good = one or more of: thought-provoking, well-articulated, challenging, experience-based, technical skill-based, different perspective to mainstream, unique….
Relevant = one or more of: emerging
trends, new technology/methodology,
controversial (within reason), beyond the
basics (eg. testing is good, defects
are bad)….
Serko has also made use of automation, more at the unit level than anywhere else. Using Selenium as the engine and the Redwood framework, automated test input code is constructed by the developers with each job. One of the tester's tasks is to develop sets of test data covering the possible scenarios the application is designed to encounter; these are then defined using Redwood and executed via Selenium. With these automated tests carried out at unit level, very little manual regression testing is required.
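Serko's Redwood framework and test harness are not public, so the pattern described above can only be sketched in general terms. The following plain-Python sketch illustrates the data-driven idea: testers author tables of scenario data, and a generic runner feeds every row through the same automated check. All names here (validate_booking, booking_scenarios, run_scenarios) are hypothetical, not Serko's code.

```python
# Hypothetical stand-in for the application logic the automated tests exercise.
def validate_booking(origin, destination, travellers):
    if origin == destination:
        return "error: origin equals destination"
    if not (1 <= travellers <= 9):
        return "error: traveller count out of range"
    return "ok"

# Tester-authored scenario data: each row is (inputs..., expected outcome).
booking_scenarios = [
    ("AKL", "WLG", 1, "ok"),
    ("AKL", "AKL", 1, "error: origin equals destination"),
    ("AKL", "WLG", 0, "error: traveller count out of range"),
    ("AKL", "WLG", 9, "ok"),
]

def run_scenarios(scenarios):
    """Generic runner: executes every row and collects any mismatches."""
    failures = []
    for origin, dest, pax, expected in scenarios:
        actual = validate_booking(origin, dest, pax)
        if actual != expected:
            failures.append((origin, dest, pax, expected, actual))
    return failures

failures = run_scenarios(booking_scenarios)
assert failures == []  # every tester-authored scenario behaved as expected
```

The appeal of the approach is the division of labour it enables: developers maintain the runner and hooks, while testers extend coverage simply by adding rows of data.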
All testing training is carried out in-house with John
taking the lead in ensuring that all testers are
performing to the highest possible standards.
Interactive and online learning is encouraged and
each team member is coached in the ‘art’ of passion; not just for the job but also for what the company stands for as a whole. I noted in reception that a couple
of walls were covered with artwork where personnel
had obviously been given the opportunity to express
their perceptions around working for the company;
there were some interesting pieces!
Rob used an interesting analogy when explaining how the company measures its output. Rather than capture metrics as such, each release is monitored using a red-light/green-light approach. A green light, while good news, suggests that perhaps the deliverable wasn't pushed hard enough in terms of time/cost/quality; in other words, it was all a little too easy and can be improved upon. Two red lights indicate that maybe it was pushed too hard, as issues were encountered. One red is the optimum, as it means issues were encountered but were not so bad as to cause major problems. Certainly an interesting approach.
Overall, I was impressed by the lean, mean and hungry approach, and while there is ample opportunity for experimentation and investigation, it doesn't mean that there is a lot of ‘fat’ built into the development process. As always, the proof of the pudding is in the eating, and if the feedback from Serko clients is anything to go by, the company is doing a superb job.
Page 17
Testing Events
If you have an event you’d
like to promote on this page,
email [email protected]
Boston, MA, USA
5 - 8 October 2015
Maastricht, The Netherlands
2 - 5 November 2015
Page 18
Having completed Assurity’s Graduate Programme as part of the July
2014 intake, Chloe Holt gives a rookie tester’s view of her first 100 days
on a client site. I’ve been working as a professional tester on client site
for just over 100 working days. It’s had its challenges and frustrations,
but it’s also extremely satisfying to see the project I’ve been working on
make its way to completion.
Though I now have a taste of the industry and some experience, I still
consider myself a rookie tester and will continue to do so for a while
longer. Even after only 100 days of testing, I’ve found the niggling
temptation to approach a product/system/idea with my own
assumptions and preconceived ideas of how it should behave has
already kicked in. However, I’ve also found that, although it is helpful
to have clear expectations, our assumptions can be dangerous and
misleading.
Two observations as a rookie tester are:
Challenge #1
The client can know what they want
The developers can know what they want to make
These two things aren’t necessarily the same
Our Approach
As consultants, we can have an insight into the workings of the two
parties. A key part of our role is to open up communication channels
between the people involved in the project and to help them to work
alongside one another – as opposed to heading in completely different
directions. I found it really helpful working directly with those on the
client side and the team of developers. Clear bug reporting tools helped
promote transparency between all involved. One man’s bug may be
another’s pet – but at the end of the day, the client has final call on what
is desired behaviour and what is not.
If this information is accessible to everyone and in a format that
promotes discussion, then there is visibility, a quick feedback process
and a way of tracking how these requirements evolve over time.
Challenge #2
Testers get less and less popular as deadlines get closer and closer and
closer.
100 Days of Testing
by Chloe Holt, Assurity Consulting
Page 19
Chloe Holt is a Test Analyst at Assurity Consulting
in Auckland. She can be contacted at
[email protected]
Our Approach
It’s human nature not to like people critiquing
our work and highlighting our mistakes. Even as
a tester, I’m not a huge fan of people scrutinising
my work and trying to pull it to pieces.
Therefore, it makes perfect sense that the
testing team doesn’t always have the most
popular job.
However, if testing is approached with genuine
care and concern for the quality of the project –
both the client and the developers seem to pick
up on this. Everyone will be a lot less defensive
and more cooperative if they know that you are
working with and not against them.
We developed good relationships with the
developers and the client, enabling both parties
to see that we were on their team and keen to
help push the project to success. This happened
gradually over time. We intentionally worked
on site with the developers – as well as spending
time on site with the client.
Despite the convenience of emails and phone
calls, when it comes to building relationships
with people, nothing trumps talking to them
face-to-face! We gradually bonded over a mutual
understanding of the product, frustration with
its issues and a keenness to get it across the line
with a high level of quality.
We also discovered the areas where we were
able to help one another out to support the
overall progress of the project.
A little food bribery every now and again also
goes a long way!
Page 20
Traditionally the role of the humble Test Manager
has been defined loosely as three-pronged with
specific responsibilities around each:
1. To determine the test architecture and
establish and manage the governance
processes around it.
2. To strategise and plan testing, project manage
it to ensure it remains on schedule and report
on progress.
3. To manage the day-to-day activity of the test
team and its requirements and objectives.
As most NZTester readers will know, I’ve been in
and around testing for many years and would like to
think that I can operate across all three prongs with
no difficulty. No doubt many other experienced test
managers feel quite comfortable doing the same.
However at a recent conference, I attended a session at which a Senior Test Manager presented on a major programme of testing work where these three sets of responsibilities were split across three different people. Given that overseas programmes tend to be, on the whole, larger than the local variety, I surmised that this structure was necessary due to the size of the work at hand; however, upon talking with the presenter after the session, I found out it was no larger than any of the programmes I have run. The reason for dividing the role into three, I was told, was to ensure that each responsibility was afforded the best possible skill and attention in order to see the programme achieve its objectives.
Now this interested me as I continue to observe
Test Managers, myself included, juggling deadlines,
defects, budgets, quality, coverage etc. - all the things
that make our days so challenging. The usual course
of events is that this bunch of intangibles is
continually reviewed and balanced to ensure that
testing meets its requirements and objectives.
However if a Test Manager perpetually struggles in even one of these areas, it often creates trouble in the testing camp, and of the type that can leave the Test Manager with limited options. The conundrum is often not recognised for what it is and the Test Manager is seen as “not cutting it”, “lacking depth”, “wrong for the job”, “too technical/not technical enough” etc. In extreme cases I've seen this scenario lead to a Test Manager leaving a project or taking a lesser role within it, and whilst everything is done to ensure that he/she doesn't expire in shame, the reality is that some sense of failure is always evident. We wouldn't be human if there wasn't.
However in this light, I wonder whether the Test
Manager has been unfairly treated, albeit
unknowingly by those who do not have sufficient
exposure to testing to fully understand its
idiosyncrasies. A Test Manager has to be a manager
of people and of tasks, an effective communicator,
a careful strategist, an efficient planner, a diplomat,
a consultant, a scheduler, a presenter, an advocate
and general ace fortune teller/crystal ball gazer in
addition to being an accomplished Test Analyst. After
all, Project and Programme Managers will often have
a PMO (Project Management Office), assistant Project
Managers, PAs, schedulers/planners etc around
Consider the Humble Test Manager By NZTester Magazine Staff Writer
Page 21
them to assist; however, the Test Manager is often expected to do everything him/herself. Now while some of us might like to think we can do anything and everything, maybe we've been trying too hard for too long. Maybe this is why we are so susceptible to stress, burnout, self-devaluation, frustration and the “occasional” explosion. Maybe this is why some Test Analysts would rather have teeth pulled than become a Test Manager, or why many see testing as merely a stepping stone to something else.
Now I'm not advocating that every project needs three people instead of one to take on Test Management responsibilities. Nor am I suggesting that every Test Manager necessarily needs assistance. However, just maybe, in certain circumstances the most appropriate solution to a problem is to put people where their skills and abilities are best deployed. If a Test Manager struggles with strategy and planning, provide appropriate assistance (how will be down to each situation) and allow him/her to focus where skills and aptitudes are best utilised. If a Test Manager needs a bit of beefing up around test architecture or governance, same thing. Such assistance does not need to be full time; often an advocate coming in alongside to support, if only for a season, is all that may be required.
A colleague once likened Test Management to herding cats, which I think can probably be applied to IT project management or any management role in general. However with testing there are always the variable unknowns: we never know how many defects will be found nor how long it will take to remedy them. The Test Manager often needs to grip the tiger by the tail and hang on for dear life. If he/she lets go then chances are there will be lots of grovelling in the dust when the tiger changes tack and does a 90 degree turn with no notice. Defects, test coverage and general testing trends often mean we have to change tack in such a manner.
I’ve worked on plenty of projects and programmes
over the years where I have followed another Test
Manager and sometimes more than one. I think the
best one was where I was Test Manager #5. Often
when I lift up the bed sheets on a project, I find that
the outgoing Test Manager has not actually done
much “wrong” as in “not right” but more that he/
she struggled in one or more of the areas
outlined above. Perhaps if there had been specialist expert assistance available to help over the hump, then there may have been every chance of a successful outcome.
If you're in the situation that I have described above, please do feel free to email me at [email protected]. I'd be only too happy to point you in the right direction where I can.
And in case you’re interested, my solution for
herding cats is really quite simple. Get yourself a
raft, pick each cat up and place it on the raft then
when you have them all aboard, float it down
the river!
Page 22
In the highly competitive digital world of mobile applications, the ability to get high quality apps to market quickly can make or break the success of a product or company. With new applications for iOS, Android and Windows platforms being released daily, all battling for media attention and consumer dollars, the pressure to get apps built, tested and deployed has never been greater.
Testing methods that have worked previously for web and desktop applications do not meet the needs of mobile applications, as test matrices and fragmentation have become far too complex.
Companies must now carefully consider both usability and functional requirements and test their applications across different devices, models, networks, operating systems, browsers and locations.
This calls for a new approach, one that presents
benefits and challenges.
Key Points:
Pressure to get mobile apps built, tested and launched quickly and effectively has never been higher.
Testing methods that have worked for web and desktop apps fall short for mobile applications.
A new approach is required specifically for Mobile Testing.
Challenges
Mobile applications present many challenges for
today’s test teams, many of which differ from
traditional projects. Defining the scope of mobile
testing must take into consideration the following:
Devices and Operating Systems
With a mass of device makers each producing a multitude of devices released year after year, test teams must carefully consider the various operating
systems, screen sizes and hardware specific to each of these devices. In doing so, they need an adequate coverage of handsets to test against including various generations of operating system updates. Testing must pinpoint target users of the app under test, prioritising their chosen devices, which includes being up-to-date with the most popular devices in the market.
Networks
Networks are an important and complex consideration as mobile applications depend on both Wi-Fi and 3/4G connections. Network test scenarios should cater for offline activities as well as interruption to calls, alerts and low battery, testing the data connection and the expected responses. Test teams should also consider the network provider, as each has a different network infrastructure which can also affect performance and user experience.
Application Types
There are 3 types of mobile applications, each with its own set of challenges for test teams:
Native apps are client-side apps downloaded from stores onto the device. These require dedicated testing effort across each platform/language in which the app is designed and coded.
Web apps are mobile optimised webpages,
which access websites through a mobile
device’s browser. These require tests to be
run on different types of browsers on devices
connected via a mobile or Wi-Fi network.
Hybrid apps are a combination of both native and web, which run on the device and are platform dependent. While the code remains the same, there are different factors that come into play for testing due to the need to address characteristics of both native and web apps.
Why Mobile Testing Is So Important
by Andy Parish, Planit Testing Services
Page 23
Strategies
Development Methodologies play an important part
in defining a mobile test strategy.
Mobile Testers need to be highly skilled and practiced in agile techniques. Agile and mobile development practices tend to go hand-in-hand due to today's technology advances in the digital space. Flexibility and responding to market changes are key advantages of following the agile framework, enabling faster speed to market.
Risk-based and Attack-based Test Strategies need to
be considered for mobile testing, evolving through
the lifecycle and focusing on early testing to unearth
defects during development.
Exploratory testing provides feedback loops to teams, with testers determining if more or less attack-type testing is needed, constantly assessing risks, refining scope and going beyond basic user story and requirements validation.
Effectively leveraging Test Automation as part of a
mobile test plan provides the ability to continuously
test and run the same test case over multiple
platforms and devices. This saves testers valuable
time, increases quality and improves the team’s
ability to correctly repeat tests without errors in test
script execution.
Performance and security testing also form the main
components of a clearly defined mobile test strategy.
Tests should be executed on real devices to maximise confidence for product release, while also leveraging emulators or remote connections to devices and web browsers in the cloud.
Conclusion
Mobile development has fast become mainstream
and a top choice for organisations as mobile
penetration approaches 100% market coverage. In
fact, as mobile sites and apps are increasingly
considered as business critical, it stands to reason
why mobile testing has become such an important
part of the mobile evolution.
Mobile applications are a game changing force for
companies across all industries. But for all their
benefits they also hold significant challenges and
risks, making it crucial that organisations adopt and
apply the correct testing techniques and strategies.
Andy Parish is a Senior Test Consultant with
Planit Software Testing in Auckland. He can be
contacted at [email protected]
Page 24
As a tester have you, or your team, ever been blamed for not finding a defect in production? Have you ever heard a developer say something like: “it's the testers' job to find defects”? I have, more times than I can count.
I remember one time, when one of my team was testing an old UNIX
application; a thin client emulator accessing the server. Some changes had
been made to the work flow and this meant a new screen had to be developed.
The changes were promoted to test by the developer. The test analyst worked
through the flow and got to the new screen. When the screen loaded she
found that all the field names had overwritten themselves. This meant that
the developer hadn’t even launched the screen to look at it himself before
promoting it, let alone actually unit testing it. As I recall the supplier of the
UNIX system was delivering 5% of the code change for the overall project,
and delivered 21% of the defects. Almost all of them I would categorise as unit
testing defects.
More recently I was on another piece of work. A major change programme. I
was in a meeting with a bunch of people including testers and developers. A
question came up about the number of tests for a screen. The tester said they
were checking the validation of every field on the screen. I questioned this
with something like: “hang on, there are three date fields on this screen, and you are running the same tests on each of them?” “Yes” came the reply. “But have you talked to the developers? I'll bet that all three of those fields use the same underlying piece of code. Once you've tested it once, you don't need to test it everywhere else. You're just massively duplicating effort for no added value.”
I then looked at a developer who confirmed my hypothesis and then confirmed
that they unit tested it; including the boundaries. This cut hundreds of test
cases from the test execution. The plan was to test every date field (and every
other field) on every screen.
As a test analyst do you use Boundary Value Analysis as a part of your test
analysis? I’m guessing the answer is a resounding yes. This is the most
common technique I see when I go into different organisations. It’s in most
testers’ tool kits. But, I’d have to ask, why? As in the example above I’d expect
a boundary value test to be a unit test. The code of any field, in any well-
developed product, will be written once and called often.
I can answer the question as to why testers do boundary value testing. The simple answer is that we don't trust developers. When we do these tests we find defects, which means the developers haven't actually unit tested this kind of function (despite the fact that it would be vastly more efficient for them to do so, especially if they automate the unit tests).
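The argument above, that boundary checks belong in unit tests of the one shared validation routine rather than being repeated against every screen, can be sketched as follows. The validator and its limits are hypothetical illustrations, not code from the projects described.

```python
def validate_day(day):
    """Shared validation routine: written once, called by every date field."""
    return 1 <= day <= 31

# Boundary Value Analysis at unit level: test the values on and either side
# of each boundary, once, against the shared routine.
def test_day_boundaries():
    assert validate_day(1) is True      # lower boundary
    assert validate_day(0) is False     # just below lower boundary
    assert validate_day(31) is True     # upper boundary
    assert validate_day(32) is False    # just above upper boundary
    assert validate_day(2) is True      # just inside lower boundary
    assert validate_day(30) is True     # just inside upper boundary

test_day_boundaries()
```

With the boundaries covered once here, system-level testing only needs to confirm that each screen field is actually wired to the shared routine, instead of re-running the same boundary cases on every field of every screen.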
In another organisation I reviewed development and testing. Most of the
development teams I talked to didn’t unit test. “It’s test’s job.”
Test Is Dead—Long Live Test! By Matt Mansell, IntegrationQA
Matt Mansell is a Senior Test Consultant with IntegrationQA in
Wellington. He can be contacted at [email protected]
Page 25
As a test practice manager I once spent considerable
time commiserating with one of my test leads. He
had been in a meeting and received some confusing
answers from a developer. After the meeting he took
the developer aside and asked: “what unit test framework do you use?” The response he got was: “we don't use any, it's your guys' job to test.”
One of my favourites from Agile projects is the “test sprint”. I saw a project where the sole tester was not able to get through testing and the sprints would fail. The answer? The tester runs a sprint behind in their own test sprint. Talk about the antithesis of team commitment to a sprint. This is certainly not valuing people over process. It sees test as a process to be complied with, and doesn't see quality as something that everyone is responsible for.
Something is rotten in the state of software
development lifecycles.
So what is the problem?
Test.
Yes, you read that correctly. We are the root cause
of the problem. In particular the idea that Test is
a separate function or team is the heart of the issue
here. For years, we testers (me included) have been
lobbying for test as its own profession. We have
argued to have our own teams established in organisations, projects and programmes. I spent six
years managing just such a team; full of awesome,
intelligent, professional people who do an incredible
job. In my view one of the best teams in the city
(because of their brilliance not mine). And yet I still
fielded calls from disgruntled stakeholders blaming
my team for not finding this or that critical defect
in production.
Creating Test as a separate structure has led to an industry-wide abdication of responsibility. Everyone has cast off their quality responsibility onto Test and then kicks Test when their work is low quality. Once, in a project steering meeting, a developer said: “it's Test's fault the defect is in production, they didn't find it.” I had to say: “Hang on, my team didn't write the defect into the code. The defect is there because one of your guys wrote it into the code.” The steering group still blamed Test for not finding the defect.
This conversation is ludicrous. The defect is the
responsibility of every single person on that project.
But there was a separate test team. So everyone else
on the project abdicated their quality responsibility.
This is so endemic that most University Computer Science programmes I can find don't have any courses on unit testing, let alone any other kind of testing. I
have talked to grads who’ve come out of University
with the barest understanding of how to actually do
real unit testing. I ask experienced developers if they
are doing control flow testing and they look at
me blankly.
I am coming to the conclusion that Test as we know
it needs to die. And the people who need to kill it are
testers. The people who fought for it need to re-
envision it. We need to fundamentally rethink why
we do what we do, how we do it, when we do it and,
most importantly, who else should actually be
doing it.
The world is changing around us, yet there have been
few real changes in the way testing occurs in my
lifetime as a tester(1). We are still basically working
the same way we did in the 90’s. But everything else
is changing around us. Agile, when done with
discipline, challenges the core of current testing
models. The advent of cheap tools to automate all
kinds of testing is fundamentally changing the test
landscape. New technologies like Service
Virtualisation offer the ability to fundamentally
rethink what we do and when we do it. I used to
scoff at James Whittaker's “test is dead” ideas. Now I find them harder to dismiss.
How much of our current advocacy of separate
testing is now just about protecting our jobs?
I scoffed at these kinds of questions when my job was
to run a test practice. Then I got a secondment in the
same organisation out of testing and got to see it
from the outside. My perspective has started to shift.
There is more to say on this, which I’ll be presenting
at the NZ Tester Conference this year. But the questions I'd ask you to consider now are: “what is the value of what you are doing now?”, “to whom is it valuable?” and “would it be more valuable if someone else did it?” If you can't answer those questions then what are you doing?
To be clear; I’m not saying testing is dead, but I do
think the current models of how Test is organised
are outmoded and need to change.
(1)With the exception of some big online service
providers like Google, Amazon and Facebook!
Page 26
TestCases = TestStudies
Your company excels at testing! You've been helping New Zealand companies perform and manage testing for years now and you know you're good at it. Your revenues prove it.
Problem is there are others good at it as well. Companies that are just as big and
just as successful as yours. So how do you convince your next customer that yours
is the offering to go for? Well, maybe we can help.
Perhaps you've seen our Testing@ series here in NZTester Magazine. This is where we go into an NZ company and develop a short article on how it goes about testing. Or maybe you've laughed along with our Staff Writer as he has had to come to grips with the ups and downs of testing.
So why not consider an NZTester Magazine Case Study? We’ll come to you and
document your success, we’ll work with you to develop narrative, copy and design
an appropriate vehicle for your success story! After all, we are first and foremost
testers, so we know the landscape however we’re also journalists and writers, so
we know how to articulate information. Plus we have our own professional
graphic design and photography options to ensure professional presentation
and production. Interested?
Click below to find out more!
NZTesterMagazine
TestCaseStudies
Get it into print and out there!
Page 27
Coffee with Viswa – Brewing Ideas!
One can have a great discussion over a good cup of hot coffee.
The Circle of Testing - everything around you is tested and verified.
Often we hear developers asking “why do we need
to test this minor code change?” Even a minor
change should undergo proper testing and here’s
why:
Remember the line “every action you do is Kung-
Fu” - Pat Morita in “The Karate Kid”?
Similarly everything you verify or check in your
day-to-day life is called Testing. Testing is a part of
our life. An ATM was installed after testing
hundreds of times but when you withdraw cash,
don't we still go ahead and check our cash? When the shopkeeper tenders exact change, we still verify the change. When we know our shoe size, we don't just pick a pair of shoes and buy them. We still wear them and check them before buying.
Likewise, every tool we use needs to be tested. In
fact on a philosophical note, Man is also always
tested by God. So next time a developer asks “Do you need to test this minor code change?”, your answer should be: “If I don't test it, your customer will test it anyway. So let's get it tested and approved before it is released to the customer”.
One can have a great discussion over a good cup of hot coffee!
Viswa Devarajan is a Senior Test Analyst with QualIT in Auckland.
Page 30
85% of all software tests are still being performed manually. Is there a way to significantly improve manual testing and be able to work 2x faster? This free webinar is designed to change the way you perform manual testing. "Manual Testing 2x Faster in Word and Excel" focuses on new ways to reduce your manual test effort. Join this webinar to learn how to:
Build, execute and analyse software tests directly in Word and Excel
Make UAT, Exploratory, SAP and CRM manual testing much easier
Test websites and applications with greater flexibility
Improve the efficiency of manual testing and test 2x faster
Integrate with popular test management systems
Ask questions and take back valuable tips and tricks to use in your everyday manual testing.
Space is limited so sign up today. For further details: http://www.autom8.co.nz/webinars/
See you soon.
- Aaron Athfield, Founder and Chief Manual Tester Guy nz.linkedin.com/pub/aaron-athfield/75/811/626 www.autom8.co.nz
Page 31
Subscribe to the magazine free at www.nztester.co.nz
Calling all
PROGRAMME TEST MANAGERS
There's now a LinkedIn group for all test professionals operating at Programme Test Management level (or at least aspiring to). Click on the title above.
Page 32
Click on title
NZTester Magazine Conference 4
NZTester Magazine TestAnalytics 14
NZTester Magazine TestCases 26
WorX / Autom8 30
And now it’s your turn…
If you would like to be involved with and/or
contribute to future NZTester issues, you’re
formally invited to submit your proposals to me at
[email protected]
Articles should be a minimum of ½ A4 page at
Cambria 11pt font and a maximum of 2 A4 pages
for the real enthusiasts. If you wish to use names
of people and/or organisations outside of your
own, you will need to ensure that you have
permission to do so.
Articles may be product reviews, success stories,
testing how-to’s, conference papers or merely
some thought-provoking ideas that you might
wish to put out there. You don’t have to be a great
writer as we have our own staff writer who is
always available to assist.
Please remember to provide your email address
which will be published with your article along
with any photos you might like to include (a
headshot photo of yourself should be provided
with each article selected for publishing).
As NZTester is a free magazine, there will be no
financial compensation for any submission and the
editor reserves the sole right to select what is
published and what is not.
Please also be aware that your article will be proofread and amendments possibly made for readability. And while we all believe in free speech
I’m sure, it goes without saying that any
defamatory or inflammatory comments directed
towards an organisation or individual are not
acceptable and will either be deleted from the
article or the whole submission rejected for
publication.
Feedback
NZTester is open to suggestions of any type, indeed
feedback is encouraged. If you feel so inclined to
tell us how much you enjoyed (or otherwise) this
issue, we will publish both praise and criticism, as
long as the latter is constructive. Email me on
[email protected] and please advise in your email
if you specifically do not want your comments
published in the next issue otherwise we will
assume that you’re OK with this.
Sign Up to Receive NZTester
Finally, if you would like to receive your own copy of NZTester completely free, even though we're still real low tech right now, there are two easy ways: 1) go to www.nztester.co.nz, or 2) simply click here -
Ed.