Planit Testing Index 2013 Review

26 Feb 2014

As a precursor to reading this article, I suggest a visit to the Planit website to download this year’s Planit Testing Index Executive Summary – Ed.

Planit Testing Index 2013

I didn’t make it to the Planit Testing Index roadshow this time around. Not that I didn’t want to; events conspired against me and I never seemed to be in one place long enough to get myself along. However, as always, Planit has made the documentation available to all, so I have relished the opportunity of working my way through it and determining how things (may) have changed from the 2012 survey.

I’ve noticed that in most specialist industry surveys conducted on an annual basis, the percentages tend to change no more than plus or minus five percentage points from year to year. In other words, things stay pretty much the same: slightly up one year, slightly down the next, and so on. So the first statistical difference that caught my attention was that the share of responses from New Zealand was up from 9% in 2012 to 21%. In fact, there were more respondents from New Zealand than from New South Wales, so it looks like the Kiwis have taken the encouragement made in NZTester2 to heart and given this year’s survey a fair crack of the whip.

Another number that caught my eye, although only a 3% increase on last year, was the number of respondents rating Testing as a “Critical Element in Producing Reliable Software”, up from 48% in 2012 to 51%, i.e. just over half. I shouldn’t cheer too much though, as I would have thought that this one was a no-brainer, warranting a much higher assessment than just over half. Does this mean that 49% of respondents do not see Testing as a “Critical Element in Producing Reliable Software”? The mind boggles… as mine is so apt to do.

Last year’s eyebrow-raiser of Desired (62%) versus Actual (18%) ratings for Testing starting during the Requirements phase was similar this year: 60% and 15% respectively (I now have both eyebrows raised). However, reading on, we see that the main reason for project failure is still “Poor or Changing Business Requirements” at a whopping 70% (up from 68% last year, which in turn was up 9% on the year before), so maybe not such a surprise after all.

So, on to the whole Requirements matter: has anything improved on last year? If you’ve read NZTester2, you will have noted that 28% of respondents reported feeling positive about their company’s Requirements Definitions, with another 44% feeling OK (leaving 28% not OK at all). A huge 97% conceded that their company could benefit from improving Requirements Definition, and 67% believed it was suboptimal at the time. Unfortunately, 2013 has seen no improvement; indeed, things have worsened: just 23% positive, 39% OK and 38% in that other category – a 10% increase on 2012! In addition, 99% of respondents now feel that their company could benefit from improving Requirements Definition, and 71% believe it suboptimal. So there you have it… disappointing to say the least. Testers: we must yell louder!

The Project Methodologies category yielded an interesting statistic: last year, respondents reported a breakdown of Waterfall 36%, Agile 29% and V-Model 24% (ignoring the Other or No Methodology responses). We queried at the time whether respondents fully understood the differences between Waterfall and V-Model, and wondered whether the categories had been used interchangeably. This year the breakdown is Waterfall 33%, Agile 33% and V-Model 22%, showing a small but distinct increase for Agile (4%). It should be remembered that the “Agile” umbrella includes a number of different methods, e.g. XP, Scrum, RAD, BDD/TDD/FDD/TLADD etc., although I think most folk tend to think of development Sprints as synonymous with the term “Agile”.

Project Outcomes showed a significant increase in the “On Time/On Budget” category, up to 52% from 39% last year – good news! The breakdown of this category by project method also makes interesting reading: Agile 52%, Waterfall 49% and V-Model 55% (same caveat on the last two as above). It is encouraging to see the increase here, as it reverses the downward trend of the last three surveys.

The Testing Investment section still shows expected increases in spending on Structured Test Processes, Testing Tools and Testing Training as the three main areas for 2014, although it’s interesting to note that plans to Engage Contract Testing Professionals rose from a 19% increase last year to 29%, with only 25% expecting a decrease – good news for the contract market!

Utilisation of Performance Testing stayed very much the same as for 2012, which is a little surprising given the increased awareness of Performance Testing and Engineering. The specialist Performance Testing companies that I have contact with report that they are rushed off their feet, especially in the telecommunications and banking sectors, where emphasis on designing and developing IT systems for performance, as opposed to purely for functionality, appears to be growing.

On the Software Testing Tools front, HP continues to rule across all three categories: Test Management, Test Automation and Performance Testing. Interestingly enough though, in each category the next most popular tool is not from another generalist vendor but a specialist tool, i.e. Atlassian Jira, Selenium and Apache JMeter respectively. Other traditional vendors, e.g. Rational (IBM), Microsoft, Tricentis and SmartBear, notch single-figure usage only, with a few, e.g. Micro Focus (Silk), Telerik and FitNesse, not rating a mention. I also wonder whether Jira users might be including NZ’s own EnterpriseTester in their figures.

Finally, I always find the Project Conditions section an amusing one. Last year I homed in on Project Estimation, as I’ve always found that when estimates prove to be too light, it’s testing that cops it at the southern end of the project lifecycle. In other words, the architects, business analysts and developers have spent all the money, so sorry, testing has to be done on and with a postage stamp! In 2012, those who rated Estimation for Budget and Timeline as Poor or Very Poor were 27% and 31% respectively. This year it’s up to 33% and 39%, so oops, no improvement there! If we add in Realistic Expectations and our old favourite, Requirements Definition (neither of which I mentioned last year), at 34% each for 2012 and 39% and 38% respectively for 2013, this makes these four categories, which, out of the 10 assessed, are the most applicable to testing (in my humble opinion), the lowest rated of all! Same as last year, gulp! Please excuse me being so negative and cynical; maybe it’s the tester in me!

In summary, while some of the categories this year are worthy of further optimism, e.g. project success rates and New Zealand participation in the survey, those that apply specifically to testing do seem to be creeping westward (and no, not to Western Australia). Will we ever see a day when we’re all satisfied with requirements, estimations, expectations, etc.? No, probably not, and it’s possibly quite naïve to think that we will. However, that said, it does mean that we have to continue to i) find better, faster and more innovative ways to test, and ii) keep the flag aloft around these areas – and then, just maybe, we might start to see a swing east again. Until next year…
