
Planit Testing Index 2012 Review

 21 Jan 2013 

As a precursor to reading this article, I suggest a visit to Planit's Index Reports to download this year’s Planit Testing Index Executive Summary – Ed.

7 December 2012 saw the Planit Testing Index roadshow roll into Auckland, the last stop on a 7-date tour around New Zealand and Australia. Sounds more like a rock band itinerary, I know, but every year Planit surveys the testing landscape in this part of the world and presents its findings in each of the major centres – which is great for number nuts like me, who derive an inordinate amount of satisfaction when our own predictions and expectations turn out to be in sync with Planit’s findings.

As in previous years, the Index was presented by Planit Managing Director Chris Carter and generated much questioning and discussion in Auckland, as I expect it did in the other centres. However, this year there were a few results that raised even my eyebrows.

First off, I thought New Zealand could have done a little better than provide a mere 9% of respondents. With our software industry having been successful on the world stage in recent years, it would have been nice to see a greater slice of the input coming from NZ – therein lies the challenge for next year, folks!

I was pleased to see that 74% of respondents believe their organisation views testing as either “a critical element in producing reliable software” or “strategically important for organisation success”, leaving only 26% still leaning to the negative. While that 26% may still seem too high to some of us, it is important to note that this represents a steadily improving position on previous years’ findings.

Also pleasing was the increased budgetary allowance for testing, up to 22% of total project budget (from 19% in the last survey) – compared with 18% for requirements, 17% for design, 33% for development and 10% for implementation.
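(A quick sanity check of my own, not a figure from the Index: 22 + 18 + 17 + 33 + 10 = 100, so these five numbers form a complete phase-by-phase split of the project budget, meaning the 22% is testing’s share of the total rather than an uplift applied within each phase.)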

The first eyebrow-raiser was the timing of testing commencing on projects. 62% of respondents reported a continuing desire to kick testing off during the requirements phase, yet only 18% actually did. Compare this with development: only 16% wanted to start testing there, yet 65% actually did. I believe this shows that the message of earlier testing is certainly being broadcast by the testing community but, for whatever reason(s), is not being acted upon (I would be very interested to know why). Amusingly, 5% actually started testing during implementation and 1% performed no testing at all – yeeow!

Another interesting result was project outcome. On-time/on-budget projects represented just 39% of the total, the lowest in the six years the Planit Index has been running and down from 49% two years ago. 29% were reported as over time and/or budget, 22% as delivered with significant changes, 6% postponed and 4% cancelled. The last Standish numbers I saw, from a few years back, showed 35% successful (I presume this meant on time/on budget), 46% challenged and 19% failed – interesting.

Of the causes of project failure, a whopping 68% cited “poor or changing business requirements and priorities” as the most common – still! The remainder are really not worth a mention, they’re so small by comparison. This tells me that, even in this day and age of new and improved methodologies, IT projects fail mainly for the same reasons they always have. And this is borne out further in the next finding…

Only 28% of respondents reported feeling positive about their organisation’s requirements definition; 44% rated it OK and another 28% believed it to be poor. 97% conceded that their company could benefit from improving requirements definition, and 67% believed it was currently suboptimal.

Another eyebrow-lifter for me was the percentage of respondents who rated their project estimation methods for budget and timings as either Excellent or Good – only 24% and 23% respectively, with just 22% expressing the same view of whether their projects worked to realistic expectations. That’s over three-quarters rating OK, Poor or Very Poor across all three categories. As an estimation junkie, I found this quite disheartening, but at the same time encouraging, as I’m currently putting the finishing touches on another test estimation tool!

The project methodology usage feedback came in at 36% Waterfall, 29% Agile and 24% V-Model (11% reported Other or no methodology at all – again, yeeow!). Now this is where things became a bit murky for me. I’ve always seen Waterfall and V-Model as essentially the same; however, as I understand it, true Waterfall does not promote starting testing of any sort until a build is available, whereas the V-Model at least promotes documenting high-level test conditions, if nothing else, any time after there are requirements to work with. So a 36% usage of true Waterfall seemed somewhat high to me, and I suggest a couple of reasons for this: 1) some respondents have not understood Waterfall versus V-Model, and 2) as outlined above, if 67% of respondents consider their requirements suboptimal, perhaps testing simply cannot commence any sooner!

While not in the Summary, the slide below (Outcome by Method) makes interesting reading…

[Slide: Outcome by Method – separate breakdowns for Waterfall, V-Model and Agile, each split into: on time, budget and scope; significant changes; over time/budget; postponed; cancelled.]

My initial reaction was a certain amount of disbelief; however, when you consider that agile methods are designed to accommodate change, maybe it’s not so out of kilter.

The last result I’d like to comment on is the Project Investment in Testing for 2013. 48% of respondents reported that their organisation expects to increase spending on structured testing processes and methodologies, 38% expected increased investment in testing tools, and the same again in testing training. This tells me there is still impetus out there to perform testing more efficiently and economically, although I do wonder how many IT/business managers intend to increase investment in requirements definition. Dare I suggest that such an initiative could in turn make the return on testing investment a whole lot healthier!

In summary, as always I found the Planit Index compelling reading. There were the usual surprises along with the results we’ve come to expect, certainly around the number of organisations using Agile methods (a 22% jump on 2010’s survey). And I’m not in the least bit surprised that shortcomings with requirements remain the most common cause of project failure (did you know I once had a BA Manager tell me that the testers needed to figure the requirements out for themselves – yeeow!).

For the next Index, I would be interested to see something on what IT professionals see as the top three challenges to delivering on-time/on-budget projects, as opposed to the factors that cause projects to fail – in other words, what had to be overcome to achieve a successful delivery.

