As testers, we frequently find ourselves locked into a lifecycle model that is not the one best suited to the project, the people, the schedule and the budget.
When faced with this, we can try to convince the decision makers that it is the wrong choice, but often the decision has already been made. Our best bet is to figure out how to make it work.
All lifecycle models have their strong and weak points. One approach that is helpful when adapting to a prescribed lifecycle model is to understand which model our project “should” use versus which model it is using.
Once we understand the delta, we can look at the strengths and weaknesses of the two models and work to apply the strengths to help remove the weaknesses. Blended models are much more common than pure models, so defining the proper blended model (using the most suitable features) can provide us with the best solution – even when the prescribed lifecycle is not ideal.
The first step in the evaluation is to identify key criteria of the project and project team that will help determine the best model. This exercise will also help highlight the strengths and weaknesses of the project and team, and will provide visibility into the risks that will be inherent in the chosen model.
The following evaluation criteria have been derived from a large set of projects using a variety of lifecycles. Each criterion is evaluated on a scale of 1–5, where the values are defined per criterion.
Examples of the characteristics that would be exhibited at each scale value are provided for the values 1, 3 and 5. These examples will help you align your choice with the proper value.
Each evaluation criterion has two values. The X value is used to categorise the project, while the Y value is used to categorise the project team and its maturity. The two numbers are used to plot the value in a matrix that indicates the best-fit lifecycle model, as shown below.
The evaluation is based on the rating given for each of the following 15 project and team characteristics. Each of these must be evaluated based on the defined criteria. Once the evaluation is complete, the outcome rating can be plotted on the chart shown above to determine the “ideal” lifecycle model.
The criteria are:
- Knowledge (K)
- Working Together (W)
- Passion (P)
- Requirements (R)
- Project Team Engagement (E)
- Documentation (D)
- Project Type (T)
- Schedule (S)
- Budget (B)
- Tester Involvement (I)
- Communication (C)
- Location (L)
- Management (M)
- Quality (Q)
- Meeting overhead (effective working hours) (O)
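The plotting mechanics can be sketched in code. This is a minimal sketch, not part of the article's method: the criterion codes come from the list above, but the example ratings are hypothetical, and the quadrant rule simply averages the plotted points rather than reading them off the matrix.

```python
# Hypothetical sketch: capture the 15 criteria as (x, y) ratings on the
# 1-5 scale (x = project, y = team maturity) and average them to suggest
# a best-fit quadrant. Criterion codes follow the article; the ratings
# and the averaging rule are illustrative assumptions.

from statistics import mean

ratings = {
    "K": (2, 2),  # Knowledge
    "W": (1, 2),  # Working Together
    "P": (2, 1),  # Passion
    "R": (1, 1),  # Requirements
    "E": (2, 2),  # Project Team Engagement
    "D": (1, 2),  # Documentation
    "T": (1, 1),  # Project Type
    "S": (2, 2),  # Schedule
    "B": (1, 2),  # Budget
    "I": (2, 2),  # Tester Involvement
    "C": (2, 3),  # Communication
    "L": (1, 2),  # Location
    "M": (2, 2),  # Management
    "Q": (2, 2),  # Quality
    "O": (1, 2),  # Meeting overhead (effective working hours)
}

def suggest_model(ratings):
    """Average the plotted points and map them to a lifecycle quadrant."""
    x = mean(r[0] for r in ratings.values())
    y = mean(r[1] for r in ratings.values())
    if x < 3 and y < 3:
        model = "sequential (e.g. V-model)"
    elif x >= 3 and y >= 3:
        model = "agile/iterative"
    else:
        model = "blended (embedded iterative)"
    return round(x, 1), round(y, 1), model

print(suggest_model(ratings))
```

With ratings concentrated in the 1–2 range on both axes, as in the enterprise-upgrade example discussed below, the sketch lands in the sequential quadrant.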
The tables below explain the range for each criterion, at a general level, for both the X and Y axis ratings. Specific examples are provided for a rating of 1, 3 and 5 for each of the criteria. Recommendations for actions that can help a project move from a rating of 1,1 (pure sequential) to a 5,5 (pure agile) are provided for each axis.
This move is based on the assumption that Agile is a very popular lifecycle model and the one that will likely be pushed. If a project has a sequential lifecycle model and is ideally suited for sequential, you’re all done! But, if you need to think about how to move your V-model suited project into a modified Agile lifecycle, you’ll need some pointers on how to get there.
With these guidelines, the next step is to evaluate a project, plotting the ratings on the matrix. For example, the following ratings might be applicable to an enterprise upgrade project:
There is a concentration of the ratings toward the left, or sequential, side of the diagram.
Given this information, the project should follow a sequential lifecycle and would probably benefit from an embedded iterative model, where the iterative development and testing lifecycle is followed within the stricter documentation requirements of a V-model.
So, what do we do when the decision is made that the project will follow an Agile lifecycle? Other than sighing heavily, we need to understand how to move each of the ratings from the current 1-2 range, into the 3-5 range.
It’s not feasible to try to change everything that doesn’t fit into the right quadrant. What we can do is make a list of the items to watch versus those that need immediate action.
In the diagram below, the solid circle indicates the items that should be watched but may not require significant change. The dotted circles indicate the items that are likely to require the most change and the most time to change.
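That watch-versus-act split might be sketched as follows. The `triage` function, its thresholds and the sample ratings are all illustrative assumptions, not taken from the article:

```python
# Hypothetical triage of the criteria: ratings far below the target agile
# quadrant need immediate action; those close to it are only watched.
# The threshold of 2 points and the target of 4 are illustrative.

def triage(ratings, target=4, act_gap=2):
    """Split criteria into 'act' (far from target on either axis) and 'watch'."""
    act, watch = [], []
    for code, (x, y) in ratings.items():
        gap = max(target - x, target - y)
        (act if gap >= act_gap else watch).append(code)
    return act, watch

# Sample (x, y) ratings for four of the criteria.
ratings = {"K": (2, 2), "B": (1, 2), "C": (3, 3), "Q": (4, 4)}
act, watch = triage(ratings)
print("immediate action:", act)
print("watch:", watch)
```

Here Knowledge and Budget fall two or more points short of the target on at least one axis, so they land in the action list, while Communication and Quality are merely watched.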
In particular, we need to worry about the ratings that fell into the 1/1 to 3/3 range, because these will require the most profound change. If we look at one of these, Knowledge, our earlier charts tell us that, to move from a 2/2 to a 4/4 (a 5/5 is probably not realistic), we should consider the following recommendations:
For the X axis: 1’s may not respect 5’s and 5’s may not value the knowledge of the 1’s. A good plan would be to form a specialist consulting team who can still keep their identity as specialists, but can also contribute to the team.
For the Y axis: 3 will be the common ground. 1’s will be uncomfortable with the informality of the 5’s environment. Plan the move to 3 and then slowly to 5 as sharing and teaching increase sociability.
In the sample project, an embedded iterative lifecycle would have been the best choice. It would have required the fewest changes and would have suited the project. We'd still need to worry about budget, schedule, passion and knowledge, but these are feasible areas for change.
Embedded iterative, a blend of the sequential and iterative models, lets us retain a higher level of control over the budget and schedule. With a bit of latitude, the change is probably feasible.
Passion and knowledge will take longer and will have to be built slowly and by example. Still possible – but it will take some time.
This highlights that lifecycles should not be declared arbitrarily. There are many considerations. The right choice will help lead to success, while the wrong one may lead to failure.
This model doesn’t solve all the problems, but it gives a way to approach change. It supplies a plan we can apply as we move into a project. It also tells us where the pitfalls may be, and that will help us adapt to the project and understand other team members better.
It’s still not perfect. Ideally our example project team would look at this evaluation and say, “Hey! We should definitely switch to an embedded iterative lifecycle.” But, what are the chances of that happening?
Better that we understand the changes that will be needed and help the team adapt to be successful with the chosen model. Success is the goal. Getting there is the challenge.