Earthquakes in the project portfolio
About three months before the due date, Dr Graham Oakes noticed that the projects in the portfolio he was monitoring would suddenly experience a large slip followed by a series of smaller slips, a pattern that looked a lot like an earthquake.
Computer games are a serious business. A game for a current generation console can cost £10 million or more to build. Even in the mid 1990s, when my experience with formal project reviews began, developing a game could cost £2 million. The company I worked for had about 60 such games under development at any one time.
My company, like most in the industry, had a problem: projects slipped. They often slipped by months or even years. This didn't help our reputation with retailers, reviewers and customers. It also made it impossible to predict cashflow. My team was set up to bring predictability to our project delivery.
Each member of the team was responsible for providing an independent view of status on ten projects in the portfolio. Each week we tracked our projects' deliverables against their milestone schedules. We monitored key risks. We read status reports. Above all, we talked with project managers, discussing the issues they were dealing with and listening to their concerns. Sometimes we offered suggestions or put them in touch with other people who were dealing with similar issues, but often we just functioned as a sounding board to help them think through what was going on.
We also produced a weekly report to executive management. This consisted of a simple ordered listing of the projects under development, ranked by our assessment of their level of risk. We also wrote one or two sentences on each project, summarising our reasons for its ranking. This report was copied to most managers in the company, giving everyone plenty of opportunity to tell us where we'd got it wrong.
(Interestingly, project managers generally reckoned their project was riskier than we'd assessed it. Business unit managers generally thought projects were less risky than we'd assessed them. Either way, people started to actively track the positioning of their projects, and to tell us how our ranking could be improved. By publishing our report openly, we created a very useful channel for such information.)
After we'd been monitoring our projects for a while, we began to see a pattern. Projects would run quietly for nine or 12 months. Then, about three months before the date they were due to deliver into testing, they'd start to slip. Often they'd have a big slip initially, followed by a series of progressively smaller slips as they got closer to the end date. (See figure 1.)
This pattern was remarkably consistent. Because we were working with a portfolio of 60 similar projects, we could draw graphs and see trends. We found a strong correlation between the magnitude of each slip and the length of time before a project was scheduled to deliver into testing.
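As an illustration of the kind of analysis this involved, the sketch below computes a Pearson correlation between slip size and the time remaining before delivery into testing. The slip figures are invented for the example; the original portfolio data is not reproduced here.

```python
# Illustrative sketch: correlate the size of each slip with how far the
# project was from its scheduled delivery into testing when the slip hit.
# All numbers below are invented for illustration.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# (months before scheduled delivery, slip in weeks) -- hypothetical events
slips = [(3.0, 8), (2.5, 6), (2.0, 5), (1.5, 3), (1.0, 2), (0.5, 1)]
months = [m for m, _ in slips]
weeks = [w for _, w in slips]

print(round(pearson_r(months, weeks), 2))  # → 0.99
```

With 60 broadly similar projects generating slip events, even this simple measure is enough to make the earthquake-like trend visible.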
Figure 1: Projects would remain stable for much of their development phase, then suddenly experience a large slip followed by a series of smaller delays.
To me, with my original training in geophysics, this pattern looks a lot like an earthquake. Stress gradually builds up as tectonic plates move. Finally the rocks break, give off a loud bang, and settle into a less strained position. Then a series of aftershocks releases any residual stress. So it was with our projects. For a long time people could ignore the small setbacks and minor slippages, perhaps hoping to make up time later. Finally, as the end date loomed, they could no longer avoid reality. So they'd take a big slip to release all the built-up delay. Then the stress would build up again, and they'd take a series of smaller slips to release it.
We monitored this pattern as we continued our reviews. After a couple of years, a familiar pattern emerged. Projects were still slipping. The general pattern of slippage was pretty much the same. But the slips were happening about three months earlier in the project lifecycle. There were several reasons for this: people were monitoring status more closely; project managers could use the assurance team to back their judgement as to when a slip was needed, so had confidence to make the call earlier; and we'd got better at defining clear milestones. Overall, however, we were simply having a more informed dialogue about our projects. This helped us to identify and relieve the stresses earlier.
Not all of life is like games development. Industries have different drivers. Companies have different strategies and approaches. People differ in all sorts of ways. Games are very different to banking engines or ERP (Enterprise Resource Planning) systems. However, there are many similarities too. Some of the lessons we learned on that games portfolio can be generalised. For example:
- People need a sounding board. Projects are complex, with many people involved and lots of moving parts. Project managers are rarely given the time to sit down and reflect on what's happening. Simply by creating space for reflection, we helped them identify and solve many of the problems on their projects.
- Openly published information creates a conduit for dialogue. This dialogue helps improve the accuracy of the original information, and provides an opportunity to gather more information. Project, line and executive managers all need such accurate and complete information if they are to make effective decisions.
- Independent reviews can help validate the information provided by teams and project managers. They can also help make the above reporting and reflection processes more robust.
Projects can only succeed when they deal with reality. Reflection, dialogue and independent review are key tools for helping our projects keep in touch with reality.
- Dr Graham Oakes is principal of independent consultancy Graham Oakes Ltd. His book Project Reviews, Assurance and Governance was published by Gower in October 2008. He can be contacted at graham@grahamoakes.co.uk