
Are Analysts Complicit in Cost and Schedule Growth?

In a recent post, I called into question the credibility of the current practice of cost risk analysis. I showed that for 10 projects for which a cost risk analysis had been conducted, the actual cost exceeded the 90th percentile of the risk analysis for 8 of the 10. If the analyses were realistic, the actual cost should fall at or below the 90th percentile 90% of the time, so what we see in practice is the opposite of what we should expect. I have seen the same phenomenon occur in schedule risk analysis, but there is even less of a track record than for cost. One colleague asked a great question about this data: how do you compare the actual versus predicted cost when what is produced is different from what was planned?
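To see how damning the 8-of-10 result is, consider a quick back-of-the-envelope check. If the risk analyses were well calibrated, each project would exceed its 90th-percentile estimate with probability 0.10, so the chance that 8 or more of 10 independent projects do so is a simple binomial tail sum:

```python
from math import comb

# If risk analyses were well calibrated, each project's actual cost would
# exceed the 90th-percentile estimate with probability 0.10, independently.
p_exceed = 0.10
n, k_min = 10, 8

# Probability that 8 or more of 10 projects exceed the 90th percentile:
p_tail = sum(comb(n, k) * p_exceed**k * (1 - p_exceed)**(n - k)
             for k in range(k_min, n + 1))

print(f"P(>= 8 of 10 exceed the 90th percentile) = {p_tail:.2e}")
# prints roughly 3.74e-07
```

At well under one chance in a million, the observed outcome is essentially impossible under the hypothesis that the risk analyses were calibrated, which is the point of the post.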

There are two sides to this problem. One is that, to try to mitigate cost growth and schedule delays, a project cuts scope. This is difficult to quantify, but it means the long history of cost growth and schedule delays is even worse than it appears, since performance is not being held constant. The flip side is that the system becomes more complex over time. One of the primary reasons for this is optimism. A project has a stated goal at the outset, such as building a dam to generate hydroelectric power or sending a robotic lander to Mars, and there is a universal tendency to underestimate how difficult it will be to achieve this objective.

This tendency over time, along with all the other sources of risk, can be measured by examining historical cost growth and schedule delay data. This has formed a reliable pattern over time across a variety of industries. See the table below on cost growth and schedule delay data for development projects, from my forthcoming book.

The problem has been going on for a long time, and is not getting any better. Norm Augustine, who wrote about cost growth and schedule delays for defense and aerospace projects, found the average cost growth for a development project was 50% and the average schedule delay was 33%. If you look at the NASA/DoD column, you can see little has changed in the last 40 years. Oxford professor Bent Flyvbjerg has found a similar long-term pattern of cost growth and schedule delays for infrastructure projects, and John Hollmann has observed this to be the case for construction projects.

An analyst who conducts risk analysis for a project can often claim innocence by stating that they provided a risk analysis consistent with the project's assumptions. If the project manager is optimistic, the analyst can claim the resulting optimism is not their fault. However, since our job is to help project managers make better decisions, it is also incumbent on us to measure our track record over time (which we do not systematically do), widen our risk ranges to be more realistic, and use this information to defend our position with project managers. Even when project managers claim this time is different, we have to inform them that, no, this time is likely not different. Otherwise, we play the role of Charlie Brown in his recurring story with Lucy van Pelt, as illustrated by the 1957 Peanuts comic strip below.

For many projects, this growth has a pattern. It follows a skewed distribution called a lognormal. This distribution can model the heavy right tail of cost growth and schedule delays we consistently see over time in defense and aerospace projects, and the pattern has also been found to hold for construction and infrastructure projects. In other cases the tail is even heavier than a lognormal can reliably model; there, the right tail can be modeled with a Pareto distribution. By using this data, we can calibrate our risk analyses to historical cost and schedule growth data to provide realistic risk ranges. To do otherwise is to continue to look foolish when we are blamed for underestimating cost and/or schedule for yet another project.
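As a minimal sketch of what such a calibration might look like: suppose we model the cost-growth factor as lognormal, anchor its mean to the 50% average growth Augustine found, and assume a log-standard-deviation of 0.6 (an illustrative number, not a value from the book's table). The percentiles then follow directly from the lognormal's parameters:

```python
from math import exp, log
from statistics import NormalDist

# Illustrative calibration with assumed numbers, not the book's table:
# model the cost-growth factor as lognormal with mean 1.5 (i.e., 50%
# average growth) and an assumed log-standard-deviation sigma = 0.6.
mean_growth = 1.5
sigma = 0.6

# Mean of a lognormal is exp(mu + sigma^2 / 2); solve for mu.
mu = log(mean_growth) - sigma**2 / 2

z90 = NormalDist().inv_cdf(0.90)   # standard normal 90th-percentile, ~1.2816
p50 = exp(mu)                      # median growth factor
p90 = exp(mu + z90 * sigma)        # 90th-percentile growth factor

baseline = 100.0                   # hypothetical baseline estimate, e.g. $100M
print(f"median cost: {baseline * p50:.0f}, 90th percentile: {baseline * p90:.0f}")
```

Under these assumptions the 90th percentile sits at roughly 2.7 times the baseline, far above the kind of narrow risk range that lets 8 of 10 projects blow past it. The point of the sketch is that historically anchored parameters, whatever their exact values, produce much wider ranges than common practice.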

The table on cost growth and schedule delay information is from Chapter 1 of my forthcoming book. You can read this chapter for free at https://bit.ly/3ggPZK2. My book will be in bookstores November 3rd, but you can now preorder the hardcover or Kindle version online from Amazon, or the hardcover or Nook version from Barnes and Noble.

4 thoughts on “Are Analysts Complicit in Cost and Schedule Growth?”

  1. I think it is difficult for many analysts to go against the information they are given by the project or study leads. These analysts often believe they are not qualified or it is not their job to question questionable data or assumptions. That is one reason why I try to get analysts to compare their estimates (including adjustments for risk and uncertainty) to historical experience. I hope that by giving them something real they can point to it will stiffen their spine and encourage them to push back against unrealistic assumptions. Of course, this only works when there are appropriate historical analogs.

    1. Good points. One thing we need to do a better job of is understanding our poor track record on risk analysis. That could also arm an analyst with information to defend a wider range for uncertainty.

  2. An OSD analyst (I do not recall his name) wrote a book on this topic in which he bucketed cost growth into two categories: decisions and mistakes.

    He then went through every major DoD acquisition program in the last 20 years and did his best to identify increases in quantities, changing govt reqts, and the like. Then, he computed an average growth. The rest of the cost growth he attributed to mistakes (optimism, cost estimating error, contractor redesign effort, etc).

    One of his insights was that the larger the program (in dollar value) the larger the cost growth due to mistakes (as a percentage).

    To me, this was the best treatment of a complex problem, but it requires an in depth knowledge of many acquisition programs.
