The paper on the Cone of Uncertainty is used by many in the No Estimates community as an example of why estimates are of little use, since uncertainty is not reduced.
It's not reduced in the paper because no specific, intentional actions were taken to reduce the uncertainty - neither reducing the aleatory uncertainty with margin, nor the epistemic uncertainty with buy-down activities, nor re-evaluating the estimates as the project progresses. So no wonder the uncertainty wasn't reduced. No one took action to reduce it. Uncertainty doesn't reduce by itself.
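To make "margin for aleatory uncertainty" concrete, here is a minimal sketch in Python - the task durations, distributions, and the P80 commitment level are all hypothetical numbers, not data from the paper. Margin amounts to committing to a date beyond the most-likely sum of the durations, so the irreducible scatter is absorbed rather than ignored:

```python
import random

# Hypothetical task durations (days): most-likely values with irreducible
# (aleatory) variability modeled as a triangular distribution around each.
tasks = [(10, 12, 18), (5, 8, 14), (20, 25, 40)]  # (low, mode, high)

def simulate_schedule(n_trials=10_000):
    """Monte Carlo the total duration; aleatory scatter never averages away."""
    totals = []
    for _ in range(n_trials):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks))
    return sorted(totals)

totals = simulate_schedule()
most_likely = sum(mode for _, mode, _ in tasks)
p80 = totals[int(0.80 * len(totals))]  # 80th-percentile completion

# Margin protects the deliverable date against aleatory uncertainty:
# commit to the P80 date, not the most-likely date.
print(f"most likely: {most_likely} days, P80: {p80:.1f} days, "
      f"margin needed: {p80 - most_likely:.1f} days")
```

The aleatory scatter never goes away; the margin absorbs it so the committed date survives it.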
Let's deconstruct the essence of the paper. It presents data that does not follow the Cone of Uncertainty - the uncertainty of the estimates does not reduce as the project proceeds - followed by some analysis from the author.
Although I don’t have definitive evidence to explain the variation in estimation accuracy I observed, I’ve identified what I believe are the primary causes:
• optimistic assumptions about resource availability,
• unanticipated requirements changes brought on by new market information,
• underestimation of cross-product integration and dependency delays,
• a corporate culture using targets as estimates, and
• customer satisfaction prioritized over arbitrarily meeting a deadline.
Note that no specific actions are mentioned to address these possible causes - resulting in NO reduction of the uncertainty. This lack of reduction is then used to conjecture that the CoU is not valid, when in fact the project managers appear to have sat by and let the uncertainty go unreduced, taking no corrective action. The Cone of Uncertainty does not narrow by itself. This is observing the symptom, ignoring the cause, and then arguing that the CoU is wrong.
...
While the data supports some aspects of the cone of uncertainty, it doesn’t support the most common conclusion that uncertainty significantly decreases as the project progresses. Instead, I found that relative remaining uncertainty was essentially constant over the project’s life. Although one company’s data might not be enough to extrapolate to other environments, I believe that the data casts doubt on the reduction of relative uncertainty being a naturally occurring phenomenon.
...
Reducing the range of uncertainty must be possible, right? Traditional project management approaches, several of which are based on a strong belief in the cone of uncertainty, advocate stronger project control and greater planning. While controls and planning are useful, an overly strict focus can result in attempts to solve the wrong goal. Shipping on time, to specifications, and within budget might be meaningless if a competitor is shipping software that has a greater value to the market. In that case, the competitor will win nearly every time, and the prize for “good” project management might be losing market share.
(the) ... measure of success over these three years had much more to do with customer satisfaction and market share than with meeting knowingly aggressive targets.
Let's clarify something up front. Re-read from the paper: I believe that the data casts doubt on the reduction of relative uncertainty being a naturally occurring phenomenon. Reduction in the uncertainty is NOT a naturally occurring phenomenon. Reduction in the uncertainty only occurs from intentional, specific work processes applied to the project to reduce uncertainty. These processes are called Risk Management. Reducing risk NEVER occurs naturally. Direct intervention - risk buy-down work - is required to reduce epistemic uncertainties. And margin is required to reduce aleatory uncertainties. Let me say it again: uncertainties NEVER reduce themselves.
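A hypothetical sketch of the distinction, continuing in Python: epistemic exposure shrinks only when a planned buy-down activity retires a risk. The risk names, probabilities, and impacts below are invented for illustration:

```python
# Epistemic risks: reducible by buy-down work (prototypes, trade studies,
# incremental demos). Each entry: name -> (probability, schedule impact in days).
risks = {
    "unproven integration interface": (0.4, 30),
    "vendor component maturity":      (0.3, 20),
    "unvalidated throughput target":  (0.5, 15),
}

def exposure(risks):
    """Expected schedule exposure from the open epistemic risks."""
    return sum(p * impact for p, impact in risks.values())

print(f"exposure before buy-down: {exposure(risks):.1f} days")

# A buy-down activity (e.g., an early integration prototype) retires a risk.
# This is intentional work, planned and paid for - it does not happen on its own.
del risks["unproven integration interface"]

print(f"exposure after buy-down:  {exposure(risks):.1f} days")
```

Nothing in this model reduces the exposure until the buy-down work is actually performed - which is the point.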
From [7]: "Figure 1 shows the accuracy of software sizing and estimation by phases. The level of estimation uncertainties is high during the initial estimations due to lack of data and experience. As long as the projects are not re-assessed or the estimations not re-visited, the cones of uncertainty are not effectively reduced [12]." In the referenced paper, there was no mention of specific actions taken to reduce the uncertainty in the estimates. So no wonder they did not reduce - no one thought to manage the project as a Risk Management process, as Tim Lister tells us:
Risk Management is How Adults Manage Projects
The paper suggests, from its observations, that the Cone of Uncertainty (CoU) is not a valid model of how uncertainty behaves in software development projects. And for this project's data, that may well be true. The project did not seem to have, or follow, a plan for reducing uncertainty in the estimates. But reasons are provided (after the fact) for this failure to reduce the uncertainty, and then it's argued that the CoU is not valid - while at the same time acknowledging that the data may not be definitive and that there are direct causes for the uncertainty not being reduced inside a shrinking cone.
So the CoU was not met, and the author knows possibly why. But he then states the CoU is not valid - although the data shows possible reasons why the CoU was not met, and no action was taken to correct those causes. Seems like tossing the baby out with the bathwater. The paper confuses a symptom - uncertainty not reducing as the project proceeds - with the cause - not taking any corrective action to fix the reasons the uncertainty was not reducing - and then claims the CoU is not valid.
While there are some possible root causes for this observation, the paper fails to acknowledge that if those root causes had been corrected soon enough, the uncertainty may well have started to reduce as the project progressed, thus following the CoU.
But a more important idea is missing from the article. Reducing uncertainty may be possible, but shipping on time and on budget may be meaningless if a competitor ships greater value.
In all software development businesses, showing up late and over budget has a direct impact on the bottom line.
Managing in the presence of uncertainty requires continuous assessment of progress to plan, continuous adjustment of that Plan, and application of those adjustments to the execution of the work - all to increase the probability that the project will produce the needed value at the needed time (ahead of the competition) for the needed cost (to meet the business goals).
This is the role of management: determining corrective actions to stay on Plan. This assumes, of course, that the Plan is credible. If not, that's another issue. But Keeping the Program Green is a critical success factor for all project work.
If the competition has moved ahead of your work, make a new Plan, update the current Plan, take corrective actions, and execute the new Plan. Don't toss out the notion of planned reduction of risk and uncertainty - and all the other performance measures - just because the project didn't follow the plan.
If project managers do not behave in this manner - continuous adjustment in the presence of uncertainty - they deserve to lose to the competition. Another example of Doing Stupid Things on Purpose.
The Cone of Uncertainty is the Planned reduction in the uncertainty of critical project variables (cost, schedule, technical performance) needed to increase the probability of project success.
The Cone of Uncertainty is NOT an after-the-fact assessment of what was not properly managed.
The Cone of Uncertainty is a Steering Guide, providing feedback (in a closed-loop control system) to the project on what corrections to take to Keep the Program GREEN. If you see something going wrong - uncertainty not being reduced - and you sit by and don't do anything about it, reporting only that you observed nothing got better, you're not managing the project; you're just an observer of its failure.
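As a minimal sketch of that closed-loop use of the Cone - the planned bounds and measured estimate-to-actual ratios below are hypothetical, loosely patterned on Boehm's cone - each assessment period compares the measured variance against the planned bound and flags when corrective action is needed:

```python
# Planned cone: (fraction complete, allowed upper ratio of actual to estimate).
# The shape is loosely patterned on Boehm's cone; the numbers are illustrative.
planned_cone = [(0.0, 4.00), (0.25, 2.00), (0.50, 1.50), (0.75, 1.25), (1.0, 1.10)]

def allowed_variance(fraction_complete):
    """Interpolate the planned upper bound at this point in the project."""
    for (x0, y0), (x1, y1) in zip(planned_cone, planned_cone[1:]):
        if x0 <= fraction_complete <= x1:
            t = (fraction_complete - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return planned_cone[-1][1]

def steer(fraction_complete, measured_ratio):
    """Closed-loop check: outside the planned cone => corrective action."""
    bound = allowed_variance(fraction_complete)
    if measured_ratio > bound:
        return (f"RED at {fraction_complete:.0%}: ratio {measured_ratio:.2f} "
                f"exceeds planned bound {bound:.2f} - take corrective action")
    return f"GREEN at {fraction_complete:.0%}: within the planned cone"

# Hypothetical assessment points: (fraction complete, actual/estimate ratio).
# An uncertainty that stays flat - as in the paper's data - turns the
# project RED at each successive assessment, demanding intervention.
for point in [(0.25, 1.8), (0.50, 1.9), (0.75, 1.9)]:
    print(steer(*point))
```

The value is in the comparison made each period, not in the cone drawing itself: outside the planned bound means steer, not observe.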
Some Background on Managing in the Presence of Uncertainty
- Cone of Uncertainty - Part Cinq
- Managing in the Presence of Uncertainty
- "Shrinking the Cone of Uncertainty with Continuous Assessment for Software Team Dynamics in Design and Development," Pongtip Aroonvatanaporn, Ph.D. Thesis, University of Southern California, August 2012.
- "Domain-Based Effort Distribution Model for Software Cost Estimation,: Thomas Tan, Ph.D. Thesis, University of Southern California, August 2012.
- "Estimating Software Effort Hours for Major Defense Acquisition Programs," Corinne C. Wallshein, Ph.D. Thesis, George Mason University, 2010.
- "An Inaccurate Conception: Some Thoughts on the accuracy of estimates," Phillip G. Armour, Communications of the ACM, March 2008, Vol. 51, No. 3, pp. 13-16.
- "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the “Cone of Uncertainty”," Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm, ASE' 10
- "Six Rules of Effective Forecasting," Paul Saffo, Harvard Business Review, July-August 2007
- Software Measurement and Estimation: A Practical Approach, Linda M. Laird and M. Carol Brennan, John Wiley & Sons, 2006.
- "Estimating Software Intensive Projects in the Absence of Historical Data," Aldo Dagnino, ICSE 2013.
- "Coping with the Cone of Uncertainty: An Empirical Study of the SAIV Process Model," Da Yang, Barry Boehm, Ye Yang, Qing Wang, and Mingshu Li, CSP 2007, LNCS 4470, pp. 37–48, 2007
- Software Engineering Economics, Barry Boehm, Prentice-Hall, 1981.