By law, major defense acquisition programs are now required to prepare cost estimates earlier in the acquisition lifecycle, including pre-Milestone A, well before concrete technical information about the program is available. Estimates are therefore often based on a desired capability, or even an abstract concept, rather than on a concrete technical plan for achieving that capability, which makes the role and modeling of assumptions more challenging. This blog posting outlines a multi-year project on Quantifying Uncertainty in Early Lifecycle Cost Estimation (QUELCE) conducted by the SEI Software Engineering Measurement and Analysis (SEMA) team. QUELCE is a method for improving pre-Milestone A software cost estimates through research designed to improve judgment regarding uncertainty in key assumptions (which we term program change drivers), the relationships among those drivers, and their impact on cost.
According to a February 2011 presentation by Gary Bliss, director of Program Assessment and Root Cause Analysis, to the DoD Cost Analysis Symposium, unrealistic cost or schedule estimates are a frequent causal factor for programs breaching a performance criterion. Steve Miller, director of the Advanced Systems Cost Analysis Division of OSD Cost Analysis and Program Evaluation, noted during his DoDCAS 2012 presentation that "Measuring the range of possible cost outcomes for each option is essential ... Our sense is not that the cost estimates were poorly developed [but] rather key input assumptions didn't pan out." For instance, an estimate might assume conditions that do not hold as the program unfolds.
QUELCE addresses the challenge of getting the assumptions "right" by characterizing them as uncertain events rather than certain eventualities. As we've noted previously, modeling uncertainty on the input side of the cost model is a hallmark of the QUELCE method. By better representing uncertainty, and therefore risk, in the assumptions and explicitly modeling them, DoD decision makers, such as Milestone Decision Authorities (MDAs) and Service Acquisition Executives (SAEs), can make more informed choices about funding programs and portfolio management. QUELCE is designed to ensure that DoD acquisition programs are funded at levels consistent with the magnitude of risk to program success, that fewer and less severe cost overruns occur due to poor estimates, and that less rework is needed to reconcile program and OSD cost estimates.
QUELCE relies on Bayesian Belief Network (BBN) modeling to quantify uncertainties among program change drivers as inputs to cost models. QUELCE then uses Monte Carlo simulation to generate a distribution (as opposed to a single point) for the cost estimate. In addition, QUELCE includes a DoD domain-specific method for improving expert judgment regarding the nature of uncertainty in program change drivers, their interrelationships, and eventual impact on program cost drivers. QUELCE is distinguished from other approaches to cost estimation by its ability to model uncertainty explicitly on the input side of the estimate and propagate it through to a distribution of possible costs.
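The flavor of the Monte Carlo step can be illustrated with a minimal sketch. The driver names, probabilities, and cost multipliers below are invented for illustration, and the drivers are sampled independently for simplicity; the actual QUELCE method uses a BBN precisely so that conditional dependencies among drivers can be captured.

```python
import random
import statistics

random.seed(42)

# Hypothetical program change drivers with assumed occurrence
# probabilities and cost-multiplier impacts (illustrative values only,
# not drawn from the QUELCE repository).
drivers = {
    "requirements_change":   {"prob": 0.4, "impact": 1.25},
    "technology_immaturity": {"prob": 0.3, "impact": 1.40},
    "staffing_shortfall":    {"prob": 0.2, "impact": 1.15},
}

BASE_ESTIMATE = 100.0  # assumed baseline cost, e.g., in $M

def simulate_cost():
    """One Monte Carlo trial: each driver fires independently."""
    cost = BASE_ESTIMATE
    for d in drivers.values():
        if random.random() < d["prob"]:
            cost *= d["impact"]
    return cost

# Many trials yield a cost *distribution* rather than a point estimate.
trials = [simulate_cost() for _ in range(10_000)]
print(f"median cost:     {statistics.median(trials):.1f}")
print(f"90th percentile: {statistics.quantiles(trials, n=10)[8]:.1f}")
```

Reporting a percentile alongside the median is what lets a decision maker see how much cost risk a program carries, rather than a single number that hides it.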
The QUELCE method consists of a defined sequence of steps, several of which are described in the sections that follow.
Improving the Reliability of Expert Judgment
Early cost estimates rely heavily on subject matter expert (SME) judgment, and such judgment can be idiosyncratic; improving its reliability is another focus of our research. QUELCE draws upon the work of Dr. Douglas Hubbard, whose book How to Measure Anything describes a technique known as "calibrating your judgment" that we are adapting for our DoD cost estimation analysis.
For example, if you state you are 90 percent confident, you should be correct in your answers 90 percent of the time. If you state you are 80 percent confident, you would be correct 8 times out of 10. Performing in agreement with your statement of confidence is termed "being calibrated."
Hubbard's technique operates by giving participants a series of 10-question quizzes. For each question, participants provide an upper and lower bound such that they believe the true answer falls within that range 90 percent of the time. A calibrated participant should therefore get about 9 of the 10 answers right. Participants who answer all 10 correctly are being too conservative, providing too wide a range; those who get fewer than 9 correct are overconfident, providing too narrow a range. Through repeated testing and feedback, participants learn to calibrate their judgment so that their stated confidence matches their actual accuracy.
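Scoring such a quiz is simple to express in code. The sketch below uses invented interval/answer pairs (not real quiz content) to show how a hit rate is computed and compared against the stated 90 percent confidence level.

```python
# Hypothetical quiz results: each entry pairs a participant's stated
# 90 percent confidence interval (lower, upper) with the true answer.
# All numbers are invented for illustration.
responses = [
    ((1400, 1600), 1492),  # hit
    ((20, 40), 35),        # hit
    ((100, 150), 160),     # miss
    ((5, 15), 12),         # hit
    ((1900, 1950), 1969),  # miss
    ((0, 10), 8),          # hit
    ((50, 80), 75),        # hit
    ((300, 500), 450),     # hit
    ((2, 6), 4),           # hit
    ((10, 30), 25),        # hit
]

def calibration_score(responses):
    """Fraction of true answers falling inside the stated intervals."""
    hits = sum(lo <= truth <= hi for (lo, hi), truth in responses)
    return hits / len(responses)

rate = calibration_score(responses)
print(f"hit rate: {rate:.0%}")
if rate > 0.9:
    print("too conservative: ranges are wider than needed")
elif rate < 0.9:
    print("overconfident: ranges are too narrow")
else:
    print("well calibrated at the 90 percent level")
```

Here the participant hit 8 of 10 intervals, so the feedback flags overconfidence; repeated rounds of this feedback are what drive calibration.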
Applying that same approach to DoD cost estimation analysis means that when two calibrated judgments feed the same cost estimate, there is a more precise idea of what those judgments mean. Hubbard, who taught a class at the SEI, demonstrated that most people start off highly overconfident about their knowledge and judgment.
We plan to test Hubbard's approach of calibrating judgment with questions specific to software estimating at several universities, including Carnegie Mellon University and the University of Arizona. To develop the materials for these experiments, we are mining information from open-source repositories, such as Ohloh.net. Our objective is to increase the consistency and repeatability of expert judgment as it is used in software cost estimation.
A key challenge that our team faces in conducting our research is validating the QUELCE method. It can take years for a program to reach a milestone against which we can compare its actual costs to the estimate produced by QUELCE. We are addressing this challenge by validating pieces of the method through experiments, workshops, and retrospectives. We are currently conducting a retrospective on an active program that provided us access to its historical records. Key to this activity is the participation of team members from the SEI Acquisition Support Program (ASP), who are playing the role of program experts as we work our way through the retrospective.
Another challenge that our work on QUELCE addresses is that insufficient access to DoD information and data repositories could significantly jeopardize our ability to conduct the empirical analysis needed for the program change driver repository. To address this, we have been working with our sponsor and others in the Office of the Secretary of Defense to gain access to historical program data stored in a variety of repositories housed throughout the DoD. We plan to use these data to develop reference points and other information that QUELCE implementers can use as a decision aid when developing the BBN for their program. These data will also be included in the program change driver repository.
Developing a Repository
We are creating a program change driver repository that will be used as a support tool when applying the QUELCE method. The repository is envisioned as a source of program change drivers (events that occurred during the life of a program and directly or indirectly impacted its cost) along with their probabilities of occurrence. The repository will also include information used as part of the method for improving the reliability of expert judgment, such as reference points based on the history of Mandatory Procedures for Major Defense Acquisition Programs.
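To make the idea concrete, a repository record might hold the information described above in a form like the following. The field names and values here are assumptions for illustration, not the SEI's actual schema.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a program change driver repository record;
# field names are hypothetical, not the actual QUELCE schema.
@dataclass
class ChangeDriverRecord:
    name: str
    description: str
    occurrence_probability: float      # estimated from historical programs
    observed_in_programs: list = field(default_factory=list)

rec = ChangeDriverRecord(
    name="requirements_change",
    description="Major change to key performance parameters after Milestone A",
    occurrence_probability=0.4,        # invented value
    observed_in_programs=["Program X", "Program Y"],  # anonymized placeholders
)
print(rec.name, rec.occurrence_probability)
```

A collection of such records, backed by historical program data, is what would let QUELCE implementers look up empirically grounded probabilities instead of relying on unaided judgment.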
Developing the repository is a major task planned for FY13. We also plan to conduct additional pilots of the method including use of the repository and support tools. From those pilots, we will develop guidance for the use of the repository and make it available on a trial basis within the DoD. After the repository is adequately populated and developed, we intend it to become an operational resource for DoD cost estimating.
Transitioning to the Public
During the coming year, our SEMA team will continue developing, piloting, and transitioning the QUELCE method.
To read the SEI technical report Quantifying Uncertainty in Early Lifecycle Cost Estimation (QUELCE), please visit www.sei.cmu.edu/library/abstracts/reports/11tr026.cfm.
For more information about Milestone A, please see the Integrated Defense Life Cycle Chart for a picture and references in the "Article Library."
Visit the SEI Digital Library for other publications by David