By SPRUCE Project
This is the first post in a three-part series.
Software and acquisition professionals often have questions about recommended practices related to modern software development methods, techniques, and tools, such as how to apply agile methods in government acquisition frameworks, systematic verification and validation of safety-critical systems, and operational risk management. In the Department of Defense (DoD), these techniques are just a few of the options available to face the myriad challenges in producing large, secure software-reliant systems on schedule and within budget.
In an effort to offer our assessment of recommended techniques in these areas, SEI researchers built upon an existing collaborative online environment known as SPRUCE (Systems and Software Producibility Collaboration Environment), hosted on the Cyber Security & Information Systems Information Analysis Center (CSIAC) website. From June 2013 to June 2014, the SEI assembled guidance on a variety of topics, selected for relevance, maturity of the practices described, and timeliness with respect to current events. For example, shortly after the Target security breach of late 2013, we selected Managing Operational Resilience as a topic.
Ultimately, SEI curated recommended practices on five software topics: Agile at Scale, Safety-Critical Systems, Monitoring Software-Intensive System Acquisition Programs, Managing Intellectual Property in the Acquisition of Software-Intensive Systems, and Managing Operational Resilience. In addition to a recently published paper on SEI efforts and individual posts on the SPRUCE site, these recommended practices will be published in a series of posts on the SEI blog. This post, the first in a three-part series by Robert Ferguson, explores the challenges of monitoring Software-Intensive System Acquisition (SISA) programs and presents the first two recommended best practices, as detailed in the SPRUCE post. The second post in this series will present the next three best practices. The final post will present the final two recommendations as well as conditions that will allow organizations to derive the most benefit from these practices.
Monitoring Software-Intensive System Acquisition (SISA) Programs - SPRUCE / SEI
Our discussion of monitoring SISA programs has four parts. First, we set the context by providing an answer to the question "Why is monitoring an SISA program challenging?" Seven practices for monitoring SISA programs follow. We then briefly address how a program manager (PM) and acquisition team can prepare for and achieve effective results by following these practices. We conclude with a list of selected resources to help you learn more about monitoring SISA programs. Also, we've added links to various sources to help amplify a point--these sources may occasionally include material that differs from some of the recommendations below.
Every program is different; judgment is required to implement these practices in a way that benefits you. In particular, be mindful of your mission, goals, processes, and culture. All practices have limitations. We expect that some of these practices will be more relevant to your situation than others, and their applicability will depend on the context to which you apply them. To gain the most benefit, you need to evaluate each practice for its appropriateness and decide how to adapt it, striving for an implementation in which the practices reinforce each other. In particular, the practices are not intended to be in strict sequence: practices may iterate or overlap with others. Also, consider additional practice collections (such as Pitman's SCRAM approach, which is referenced at the end of this web page). Monitor your adoption and use of these practices and adjust as appropriate.
These practices are certainly not complete--they are a work in progress. We welcome your feedback (use the comments section at the end).
Why is monitoring SISA programs challenging?
Essential to effective program management is the capability to maintain an accurate and current understanding of a program's status so that issues that threaten program objectives can be identified quickly and dealt with efficiently. The PM also depends to a large degree on the goodwill and commitments of a program's many stakeholders. When a program's status and forecasts change frequently, causing the PM's promises and assurances to fail, stakeholders may lose confidence and withdraw their time and attention from the PM's program, investing them instead in less risky undertakings. Thus, the PM cannot afford to break many commitments; the PM wants to identify and resolve issues early, before they grow into significant problems that require the attention of external stakeholders. Continual cost overruns or schedule slippage may lead to greater oversight and even program termination.
Monitoring a program's progress is challenging for several reasons:
To address these challenges, the PM and contractor should work together to understand how they will use the contractor's measures to identify and evaluate potential threats to program objectives and commitments, which can also have an effect on the PM's commitments to stakeholders. This observation is the basis for these practices.
Practices for Monitoring SISA Programs
The following practices implement a particular approach to helping the PM and contractor come to a mutual understanding of a program's progress and the significance of deviations from expectations. The key to that approach is to consolidate a contractor's progress-related data into a structure, called the "program dashboard," that covers those areas of contractor performance that enable forecasting future progress. The program dashboard provides the PM with the evidence needed to say, "I can keep the promises I made," or "here's the evidence that I can't keep those promises, so I'm going to have to slip the schedule two months." Program dashboards can take many forms. Not all of these examples relate directly to monitoring SISA programs, but they provide some notion of the types of data that can be collected, organized, and presented in a visual display of the status and progress of a program.
1. Address management measures and their use in RFPs and contracts.
The PM may be tempted to direct prospective bidders or the awarded contractor to use a specific set of measures; the problems with such an approach were discussed in the challenges above. Rather than dictating the measures, the PM is better served by leveraging the professionalism of the contractor. The contractor has questions similar to those of the PM and needs to answer those questions to plan and manage the overall effort and individual tasks. Contractors need to measure and track what they produce, and those are the measures that PMs want to see. PMs must make commitments to their stakeholders, and any request to change those commitments must be carefully introduced and socialized. Given the difficulty of specifying the measures in advance, the PM should instead request that the contractor provide measures that address the PM's questions.
The dashboard poses a set of questions--but what gets measured (and thus what measurements are reported) is determined by what the contractor has learned works and has incorporated into a formal measurement program.
Thus, the request for proposals (RFP), statement of work, and contract should
Regarding (b) and (c), unforeseeable changes in requirements, process, or technology may require the contractor to update the set of measures used for that phase and communicate this to the PM. The contractor should allow sufficient time for the program office to become better acquainted with the new measures that the contractor will use during that phase.
The PM may also provide an example dashboard in the RFP and contract to help clarify his or her intent.
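To make the idea concrete, the practice of mapping the PM's questions to contractor-supplied measures could be sketched as follows. This is a hypothetical illustration: the specific questions, measure names, and the `unanswered_questions` helper are invented for this example and are not a prescribed set from the SPRUCE practices.

```python
# Hypothetical sketch: questions and measure names are illustrative only,
# not a set prescribed by the SPRUCE practices or any RFP template.

PM_QUESTIONS = {
    "Will the contractor deliver on schedule?":
        ["earned value (SPI)", "milestone burn-down"],
    "Is the product converging on acceptable quality?":
        ["defect arrival vs. closure rate", "test pass rate"],
    "Is scope stable enough to keep commitments?":
        ["requirements volatility", "approved change requests per month"],
}

def unanswered_questions(reported_measures):
    """Return the PM questions that the contractor's reported measures
    do not yet address (no proposed measure for that question)."""
    return [question for question, candidates in PM_QUESTIONS.items()
            if not any(m in reported_measures for m in candidates)]
```

A PM reviewing a proposal could use such a mapping to check coverage: if the contractor reports only `"earned value (SPI)"`, the quality and scope questions remain unanswered, which signals a gap to resolve before contract award.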
2. Set up the dashboard.
In an acquisition context, the PM does not direct how the contractors will perform the work; instead, the PM participates in making decisions about deliverables and prioritizing product content in keeping with commitments to external stakeholders. Some of these decisions will result in a change request that may affect either cost or schedule. Such change requests will need to be reviewed by external stakeholders, who will ask for supporting justification. That justification, which a dashboard can present clearly, supports both the decisions made and any needed changes to existing commitments.
The contractor's data is organized into quadrants according to the type of control that the PM, contractor, or stakeholders can exercise. Such an organization results in a program dashboard that notionally looks something like this:
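As a rough illustration of that organization, the sketch below models a four-quadrant dashboard in code. The quadrant names, measures, and thresholds are assumptions made for this example; they are not the official quadrant labels from the SPRUCE material.

```python
# Illustrative sketch only: quadrant names, measures, and thresholds are
# hypothetical, not the official SPRUCE/SEI dashboard layout.

from dataclasses import dataclass, field

@dataclass
class Quadrant:
    """One quadrant of the notional program dashboard."""
    name: str
    controlled_by: str                       # who can act on deviations here
    measures: dict = field(default_factory=dict)

def build_dashboard():
    """Assemble contractor-reported data into quadrants by type of control."""
    return [
        Quadrant("Schedule and progress", "PM and contractor",
                 {"milestones_completed": 7, "milestones_planned": 9}),
        Quadrant("Resources and cost", "PM and stakeholders",
                 {"budget_spent_pct": 62, "staffing_actual_vs_plan": 0.9}),
        Quadrant("Product size and content", "PM and contractor",
                 {"requirements_implemented": 140, "requirements_total": 200}),
        Quadrant("Product quality", "contractor",
                 {"open_defects": 35, "defect_discovery_rate_per_week": 4}),
    ]

def flag_deviations(dashboard, thresholds):
    """Return (quadrant, measure, value) triples that exceed a threshold,
    i.e., items the PM should raise with the responsible party."""
    flagged = []
    for quadrant in dashboard:
        for measure, limit in thresholds.items():
            if measure in quadrant.measures and quadrant.measures[measure] > limit:
                flagged.append((quadrant.name, measure, quadrant.measures[measure]))
    return flagged
```

The point of the structure is that each flagged deviation already identifies who can act on it (the `controlled_by` field), which is what lets the PM route issues to the right party rather than simply reporting that something is off track.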
The program dashboard helps the PM justify what action (e.g., change the direction of the contractor or change the commitments to external stakeholders) is necessary by
The first three quadrants represent opportunities to control and direct the work with the involvement of appropriate parties (e.g., customer, contractor, major subcontractors):
The fourth quadrant represents the target objectives for product quality:
Certainly these quadrants relate to one another. Taking action on one can affect another one. But collectively they help ensure that all the information critical to making decisions is well founded. Thus, you shouldn't ignore any of these quadrants.
An example dashboard, including measures that it can portray, is described in the Condensed Guide to Software Acquisition Best Practices (slide 4). It doesn't align into these exact four quadrants, but you should recognize the quadrants in the different arrangement.
The next post in this series will present the next three best practices for Monitoring Software-Intensive System Acquisition (SISA) Programs.
The final post in this series will discuss the remaining best practices.
Below is a list of selected resources to help you learn more. As noted above, linked sources may occasionally include material that differs from some of the recommendations in this article and the references below. Technology transition is a key part of the SEI's mission and a guiding principle in our role as a federally funded research and development center. We welcome your comments and suggestions on further refining these recommended practices.
Richard Crume. "Who Is to Blame When Government Contracts Go Astray?" IACCM, 2008. http://www.iaccm.com/news/contractingexcellence/?storyid=548
Adrian Pitman, Elizabeth K. Clark, Bradford K. Clark, & Angela Tuffley. "An Overview of the Schedule Risk Assessment Methodology (SCRAM)." Journal of Cyber Security & Information Systems 1, 4 (2013). https://www.csiac.org/sites/default/files/journal_files/CSIAC_V1N4_FINAL_2.pdf
Defense Acquisition University. Defense Acquisition Guidebook. DAU, 2013. https://dag.dau.mil/Pages/Default.aspx
Steve McConnell. Software Estimation: Demystifying the Black Art (Best Practices Series). Microsoft Press, 2006.
Software Program Managers' Network. The Program Manager's Guide to Software Acquisition Best Practices. Computers & Concepts Associates, 1998. https://acc.dau.mil/adl/en-US/33409/file/6731/%2316705%20Software%20Best%20Practices%20Initiative.pdf