
7 Recommended Practices for Monitoring Software-Intensive System Acquisition (SISA) Programs

Robert Ferguson (SPRUCE Project)

This is the first post in a three-part series.

Software and acquisition professionals often have questions about recommended practices related to modern software development methods, techniques, and tools, such as how to apply agile methods in government acquisition frameworks, systematic verification and validation of safety-critical systems, and operational risk management. In the Department of Defense (DoD), these techniques are just a few of the options available to face the myriad challenges in producing large, secure software-reliant systems on schedule and within budget.

In an effort to offer our assessment of recommended techniques in these areas, SEI researchers built upon an existing collaborative online environment known as SPRUCE (Systems and Software Producibility Collaboration Environment), hosted on the Cyber Security & Information Systems Information Analysis Center (CSIAC) website. From June 2013 to June 2014, the SEI assembled guidance on a variety of topics based on relevance, maturity of the practices described, and timeliness with respect to current events. For example, shortly after the Target security breach of late 2013, we selected Managing Operational Resilience as a topic.

Ultimately, the SEI curated recommended practices on five software topics: Agile at Scale, Safety-Critical Systems, Monitoring Software-Intensive System Acquisition Programs, Managing Intellectual Property in the Acquisition of Software-Intensive Systems, and Managing Operational Resilience. In addition to a recently published paper on SEI efforts and individual posts on the SPRUCE site, these recommended practices will be published in a series of posts on the SEI blog. This post, the first in a three-part series by Robert Ferguson, explores the challenges of monitoring Software-Intensive System Acquisition (SISA) programs and presents the first two recommended practices detailed in the SPRUCE post. The second post in this series will present the next three practices. The third post will present the final two recommendations as well as conditions that will allow organizations to derive the most benefit from these practices.

Monitoring Software-Intensive System Acquisition (SISA) Programs - SPRUCE / SEI
https://www.csiac.org/spruce/resources/ref_documents/agile-scale-aas-spruce-sei

Our discussion of monitoring SISA programs has four parts. First, we set the context by providing an answer to the question "Why is monitoring an SISA program challenging?" Seven practices for monitoring SISA programs follow. We then briefly address how a program manager (PM) and acquisition team can prepare for and achieve effective results by following these practices. We conclude with a list of selected resources to help you learn more about monitoring SISA programs. Also, we've added links to various sources to help amplify a point--these sources may occasionally include material that differs from some of the recommendations below.

Every program is different; judgment is required to implement these practices in a way that benefits you. In particular, be mindful of your mission, goals, processes, and culture. All practices have limitations. We expect that some of these practices will be more relevant to your situation than others, and their applicability will depend on the context in which you apply them. To gain the most benefit, you need to evaluate each practice for its appropriateness and decide how to adapt it, striving for an implementation in which the practices reinforce each other. In particular, the practices are not intended to follow a strict sequence: practices may iterate or overlap with one another. Also, consider additional practice collections (such as Pitman's SCRAM approach, referenced in the Resources section at the end of this post). Monitor your adoption and use of these practices and adjust as appropriate.

These practices are certainly not complete--they are a work in progress. We welcome your feedback (use the comments section at the end).

7 Practices for Monitoring SISA Programs

  1. Address in contracts
  2. Set up dashboard
  3. Assign and train staff in its interpretation
  4. Update regularly
  5. Discuss in program reviews and as needed
  6. Refresh for each new phase
  7. Renegotiate commitments with stakeholders

Why is monitoring SISA programs challenging?

Essential to effective program management is the capability to maintain an accurate and current understanding of a program's status so that issues that threaten program objectives can be identified quickly and dealt with efficiently. Also, the program manager (PM) depends to a large degree on the goodwill and commitments of a program's many stakeholders. When a program's status and forecasts frequently change, causing the PM's promises and assurances to fail, stakeholders may lose confidence and withdraw their time commitments and attention from the PM's program and instead invest them in less risky undertakings. Thus, the PM cannot afford to break many commitments; the PM wants to identify and resolve issues early before they grow into significant problems that require the attention of external stakeholders. Continual cost overruns or schedule slippage may lead to greater oversight and even program termination.

Monitoring a program's progress is challenging for several reasons:

  1. Contractors don't understand the PM's commitments to other stakeholders. Contractors may misunderstand the PM's role, misinterpreting the PM's questions as efforts to infringe on the contractor's responsibilities and failing to appreciate the important coordination role the PM plays with a program's stakeholders. To be successful, the program must have the support (i.e., resources, services, and attention) of many stakeholders (e.g., for reviews, testing, training, and sponsorship meetings). To sustain this support when other worthy endeavors compete for the same limited resources, the PM negotiates commitments with the program's stakeholders. The PM becomes dependent on a stakeholder for certain forms of support, and the stakeholder becomes dependent on the accuracy of the PM's forecasts of what resources will be needed and when. For example, contractors often have the data the PM needs to make these forecasts but may not recognize this or provide it in a form that helps the PM; without the right data, the PM cannot create accurate forecasts.
  2. PMs don't understand contractor data. The PMs may not have sufficient information to interpret the activity and schedule-related data that they get from contractors. The PM needs to understand the implications such data have for a program's objectives and the viability of a PM's existing commitments with stakeholders. Absent such understanding, the PM may waste time addressing the wrong issues, allowing serious problems to go unaddressed and later blindside the PM and stakeholders, resulting in lost time, resources, and goodwill.
  3. Changing to a new set of measures is hard. Measures are important for guiding system and software engineering and program-management decisions, but establishing an effective measurement program takes effort and time. PMs may believe that they can ask the contractor to provide a particular set of measures and that the contractor can easily and quickly make the necessary changes, but that's rarely the case for several reasons:

  • It takes effort and time for a contractor to introduce a new measure. Doing so may also require changes to operational definitions, analyses, reporting templates, tools, automated data collection, training, data management, access rights, privacy, and so forth.
  • Useful measures are generally obtained by observing and instrumenting a real process. Since each organization will have a unique process, some measures will also be unique to the organization.
  • There is always something new about a large project, so the organization doing the work will discover new things about the project as it proceeds. The organization may not be able to determine what to measure completely in advance.
  • An acquisition program may have several phases of work, and different measures will matter more in some phases than in others. Rather than asking for a permanent change to the measurement program, the PM sometimes issues a "data call." Because the contractor does not expect the measure to be repeated, staff gather whatever data is at hand; such data lacks precision and a supporting process. Issued too often, data calls become disruptive. Collecting good data instead involves instrumenting the process so that the data is collected as a natural result (or side effect) of doing the work, as sketched after this list.
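To illustrate what "instrumenting the process" can look like, the hypothetical Python sketch below records a measurement every time a routine task runs, so the data accumulates as a by-product of the work rather than through a one-off data call. The decorator, file name, and task are invented for illustration and are not part of any SPRUCE or SEI tooling.

```python
import csv
import time
from datetime import datetime, timezone
from functools import wraps

MEASUREMENT_LOG = "task_measures.csv"  # hypothetical log later summarized for the dashboard

def instrumented(task_name):
    """Record start time, duration, and outcome of a task as a side effect of running it."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            started = datetime.now(timezone.utc).isoformat()
            t0 = time.monotonic()
            outcome = "pass"
            try:
                return func(*args, **kwargs)
            except Exception:
                outcome = "fail"
                raise
            finally:
                duration = time.monotonic() - t0
                with open(MEASUREMENT_LOG, "a", newline="") as log:
                    csv.writer(log).writerow([task_name, started, f"{duration:.1f}s", outcome])
        return wrapper
    return decorator

@instrumented("integration_test_run")
def run_integration_tests():
    ...  # the real work; the measurement is recorded automatically around it
```

Data gathered this way has a consistent definition and collection point, which is what distinguishes it from the ad hoc responses produced by a data call.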

To address these challenges, the PM and contractor should work together to understand how they will use the contractor's measures to identify and evaluate potential threats to program objectives and commitments, which can also have an effect on the PM's commitments to stakeholders. This observation is the basis for these practices.

Practices for Monitoring SISA Programs

The following practices implement a particular approach to helping the PM and contractor come to a mutual understanding of a program's progress and the significance of deviations from expectations. The key to that approach is to consolidate a contractor's progress-related data into a structure, called the "program dashboard," that covers the areas of contractor performance needed to forecast future progress. The program dashboard provides the PM with the evidence needed to say, "I can keep the promises I made," or "here is the evidence that I can't keep those promises, so I'm going to have to slip the schedule two months." Program dashboards can take many forms; published examples do not all relate to monitoring SISA programs, but they give some notion of the types of data that can be collected, organized, and presented in a visual display of a program's status and progress.
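To make the forecasting role of the dashboard concrete, here is a deliberately simple, hypothetical schedule forecast computed from planned versus observed completion rates. A real program would use the contractor's earned-value or equivalent data; every number below is invented for illustration.

```python
from datetime import date, timedelta

# Invented example data: work packages planned versus completed to date.
planned_total = 120                      # work packages in the current phase
completed = 50                           # completed so far
phase_start = date(2014, 1, 6)
today = date(2014, 6, 2)
planned_finish = date(2014, 10, 31)

elapsed_days = (today - phase_start).days
rate = completed / elapsed_days          # observed completion rate (work packages per day)
remaining = planned_total - completed
forecast_finish = today + timedelta(days=round(remaining / rate))

slip_days = (forecast_finish - planned_finish).days
print(f"Forecast finish: {forecast_finish}; slip versus plan: {slip_days} days")
# With these invented numbers, the forecast slips roughly two months past the planned finish.
```

A dashboard would present a forecast like this alongside the underlying measures, so the PM can show stakeholders both the conclusion and the evidence behind it.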

1. Address management measures and their use in RFPs and contracts.

The PM may be tempted to direct prospective bidders or the awarded contractor to use a specific set of measures, but the problems with that approach were discussed in the challenges above. Rather than dictating a set of measures, the PM is better served by leveraging the professionalism of the contractor. The contractor has questions similar to the PM's and needs to answer them to plan and manage the overall effort and individual tasks; contractors need to measure and track what they produce, and those are the measures that PMs want to see. PMs must make commitments to their stakeholders, and any request to change those commitments must be carefully introduced and socialized. Given the difficulty of specifying measures in advance, the PM should instead request that the contractor provide measures that address the PM's questions.

The dashboard poses a set of questions--but what gets measured (and thus what measurements are reported) is determined by what the contractor has learned works and has incorporated into a formal measurement program.

Thus, the request for proposals (RFP), statement of work, and contract should

  a. identify the progress-related questions on which contractor-PM discussions and program reviews will focus. These questions are general and are commonly used by the management of any diligent contractor to better understand what might affect the success of the program.
  b. request that bidders or contractors indicate how they intend to answer those questions--what measures will apply?
  c. identify predefined times, relative to milestones or at the beginning of each major development phase, at which the contractor will report, as a contract deliverable, the measures it will use to report progress.

Regarding (b) and (c), unforeseeable changes in requirements, process, or technology may require the contractor to update the set of measures used for that phase and communicate this to the PM. The contractor should allow sufficient time for the program office to become better acquainted with the new measures that the contractor will use during that phase.

The PM may also provide an example dashboard in the RFP and contract to help clarify his or her intent.
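As a hypothetical illustration of items (a) and (b), the progress-related questions can be captured in a simple structure that bidders answer with measures they already collect. The questions, measure names, and helper function below are invented for this sketch and are not drawn from any actual RFP or contract.

```python
# Hypothetical progress-related questions a PM might pose in an RFP, paired with the
# measures a bidder proposes from its existing measurement program.
rfp_questions = [
    "Is the planned scope being delivered?",
    "Will key milestones and deliveries be met?",
    "Is the development process performing as planned?",
    "Is product quality on track?",
]

bidder_proposal = {
    "Is the planned scope being delivered?": ["requirements completed vs. baselined", "change-request backlog"],
    "Will key milestones and deliveries be met?": ["earned value (SPI/CPI)", "work packages completed per period"],
    "Is product quality on track?": ["test pass rate", "open defects by severity"],
}

def uncovered(questions, proposal):
    """Questions for which the bidder proposed no measure; a gap to resolve before award."""
    return [q for q in questions if not proposal.get(q)]

print(uncovered(rfp_questions, bidder_proposal))
# ['Is the development process performing as planned?']
```

The point of evaluating proposals this way is coverage: each question should be answered by some measure the contractor already knows how to collect, not by a measure name imposed from outside.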

2. Set up the dashboard.

In an acquisition context, the PM does not direct how the contractors will perform the work; instead, the PM participates in making decisions about deliverables and prioritizing product content in keeping with commitments to external stakeholders. Some of these decisions will result in a change request that may affect cost or schedule. Such change requests will need to be reviewed by external stakeholders, who will ask for supporting justification. A dashboard makes it easy to show that justification: the evidence behind the decisions made and behind any needed changes to existing commitments.

The contractor's data is organized into quadrants according to the type of control that the PM, contractor, or stakeholders can exercise. Such an organization results in a program dashboard built around four notional quadrants, described below.

The program dashboard helps the PM justify what action (e.g., change the direction of the contractor or change the commitments to external stakeholders) is necessary by

  • accurately forecasting milestones and delivery of products
  • providing clear warnings if the plan is not working or an unplanned event has affected some desired outcome
  • supporting re-estimation and re-planning by showing the magnitude of the problem
  • providing data that can be used as evidence for stakeholders to reset expectations, take certain actions, and renegotiate commitments

The first three quadrants represent opportunities to control and direct the work with the involvement of appropriate parties (e.g., customer, contractor, major subcontractors):

  • Change the scope
  • Change schedule and resource allocation
  • Change a process

The fourth quadrant represents the target objectives for product quality:

  • Frequent quality checks during development help to avoid rework caused by testing failures found late in the project.
  • This quadrant records the frequency and outcome of all product quality checks, including the results of testing.

These quadrants certainly relate to one another, and taking action in one can affect another. Collectively, however, they help ensure that the information critical to making decisions is well founded, so none of them should be ignored.

An example dashboard, including the measures it can portray, is described in the Condensed Guide to Software Acquisition Best Practices (slide 4). It does not align exactly with the four quadrants above, but you should be able to recognize them in its arrangement.
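As a minimal sketch of how a program office might organize contractor data along these lines, the hypothetical Python structure below groups measures into the four quadrants named in this article and flags any measure that drifts outside an assumed tolerance. The field names, tolerance, and example data are assumptions for illustration, not a prescribed dashboard format.

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    planned: float
    actual: float
    tolerance: float = 0.10  # flag a warning beyond +/-10% of plan (assumed threshold)

    @property
    def warning(self) -> bool:
        return abs(self.actual - self.planned) > self.tolerance * self.planned

@dataclass
class ProgramDashboard:
    scope: list[Measure] = field(default_factory=list)               # change the scope
    schedule_resources: list[Measure] = field(default_factory=list)  # change schedule / resource allocation
    process: list[Measure] = field(default_factory=list)             # change a process
    product_quality: list[Measure] = field(default_factory=list)     # target quality objectives

    def warnings(self):
        """Measures outside tolerance in any quadrant; the PM digs into these first."""
        quadrants = {
            "scope": self.scope,
            "schedule/resources": self.schedule_resources,
            "process": self.process,
            "product quality": self.product_quality,
        }
        return [(q, m.name) for q, ms in quadrants.items() for m in ms if m.warning]

# Illustrative data only.
dashboard = ProgramDashboard(
    product_quality=[Measure("integration test pass rate (%)", planned=95, actual=82)],
)
print(dashboard.warnings())  # [('product quality', 'integration test pass rate (%)')]
```

In practice the quadrants hold the contractor's own measures; the value of the structure is that a warning in any quadrant points the PM toward the kind of action (scope, schedule and resources, process, or product quality) that may be needed.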

Looking Ahead

The next post in this series will present the next three best practices for Monitoring Software-Intensive System Acquisition (SISA) Programs, which include

  1. Assign and train staff in its interpretation
  2. Update regularly
  3. Discuss in program reviews and as needed

The final post in this series will discuss the remaining best practices.

Below is a listing of selected resources to help you learn more. We have also added links to various sources to help amplify a point. Please be mindful that such sources may occasionally include material that might differ from some of the recommendations in the article above and the references below. Technology transition is a key part of the SEI's mission and a guiding principle in our role as a federally funded research and development center. We welcome your comments and suggestions on further refining these recommended practices.

Resources

Adrian Pitman, Elizabeth K. Clark, Bradford K. Clark, and Angela Tuffley. "An Overview of the Schedule Compliance Risk Assessment Methodology (SCRAM)." Journal of Cyber Security & Information Systems 1, 4 (2013). https://www.csiac.org/journal-article/an-overview-of-the-schedule-compliance-risk-assessment-methodology-scram/

Defense Acquisition University. Defense Acquisition Guidebook. DAU. https://www.dau.mil/tools/dag

Steve McConnell. Software Estimation: Demystifying the Black Art (Best Practices Series). Microsoft Press, 2006.

