
The Value of Systems Engineering

Joseph Elm

Building a complex weapon system in today's environment may involve many subsystems: propulsion, hydraulics, power, controls, radar, structures, navigation, computers, and communications. Designing these systems requires the expertise of engineers in particular disciplines, including mechanical, electrical, software, and metallurgical engineering, among others. Some activities of system development, however, are interdisciplinary: requirements development, trade studies, and architecture design, to name a few. These tasks do not fit neatly into the traditional engineering disciplines and require the attention of engineering staff with broader skills and backgrounds. This need for breadth and experience is often met by systems engineers. Unfortunately, systems engineering is not valued by all stakeholders in the Department of Defense (DoD), and it is often the first group of activities to be eliminated when a program faces budget constraints. This blog post highlights recent research aimed at demonstrating the value of systems engineering to program managers in the DoD and elsewhere.

In 2004, the Director for Systems Engineering in the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (OUSD [AT&L]) came to the National Defense Industrial Association (NDIA) and voiced concerns that DoD acquisition programs were not capitalizing on the value of systems engineering (SE). He knew the value of SE and knew that it could help DoD programs, but he also knew that not all DoD program managers shared his convictions. Consequently, program managers were taking shortcuts and eliminating SE capabilities from their programs. He came to NDIA seeking quantitative evidence of the value of SE.

Subsequently, others have recognized this same problem. A recent Government Accountability Office (GAO) report indicates that acquisition program costs are typically 26 percent over budget and development costs typically run 40 percent more than initial estimates. These programs also routinely fail to deliver capabilities when promised, experiencing an average delay of 21 months. The report finds that "optimistic assumptions about system requirements, technology, and design maturity play a large part in these failures, and that these optimistic assumptions are largely the result of a lack of disciplined SE analysis early in the program."

Despite such findings, many programs still fail to deploy good SE. Why? One reason may be that there is relatively little quantitative evidence of the impact and value of SE. Everyone can see the costs of SE, such as the labor applied and the time allocated in the schedule. The benefits of SE, however, are less visible. They often manifest themselves as

  • risks that didn't materialize
  • rework that didn't need to be done
  • customer complaints that didn't occur, and
  • product deficiencies that were circumvented

Because these benefits are hard to quantify, the return on investment in SE often goes unrecognized. To get a true picture of the value of SE, we need to quantitatively measure its impact on acquisition program performance.

The remainder of this blog post describes a research effort that the SEI undertook in partnership with the NDIA and the IEEE Aerospace and Electronic Systems Society (IEEE AESS). This effort provided quantitative evidence of the value of SE in terms of its impact on program cost, program schedule, and program technical performance: impacts that are crucially important to program managers and executives.

Building on Previous Research in Systems Engineering

While "software engineering" is etched in stone at the SEI's Pittsburgh headquarters, it is sometimes hard to draw a clear line between software and the systems supported by software. For that reason, SEI staff have often conducted research in the SE realm. For example, the Capability Maturity Model Integration Framework addresses processes that apply equally to software development and system development. Architecture development and evaluation methods developed for software are routinely adapted and applied to systems.

Through the SEI's affiliation with NDIA, my fellow researcher Dennis Goldenson and I became involved in developing its response to the 2004 inquiry from OUSD (AT&L) mentioned earlier. In 2005, I suggested conducting a survey of acquisition programs to gather information both about their SE activities and about how those programs performed; we could then identify relationships between the two. We conducted the survey in 2006 and published our results in 2007. This initial research demonstrated that programs that deployed more systems engineering performed better against measures of cost, schedule, and technical performance.

In 2010, the DoD approached NDIA and the SEI about strengthening the business case for SE by expanding the survey population beyond the NDIA to other professional organizations, including IEEE AESS and the International Council on Systems Engineering (INCOSE). For this latest study, we surveyed individual programs in participating organizations to obtain answers to the following questions:

  1. What systems engineering activities do you perform on your program?
  2. How well does your program perform?

We surveyed 148 different programs. Although most programs were supplying systems for the U.S. defense sector, we also received a few responses from organizations serving other market sectors and operating in different countries.

An Even Stronger Link Between SE and Performance

Our latest results, published in the SEI technical report, The Business Case for Systems Engineering Study: Results of the Systems Engineering Effectiveness Study, identified strong links between the performance of systems engineering tasks and overall program performance. These results provide a convincing case for the value of systems engineering. This latest study collected information from participating programs along three dimensions:

  • systems engineering deployment. We assessed SE deployment by examining both the presence and the quality of work products resulting from SE activities. These work products were selected from those listed in the CMMI framework by a panel of SE experts. Based on this assessment, SE deployment for each program was categorized as low, medium, or high.
  • program performance. We assessed program performance as a combination of cost performance (satisfaction of budget), schedule performance, and technical performance (satisfaction of requirements). Again, based on this assessment, program performance for each program was categorized as low, medium, or high.
  • program challenge. Some programs are inherently more challenging than others due to factors such as size, duration, technology maturity, and interoperability requirements. Based on the combination of these factors, program challenge was categorized as low, medium, or high. (A sketch of this three-way binning follows the list.)
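To make the categorization step concrete, here is a minimal sketch of such three-way binning. It is illustrative only: the 1-4 scoring scale, the cutoff values, and the field names are hypothetical assumptions, not the actual scoring rules used in the study.

    # Illustrative sketch only: the scale, cutoffs, and field names are
    # hypothetical, not the scoring rules used in the SEI/NDIA study.

    def categorize(score, low_cutoff=2.0, high_cutoff=3.0):
        """Bin a composite score into a low/medium/high tier."""
        if score < low_cutoff:
            return "low"
        if score < high_cutoff:
            return "medium"
        return "high"

    # Hypothetical program record: per-dimension composite scores on a 1-4
    # scale, e.g., averaged ratings of work-product presence and quality.
    program = {"se_deployment": 3.4, "performance": 2.1, "challenge": 2.9}

    tiers = {dim: categorize(score) for dim, score in program.items()}
    print(tiers)  # {'se_deployment': 'high', 'performance': 'medium', 'challenge': 'medium'}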

We then looked for relationships between these metrics. We found a very strong relationship between SE deployment and program performance. In particular, as programs deployed more SE, they delivered better performance. For example, among those programs deploying the least SE, only 15 percent delivered the highest level of program performance. Among those deploying the most SE, however, 56 percent delivered the highest level of program performance.

As one would expect, our research showed an inverse relationship between program challenge and program performance. But we also learned that SE practices became even more valuable on these challenging programs. We already noted that the percentage of programs delivering high program performance increased from 15 percent to 56 percent as SE deployment increased. For the most challenging programs, however, that percentage increased from 8 percent to 62 percent with increased SE deployment. This result clearly shows the increasing need for SE as programs become more challenging.
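Percentages like these come from a cross-tabulation: within each SE-deployment tier, compute the share of programs rated high on performance, then repeat the computation within the most challenging tier for the moderation analysis. Here is a minimal sketch of that computation on hypothetical records; the real study assessed 148 programs, and these six tuples are invented purely for illustration.

    from collections import defaultdict

    # Hypothetical records: (se_deployment_tier, challenge_tier, performance_tier).
    records = [
        ("low", "high", "low"), ("low", "low", "high"), ("medium", "high", "medium"),
        ("high", "high", "high"), ("high", "medium", "high"), ("high", "low", "medium"),
    ]

    def share_high_performance(records, key):
        """For each group defined by `key`, the fraction of programs rated high."""
        totals, highs = defaultdict(int), defaultdict(int)
        for rec in records:
            group = key(rec)
            totals[group] += 1
            highs[group] += (rec[2] == "high")
        return {g: highs[g] / totals[g] for g in totals}

    # Overall relationship: performance by SE deployment tier.
    print(share_high_performance(records, key=lambda r: r[0]))

    # Moderation: the same breakdown restricted to the most challenging programs.
    hardest = [r for r in records if r[1] == "high"]
    print(share_high_performance(hardest, key=lambda r: r[0]))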

As mentioned above, we measured SE deployment by assessing SE-related work products for each program. These artifacts can be grouped into process areas such as

  • requirements development and management
  • program planning
  • product architecture
  • trade studies
  • product integration
  • verification
  • validation
  • program monitoring and control
  • risk management
  • configuration management
  • integrated product teams

Grouping artifacts into process areas enabled us to probe more deeply into the relationships between SE and program performance, identifying not only the overall benefit of SE but also the benefit of specific SE processes. For each program, we assessed SE deployment in each of the 11 process areas above and analyzed its relationship to program performance. Here again, we found strong supporting relationships for all SE process areas: increased SE deployment in any of them contributed to better program performance. The relationships were stronger in some areas than in others, however. Particularly strong relationships to program performance were found for

  • program planning. The percentage of programs delivering the highest performance increased from 13 percent to 50 percent as SE activities related to program planning increased.
  • requirements development and management. The percentage of programs delivering the highest performance increased from 21 percent to 58 percent as SE activities related to requirements development and management increased.
  • verification. The percentage of programs delivering the highest performance increased from 16 percent to 54 percent as SE activities related to verification increased.
  • product architecture. The percentage of programs delivering the highest performance increased from 16 percent to 49 percent as SE activities related to product architecture increased.

As strong as these relationships were, we found that they grew even stronger for the more challenging programs.
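Relationships of this kind, between ordinal categories (low/medium/high deployment versus low/medium/high performance), can also be summarized with a single statistic. One standard choice for ordinal-by-ordinal association is Goodman and Kruskal's gamma; the sketch below is a generic textbook implementation offered for illustration, not the study's published analysis code, and the sample data are invented.

    from itertools import combinations

    def goodman_kruskal_gamma(xs, ys):
        """Goodman-Kruskal gamma: ordinal association in [-1, 1]; tied pairs are ignored."""
        concordant = discordant = 0
        for (x1, y1), (x2, y2) in combinations(zip(xs, ys), 2):
            direction = (x1 - x2) * (y1 - y2)
            if direction > 0:
                concordant += 1
            elif direction < 0:
                discordant += 1
        if concordant + discordant == 0:
            return 0.0
        return (concordant - discordant) / (concordant + discordant)

    # Invented tiers coded 0=low, 1=medium, 2=high for one process area
    # (e.g., requirements development) paired with program performance.
    deployment  = [0, 0, 1, 1, 2, 2, 2]
    performance = [0, 1, 1, 2, 1, 2, 2]
    print(goodman_kruskal_gamma(deployment, performance))  # strongly positive

A gamma near +1 indicates that programs with more deployment in a process area almost always rank at least as high on performance; a value near 0 indicates no ordinal association.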

Transitioning to the Public

The results of our research can be used in a number of ways by system developers, system acquirers, and academia. For example, our findings constitute a baseline of SE deployment across the industries surveyed. System developers can use our methods to assess their own SE capabilities, compare them to this baseline, and identify their strengths and weaknesses. They can then develop process improvement plans to address their weaknesses and strategies to leverage their strengths. We continue to work with defense contractors who are applying this process to improve their SE capabilities.

System acquirers can also benefit from these findings. A program management office (PMO) acquiring a system needs to deploy good SE practices of its own in planning the program, defining system requirements, and developing system architectures. The PMO also needs to ensure that the system supplier deploys good SE practices. For example, the PMO should include in the solicitation a definition of the SE activities it expects from the supplier, evaluate each supplier's response to those expectations as a factor in source selection, ensure that the SE expectations are written into the contract, and monitor the supplier's performance throughout execution to confirm that the expectations are being met.

The academic community can also use the results of our study. For example, several universities offering master's-level systems engineering programs are using this information in their curricula to show students the value of systems engineering and to shape courses around the knowledge gathered here. INCOSE is also incorporating the results of the survey into its systems engineering handbook.

Additional Resources

You can download the technical report, The Business Case for Systems Engineering Study: Results of the Systems Engineering Effectiveness Study, at
https://resources.sei.cmu.edu/library/asset-view.cfm?assetID=34061

