
Challenges in Making the Transition to Digital Engineering


“Digital acquisition holds the key to unleashing the speed and agility we need to field capability at the tempo required to win in a future conflict with a peer competitor,” wrote Darlene J. Costello, acting assistant secretary of the U.S. Air Force (Acquisition, Technology, & Logistics) in an Air Force memo in May 2021.

Making the transition from traditional development practices to digital engineering (DE) presents challenges. This blog post describes some of the challenges I observed while working with the U.S. Army Capstone Future Vertical Lift (FVL) program. In Capstone, I was specifically looking for costs and benefits of model-based systems engineering (MBSE) in the context of introducing the architecture-centric virtual integration process (ACVIP) using the Architecture Analysis and Design Language (AADL). These observations include transition challenges within the emerging digital engineering ecosystem.

The Promise of Digital Engineering

Tom McDermott of the Systems Engineering Research Center defines digital engineering as

an integrated digital approach that uses authoritative sources of systems data and models as a continuum across disciplines to support lifecycle activities from concept through disposal.

I will return to the key concepts from this definition, integrated and continuum across the lifecycle, later in this post. Of course, the eponymous characteristic is that digital engineering uses digital tools and representations in the process of developing, sustaining, and maintaining systems, including requirements, design, analysis, implementation, and test. As described in the SEI blog post “What Is Digital Engineering and How Is It Related to DevSecOps?,” digital engineering is well suited to the DoD’s need to sustain and maintain long-lived systems whose missions evolve over time. The digital modeling approach is intended to establish an authoritative source of truth (ASOT) for the system, in which discipline-specific views of the system are created from the same model elements. This model-based approach carries forward into the design and implementation.

A digital modeling environment applies MBSE to the design and creates a common, standards-based approach to documenting the system, one that all stakeholders are expected to follow. A common modeling environment with well-defined, commonly accepted properties and stereotypes is intended to improve the ability to analyze the system and reduce the likelihood of finding defects late. The availability of digitized system data for analysis across disciplines should allow corrections and new information to propagate consistently.

Ideally, with MBSE this information can be stated once and then automatically propagated to various views of the data for all stakeholders. The result of this approach is an overall reduction of development risks, the ability to find and correct defects earlier in development when changes are relatively inexpensive, and elimination of document-driven development. I tried to measure this benefit in Capstone.
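As a minimal illustration of stating information once and propagating it to other views, consider the following hypothetical AADL sketch (the package, type, and component names are invented, not taken from the Capstone models). A data type is declared in one place and referenced by every component that produces or consumes it, so a correction to the definition flows to all of its uses:

    package Shared_Definitions
    public
      data Airspeed
        properties
          Data_Size => 4 Bytes;  -- stated once, in one authoritative place
      end Airspeed;
    end Shared_Definitions;

    package Consumers
    public
      with Shared_Definitions;

      thread Sensor_Reader
        features
          speed_out : out data port Shared_Definitions::Airspeed;  -- reuses the shared definition
      end Sensor_Reader;

      thread Autopilot
        features
          speed_in : in data port Shared_Definitions::Airspeed;    -- a change to Airspeed propagates here too
      end Autopilot;
    end Consumers;

Any view generated from such a model, whether an interface description or an analysis report, then reflects the same single definition rather than a hand-maintained copy.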

The foreword to the DoD Digital Engineering Strategy of June 2018 summarizes the expected improvements to system development and maintenance that the DoD envisions for its systems:

…incorporating the use of digital computing, analytical capabilities, and new technologies to conduct engineering in more integrated virtual environments [will] increase customer and vendor engagement, improve threat response timelines, foster infusion of technology, reduce cost of documentation, and impact sustainment affordability. These comprehensive engineering environments will allow DoD and its industry partners to evolve designs at the conceptual phase, reducing the need for expensive mock-ups, premature design lock, and physical testing.

Nonetheless, measuring the effectiveness of digital engineering and MBSE has been elusive. Although the DoD digital engineering strategy emphasizes the conceptual phase, designs must be captured as the concepts evolve. In this project, we focused on virtual engineering for embedded systems through ACVIP. The ACVIP process is built on a digital engineering methodology and AADL-based tools that implement concepts expressed in the Systems Modeling Language (SysML). Engineers therefore used a diverse toolset across the stages of the software-development lifecycle for embedded computing systems. While AADL is precise in its semantics and units, the SysML used in the conceptual phase is looser. Some engineers thought that this lack of precision at the boundary between abstraction layers made it hard to share and analyze the models.
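To make that contrast concrete, here is a small, hypothetical AADL thread declaration (the names and numbers are illustrative, not from the Capstone models). The timing properties carry explicit units and bounded ranges that analysis tools can check mechanically, a level of precision that a conceptual SysML view typically leaves open:

    package Precision_Illustration
    public
      thread Sensor_Filter
        features
          raw_in   : in data port;   -- unfiltered sensor sample
          filtered : out data port;  -- filtered value passed downstream
        properties
          Dispatch_Protocol => Periodic;
          Period => 25 ms;                         -- explicit time unit
          Compute_Execution_Time => 2 ms .. 4 ms;  -- bounded, analyzable range
          Deadline => 25 ms;
      end Sensor_Filter;
    end Precision_Illustration;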

Challenges

We encountered several challenges related to confounding from multiple tools and standards, the learning curve, and measurement.

A first challenge was that ACVIP was not the only new technique introduced. The program introduced other digital engineering tools and standards during the project including, among others, SysML, the Future Airborne Capabilities Environment (FACE) data model, Comprehensive Architecture Strategy (CAS), Business Process Model and Notation (BPMN), Unified Profile for DoDAF/MODAF (UPDM), and associated tools.

A second challenge was that the engineers were still learning how to apply the techniques for both ACVIP and MBSE. A third challenge was that measurements did not isolate process elements sufficiently to measure the causal effects of specific techniques. As a result, although we showed some positive benefits, we could not attribute them to specific tools or techniques.

To better understand the challenges, we not only reviewed performance data but also surveyed participants to get their subjective opinions on digital engineering and MBSE effectiveness. The survey asked a variety of questions, including demographics; previous experience in the domain; experience with the MBSE technologies used on Capstone; support for the development environment; and the perceived effectiveness with respect to cost and duration. A total of 26 people responded, two from government acquisition and the rest about evenly split between development and integration roles.

Recurring themes in the survey results included

  • a lack of experience with digital engineering tools and practices, and with ACVIP and MBSE
  • limited tool maturity
  • substantial cost of entry to learn and use some of the tools
  • limited process maturity, with no integrated end-to-end process, as evidenced by lack of overarching measurement and metrics
  • need for commitment to an overarching strategy with follow-through
  • point solutions that did not work effectively together
  • issues with training, tooling, and experience

Few of the individuals surveyed had previous experience with MBSE, ACVIP, FACE, or digital engineering, and respondents frequently cited the limitations of the still-maturing tools as a common obstacle. Even where the organization had used digital engineering before, few individuals had worked on the earlier project, and few established practices carried over to the new one. Open-ended comments in the survey referred not only to insufficient training but also to the failure of contractors to integrate the new practices into the development process. To be effective, therefore, the new tools and practices needed additional transition support.

Integrators, those who assembled hardware and software components, were more optimistic about the positive schedule effects of using MBSE than component developers were. Not all of the component developers surveyed agreed that ACVIP shortened the duration to integration or test, while no integrator thought that ACVIP increased it. A reasonable explanation for this divergence of opinion is that developers see only the early costs and extra effort, while integrators see the benefits of that extra effort in arriving at an improved state, ready for integration. This divergence also illustrates why it is critical to take a holistic view of the development rather than rely on individuals who might micro-optimize at the expense of overall system performance.

Our survey had too few responses for us to assert scientific validity, but we do think that the results suggest some problem areas that warrant further investigation. Although subjective opinions are not necessarily true and do not assure a broad perspective, they do capture some of the frustration felt by at least some of the working engineers.

Issues with Tools

Many of the survey respondents pointed to a lack of integration across MBSE practices, a problem compounded by the introduction of multiple digital engineering tools and techniques. Tool issues included processes that were too manual and a lack of workflow coordination among the tools provided. The survey cannot pinpoint specific tool or process problems or how to solve them. Nonetheless, multiple respondents felt underprepared or inadequately supported.

We observed that the toolchain linking SysML requirements models to data-modeling tools based on the FACE standard and to ACVIP had limitations. Models and tool support did not flow seamlessly between the different abstractions. This inconsistency weakened the ability to maintain an ASOT and encouraged working at an inappropriate level of abstraction.

These tools were under development and improved considerably during Capstone execution. There are two separate lessons from this observation. First, integration of software-systems approaches into a coherent workflow is a key improvement that could facilitate adoption of digital engineering. Second, while measurement should support process evolution, a stable process and tooling are needed to measure cost effectiveness.

The preference for a tool that crosses levels of detail might suggest that there is no clear plan across the lifecycle for developing and using models. When asked, “How will this model be used going forward?” the developers did not provide an answer. We still see contractors present models in PowerPoint, which indicates that, although SysML models architecture, structure, and behavior exactly as intended, some prefer to explain what they did in Visio or PowerPoint. This reliance on PowerPoint became less common as Capstone progressed, and the PowerPoint slides sometimes transcribed SysML or Department of Defense Architecture Framework (DoDAF) views. While slow tool adoption may stem from issues with the tools themselves, training, experience, or mentoring, it shows that simply providing tools or mandating their use is not enough. Those responsible for the transition to digital engineering should expect a period of some duration during which users become accustomed to a new way of working.

A representative free-form response was that “It is hard to point to any tangible benefit…model creation is largely manual; our program has struggled to keep up with modeling. Anything found has always been 2-to-3 steps behind the point where it would have been useful because we simply don’t have the resources to manually duplicate modeling.” This response is an indication of insufficient training combined with tight execution timelines.

This response also neatly encapsulates issues of experience, tool integration, and the lack of a coherent organizational process—all key impediments to the successful adoption of digital engineering. A technology-transition program should allocate time to learn while requiring incorporation of the new technology into the modeling approach.

Constructive Suggestions for Measurement and Improvement

Programs need the upgrade capability that digital engineering provides. But digital engineering represents a large-scale change requiring a comprehensive investment plan for technology transition. Barriers to effective digital engineering include tool capability, personnel skills, and overall management. If you are responsible for the transition of this technology, you must start the improvement process with clear statements of the goals and follow up with a practical measurement program to evaluate progress toward achieving those goals.

The key objective for our study concerned measurement: how can you tell whether what is being done is an improvement? Multiple process introductions and changes during execution confound this evaluation. A lack of expertise with the new process also makes success hard to measure, because many steps in adoption require distinguishing between a process that is ineffective and one that is simply not being used effectively. In particular, first efforts reflect the early stages of a learning curve and thus may not represent a fair test.

Measurement would also be made easier by narrowing the scope of digital engineering introduction so that you could evaluate each step in adoption more carefully. Of course, narrowing scope would create other risks. However, fast, large-scale change is seldom successful. Including change-support agents or mentors who already know the processes is also essential.

Another benefit of narrowing the introduction scope and adding expertise through mentoring or just-in-time support is that it enables a focus on satisfying needs. As workflows are developed, you should keep the digital engineering artifacts at appropriate levels of abstraction. Training followed by active mentoring helps to achieve the right level of abstraction, while issues at the handoff from one role to the next clarify which abstractions are needed. To achieve an ASOT, you must make sure that the digital engineering artifacts evolve and are actually used. Both conditions, evolution and actual use, are observable and should indicate success or opportunities for improvement.

It is also critical to introduce new workflows where digital engineering implies something different from the old way of working. Much of the power of ACVIP lies in enabling virtual integration and virtual analysis before moving to physical components. You must integrate these activities into a coherent, incremental workflow and make the effort applied and the rework discovered visible. It is good if ACVIP finds issues in virtual analysis rather than in physical test, and better still if ACVIP prevents those issues from emerging entirely. Only by tracking the effort and rework in both virtual and physical integration and test will you know what effect ACVIP has had.
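One example of such a virtual analysis is end-to-end flow latency checking. The hypothetical AADL sketch below (names and budgets are invented, not from the Capstone models) declares a latency budget on a flow from a sensor to a display; an analysis tool can compare that budget against the declared contributions of the components and connections long before any physical integration occurs:

    package Flow_Illustration
    public
      system Sensor
        features
          m_out : out data port;
        flows
          f_src : flow source m_out { Latency => 1 ms .. 2 ms; };
      end Sensor;

      system Display
        features
          m_in : in data port;
        flows
          f_snk : flow sink m_in { Latency => 3 ms .. 5 ms; };
      end Display;

      system Aircraft
      end Aircraft;

      system implementation Aircraft.impl
        subcomponents
          sens : system Sensor;
          disp : system Display;
        connections
          c1 : port sens.m_out -> disp.m_in;
        flows
          -- end-to-end latency budget verified virtually, before physical test
          sensor_to_display : end to end flow sens.f_src -> c1 -> disp.f_snk
            { Latency => 10 ms .. 20 ms; };
      end Aircraft.impl;
    end Flow_Illustration;

Issues found this way, such as a budget that the component contributions cannot meet, are exactly the rework that should be tracked and compared against what is later discovered in physical integration and test.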

You should design and refine metrics to measure the technology and process. With respect to ACVIP in complex embedded systems, rework is the major cost issue addressed. You should define, isolate, and track rework costs in the context of when issues were discovered, and define issue discovery itself so that it can be captured consistently. Only after measurement is effective can it be used to tune the process, thus maximizing return on investment and reducing integration risks. In the same way, more generalized MBSE would need targeted metrics.

Finally, digital engineering should provide an ASOT, i.e., the digital engineering products should be used by most stakeholders across the development to communicate and review. Demonstrating ASOT was not the purpose of Capstone, but some of the issues suggested that the interfaces and handoffs were less than ideal. The collection of individual tools must function within an overall ecosystem. It is important to assure that tools have complete information, but at the appropriate levels of abstraction. It is also desirable that the results from one tool can be transferred to another, and preferably back again.
