
The Importance of Automated Testing in Open Systems Architecture Initiatives

The Better Buying Power 2.0 initiative is a concerted effort by the United States Department of Defense to achieve greater efficiencies in the development, sustainment, and recompetition of major defense acquisition programs through cost control, elimination of unproductive processes and bureaucracy, and promotion of open competition. This SEI blog posting describes how the Navy is operationalizing Better Buying Power in the context of its Open Systems Architecture and Business Innovation initiatives.

This posting also presents the results from a recent online war game that underscore the importance of automated testing in these initiatives to help avoid common traps and pitfalls of earlier cost containment measures.

Overview of the Navy's Open Systems Architecture Initiative

Given the expense of our major defense acquisition programs--coupled with budget limitations stemming from the fiscally constrained environment--the Department of Defense (DoD) has made cost containment a top priority. In response, the Navy has devised and supported various Open Systems Architecture (OSA) initiatives, such as the Future Airborne Capability Environment (FACE), a technical standard aimed at enhancing interoperability and software portability for avionics software applications across DoD aviation platforms. The goals of these initiatives are to deliver enhanced integrated warfighting capability at lower cost across the enterprise and throughout the lifecycle via

  1. using modular, loosely coupled, and explicitly articulated architectures that provide many shared and reusable capabilities to warfighter applications
  2. fully disclosing requirements, architecture, and design specifications and development work products to program performers
  3. adopting common components based on published open interfaces and standards
  4. achieving interoperability between hardware and/or software applications and services by applying common protocols and data models
  5. amortizing the effort needed to create conformance and regression test suites that help automate the continuous verification, validation, and optimization of functional and non-functional requirements (a brief sketch of such a conformance suite follows this list)
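
To make the last goal in the list above concrete, here is a minimal sketch of what a reusable, automated conformance test suite might look like. The TrackPublisher interface, its data model, and the reference implementation are hypothetical and invented purely for illustration; they are not drawn from any Navy or FACE specification. The point is that one suite can be written once and rerun, unchanged, against any vendor's implementation of a published open interface.

```python
"""Hypothetical conformance suite for a published 'TrackPublisher' interface.

Everything here is illustrative only; it does not come from any real standard.
"""
import unittest
from dataclasses import dataclass


@dataclass(frozen=True)
class Track:
    """Minimal shared data model assumed by the open interface."""
    track_id: int
    latitude: float
    longitude: float


class ReferenceTrackPublisher:
    """A stand-in vendor component that claims to implement the interface."""

    def __init__(self):
        self._tracks = {}

    def publish(self, track: Track) -> None:
        self._tracks[track.track_id] = track

    def lookup(self, track_id: int) -> Track:
        return self._tracks[track_id]


class TrackPublisherConformance(unittest.TestCase):
    """Reusable conformance checks: any conforming implementation should pass unchanged."""

    # Swap in a different vendor's class here to rerun the same checks.
    implementation_under_test = ReferenceTrackPublisher

    def setUp(self):
        self.publisher = self.implementation_under_test()

    def test_published_track_is_retrievable(self):
        track = Track(track_id=42, latitude=36.6, longitude=-121.9)
        self.publisher.publish(track)
        self.assertEqual(self.publisher.lookup(42), track)

    def test_unknown_track_raises_key_error(self):
        with self.assertRaises(KeyError):
            self.publisher.lookup(999)


if __name__ == "__main__":
    unittest.main()
```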

Overview of the Navy's Business Innovation Initiative

Achieving the goals of Open Systems Architecture requires the Navy to formulate a strategy for decomposing large monolithic programs and technical designs into manageable, capability-oriented frameworks that can integrate innovation more rapidly and lower total ownership costs. A key element of this strategy is the Navy's Business Innovation Initiative, which is investigating various changes in business relationships between an acquisition organization and its contractor(s) to identify rational, actionable reform for new acquisition strategies, policies, and processes. These business relationship changes aim to open up competition, incentivize better contractor performance, increase access to innovative products and services from a wider array of sources, decrease time to field new capabilities, and achieve lower acquisition and lifecycle costs while sustaining fair industry profitability.

Although there's a clear and compelling need for new business and technical models for major defense acquisition programs, aligning the Naval acquisition community with the new Open Systems Architecture and Business Innovation initiatives presents a complex set of challenges and involves many stakeholders. To better understand these challenges, and to identify incentives that meet its future demands, the Navy ran two Massive Multiplayer Online Wargames Leveraging the Internet (MMOWGLI) in 2013. The Navy used these games to crowd-source ideas from contractors, government staff, and academics on ways to encourage industry and the acquisition workforce to use an Open Systems Architecture strategy.

Overview of the Massive Multiplayer Online Wargame Leveraging the Internet (MMOWGLI)

The MMOWGLI platform was developed by the Naval Postgraduate School in Monterey, California. This web-based platform supports thousands of distributed players who work together in a crowd-sourcing manner to encourage innovative thinking, generate problem-solving ideas, and plan actions that realize those ideas.

The first Navy Business Innovation Initiative MMOWGLI game was held in January 2013. The primary objective of the game was to validate the use of the MMOWGLI platform to gather innovative ideas for improving the business of Naval systems acquisition. This game was extremely successful, generating 890 ideas and 11 action plans. In addition, the results validated the soundness of the overall Navy Open Systems Architecture strategy and illuminated many ideas for further exploration in subsequent events with broader audiences.

A second Navy Business Innovation Initiative MMOWGLI was conducted from July 15 to August 1, 2013. The purpose of this game was to generate ideas from a wider audience of acquisition professionals on how best to incentivize industry and motivate the government workforce to adopt OSA business models in the procurement, sustainment, and recompetition of national defense systems. The 1,750 ideas presented through this exercise were later validated and translated into 15 action plans for implementing the Navy's Open Systems Architecture strategy. More than half of the nearly 300 participants in the game were from industry, and many of these were from the small business community.

Results from the Second MMOWGLI on the Navy's Business Innovation Initiative

Given the current fiscal climate in the DoD, it's not surprising that many action plans in the second Business Innovation Initiative MMOWGLI war game dealt with cost-containment strategies. Several of these action plans, each followed by a statement of its goal, are listed below:

  • providing a bonus to Navy team members who save money on acquisition programs

The goal is to incentivize program office teams to take both a short- and a long-term view of efficient acquisition by optimizing both prompt, early delivery of artifacts and savings accrued over the lifecycle.

  • rewarding a company for saving money on an acquisition contract: top savers would be publicly recognized and rewarded

The goal is to give government and industry partners of all sizes and types tangible, public recognition of their cost-saving innovations, improving the public image of both.

  • increasing the incentive paid to a contractor if the actual cost of their delivered solution was less than the targeted cost

The goal is to give industry a clear mechanism for reporting cost savings, a clear way to calculate the reward for cost savings, and a transparent method for inspecting actuals over time.

Avoiding Common Traps and Pitfalls of Cost Containment via Automated Testing

Although cutting costs is an important goal, it's critical to recognize that cost containment alone may be a hollow victory if it yields less costly--but lower quality--solutions that don't meet their operational requirements and can't be sustained effectively and affordably over their lifecycles. It's therefore essential to balance cost savings on the one hand with stringent quality control on the other. What is needed are methods, techniques, tools, and processes that enable software and systems engineers, program managers, and other acquisition professionals to ensure that cost-cutting strategies don't compromise the quality and sustainability of their integrated solutions.

In particular, MMOWGLI action plans that identify reward structures need to be balanced with action plans that avoid situations where contractors--or even government acquisition professionals--game the system by cutting costs (to get a bonus), while ignoring key system quality attributes (such as dependability, maintainability, and security) to the detriment of both the end-users (warfighters, planners, operators, et al.) and the organizations responsible for the long-term sustainment of the systems. Ideally, contractors and the government should be incentivized to control costs, while still ensuring that a quality product is delivered in a manner that is both operationally capable and affordably sustainable over the program lifecycle.

The $640 billion question is "how can we help acquisition professionals and technologists better balance quality and affordability?" The Business Innovation Initiative MMOWGLI participants collaborated to provide a key piece of the solution. In particular, some of the highest-ranked action plans from the second MMOWGLI game addressed the need for automated testing and retesting as an evaluator and enforcer of system quality.

Testing stimulates an executable component or work product with known inputs under known preconditions, then compares its actual outputs and post-conditions with those expected to determine whether its actual behavior is consistent with its required behavior. Automated testing is essential to achieving positive cost-saving results in OSA initiatives: it ensures that the components and products delivered at lower cost have the requisite quality, and it reduces the time and effort required to conduct the testing process.
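
As a minimal sketch of that pattern, the example below exercises a hypothetical component with a known input under a known precondition and then compares the actual output and post-condition against expected values. The MissionLog class and its behavior are invented purely for illustration.

```python
"""Minimal sketch of the test pattern described above: a known input under a
known precondition, followed by a comparison of actual versus expected outputs
and post-conditions. The MissionLog component is hypothetical."""
import unittest


class MissionLog:
    """Hypothetical component under test: records sorties and reports a running total."""

    def __init__(self):
        self.entries = []

    def record_sortie(self, hours_flown: float) -> float:
        if hours_flown < 0:
            raise ValueError("hours_flown must be non-negative")
        self.entries.append(hours_flown)
        return sum(self.entries)


class MissionLogTest(unittest.TestCase):
    def test_record_sortie_updates_running_total(self):
        # Known precondition: a log that already holds one 2.5-hour sortie.
        log = MissionLog()
        log.record_sortie(2.5)

        # Known input: record an additional 1.5-hour sortie.
        actual_total = log.record_sortie(1.5)

        # Compare the actual output with the expected output...
        self.assertEqual(actual_total, 4.0)
        # ...and the actual post-condition (state after the call) with the expected one.
        self.assertEqual(log.entries, [2.5, 1.5])

    def test_negative_hours_are_rejected(self):
        # Required behavior: invalid input is rejected, leaving the log unchanged.
        log = MissionLog()
        with self.assertRaises(ValueError):
            log.record_sortie(-1.0)
        self.assertEqual(log.entries, [])


if __name__ == "__main__":
    unittest.main()
```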

MMOWGLI Action Plans for Automated Testing

The top-rated action plan from the second MMOWGLI game proposed using automated tools and industry best practices to reduce manual testing and evaluation effort and to increase the coverage of automated regression testing in mission-critical Naval systems. When there are many frequent development blocks--as is the case with iterative and incremental development methods--it is necessary to perform regression testing on the previously developed software to verify that it continues to operate properly after being (1) integrated with the new software and (2) evolved as defects are fixed and improvements are made. Iterative and incremental development cycles greatly increase the need for regression testing, and this additional testing becomes infeasible when it is performed manually.
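
The sketch below illustrates one common way to automate such regression testing: previously accepted ("golden") results are recorded as data and re-checked by the suite after every increment. The great-circle utility and its recorded cases are hypothetical, but the same structure scales to thousands of cases that would be impractical to re-verify by hand.

```python
"""Sketch of an automated regression suite, assuming a hypothetical
navigation utility that must be re-verified after every development increment."""
import math
import unittest


def great_circle_nm(lat1, lon1, lat2, lon2):
    """Hypothetical component: great-circle distance in nautical miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    central_angle = math.acos(
        min(1.0, math.sin(lat1) * math.sin(lat2)
            + math.cos(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    )
    return 3440.065 * central_angle  # Earth's mean radius in nautical miles


# Recorded "golden" results from a previously accepted build; each new
# increment is required to reproduce them within tolerance.
GOLDEN_CASES = [
    ((0.0, 0.0, 0.0, 1.0), 60.0),         # one degree of longitude at the equator
    ((36.6, -121.9, 36.6, -121.9), 0.0),  # zero distance to the same point
]


class GreatCircleRegressionTest(unittest.TestCase):
    def test_golden_cases_still_hold(self):
        # Rerun every recorded case against the current build.
        for args, expected in GOLDEN_CASES:
            with self.subTest(args=args):
                self.assertAlmostEqual(great_circle_nm(*args), expected, places=1)


if __name__ == "__main__":
    unittest.main()
```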

Another testing-related action plan was also well received, ranking 8th of the 15 action plans. This plan recommended reducing certification costs by requiring integrated validation and verification processes to involve automated testing, including assurance cases; test plans, procedures, reports, and scripts; and test data, tools, environments, and labs. The goal is to replace time-consuming manual testing methods with formalized automated testing across the lifecycle by defining, delivering, and maintaining testing work products alongside the components they exercise, enabling effective, efficient, and repeatable testing during component development, system integration, sustainment, and recompetition.

Ironically, the action plans that focused on cost containment alone were ranked lower by the participants (10th, 12th, and 14th out of the total 15 action plans). Based on an analysis of comments, it didn't appear that their low ranking stemmed from a lack of interest in controlling costs, but rather from the realization that without effective automation of testing and retesting, the benefits of cost savings and efficiencies from OSA initiatives may be compromised by inadequate quality.

A Way Forward for Automated Testing in Open Systems Architecture Initiatives

After carefully analyzing the results of the two MMOWGLI war games, the Navy recommended that a subsequent study area in the Open Systems Architecture and Business Innovation initiatives focus on affordable testing. The goal is to establish quality assurance processes that are efficient and comprehensive during the design phase and initial delivery, as well as throughout the sustainment and recompetition phases. It's also critical that these automated tests be delivered with the system and include sufficient documentation so that government or sustainment contractors can both execute and maintain them.

Of course, it's also important to recognize the limitations of automated testing. There is a significant up-front cost in automating tests, and the resulting testing work products must themselves be engineered to a high level of quality. Moreover, automation may not be appropriate--or even feasible--for every type of testing, such as usability testing or penetration testing performed by tiger teams.

The take-home point from our experience with both Business Innovation Initiative MMOWGLI games is that by combining effective testing with other action plans, the benefits of cost savings and efficiencies from Open Systems Architecture initiatives may be achieved without compromising the quality of the results. We don't just want competition; we don't just want lower cost. Instead, we need to use competition to get the same or better quality at a cost we can afford.

Our next step is to help the Department of the Navy formulate a comprehensive whole lifecycle testing and quality assurance approach--together with a path towards standardization of automated testing and retesting methods and tools--to assist with lowering the cost and time to field innovative and robust capability. Our goal at the SEI is also to help promote the visibility and strategic importance of testing across government and industry to enhance the quality of delivered components and integrated products, as well as to spur innovation in quality assurance methods and tools by the defense industrial base and commercial information technology companies.

Additional Resources

Unfortunately, we see both program offices and defense contractors making the same mistakes repeatedly when it comes to testing. My colleague, Donald Firesmith, has collected and analyzed 92 commonly occurring testing pitfalls, organized them into 14 categories, and published them in the recent book Common System and Software Testing Pitfalls.
