
A Playbook for Early Architecture Analysis

Presentation
This talk describes a playbook for analyzing early architecture artifacts to determine how well an architecture supports the development-time quality attributes that affect long-term system success.
Publisher

Software Engineering Institute

Abstract

This talk was presented at the 2020 Virtual Systems and Mission Engineering Conference. For more information, see our collection of technical reports describing how to analyze a software architecture with respect to quality attribute requirements.

While the software engineering community has accepted the importance of architecture to long-term system success, performing early architecture analysis remains challenging for many organizations. Quality attributes that focus on run-time properties such as latency or availability can be assessed at different points during the acquisition lifecycle. Rich modeling notations and formalisms exist for these attributes, and the resulting models can capture the key architectural decisions that have been made. Deployed systems can then be instrumented and tested to confirm that the intended properties have actually been delivered. But quality attributes that focus on development-time properties such as integrability or maintainability are harder to confirm on delivery and are typically not as well supported by formal architectural models. Fortunately, formal models are not the only option available to analysts.

In this talk, we describe a "playbook" that we created to guide analysts in assessing early architecture artifacts to determine a software architecture's support for development-time quality attributes. The motivation for this playbook emerged from our work with the Army's Combat Capabilities Development Command (CCDC) Aviation and Missile Center (AvMC) Aviation Development Directorate (ADD) on its AVE (Architecture Verification Environment) program. The major goal of the AVE program is to make architecture requirements more enforceable. Because architecture analysis is complicated, we focus on providing artifacts and techniques that support analysts and make the analysis process more rigorous and repeatable in the absence of detailed, formal models.

The playbook provides a process of seven steps divided into three phases. The Preparation phase gathers the artifacts needed to perform the analysis and evaluation. The Orientation phase uses the information in those artifacts to understand the architecture approaches selected to satisfy the quality attribute requirement. The process ends with the Evaluation phase, wherein analysts apply their understanding of the requirement and of the architecture solution approaches to make judgments about those approaches. In this talk we illustrate the use of the playbook with a running example focusing on the quality attribute of integrability.

The process can be applied at almost any point in the development lifecycle. The breadth, depth, and completeness of the architecture artifacts determine the type of analysis and evaluation performed and the degree of confidence in the results. Early in the development lifecycle, lower confidence may be acceptable; analysts can work with incomplete or lower-quality artifacts and provide simpler analyses until the artifacts reach the desired quality. Later in the lifecycle, analysts typically need higher confidence, which in turn requires higher-quality artifacts and deeper analyses.

The steps of the playbook are:

  • Step 1: Collect artifacts
  • Step 2: Identify the mechanisms used to satisfy the requirement
  • Step 3: Locate the mechanisms in the architecture
  • Step 4: Identify derived decisions and special cases
  • Step 5: Assess requirement satisfaction
  • Step 6: Assess impact on other quality attribute requirements
  • Step 7: Assess the cost/benefit of the architecture approaches

For each step, we describe its inputs and outputs as well as the system-specific and system-independent artifacts that we rely upon to carry out the step in a disciplined and repeatable fashion. System-specific artifacts include requirements and various forms of documentation. System-independent artifacts come from the architecture body of knowledge and include scenarios that make quality attribute requirements concrete, catalogs of tactics and patterns commonly used to support quality attributes, and quality-attribute-specific checklists and models that can be used for analysis. The playbook is not intended to completely remove the need for education, training, discipline, and thoughtfulness on the part of the analyst; rather, it is intended to guide analysts and provide them with a structure for reasoning.
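To make this structure concrete, the Python sketch below models the phases, steps, and artifact categories as plain data that an analyst could use as a checklist. Note that the grouping of steps into phases and the artifact names shown here are our own illustrative assumptions for this sketch, not definitions taken from the playbook itself.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Step:
    """One playbook step, with the artifacts an analyst consults and produces."""
    number: int
    name: str
    inputs: List[str] = field(default_factory=list)       # what the analyst starts from
    outputs: List[str] = field(default_factory=list)      # what the step produces
    system_specific: List[str] = field(default_factory=list)    # e.g., requirements, design docs
    system_independent: List[str] = field(default_factory=list) # e.g., tactic catalogs, checklists


@dataclass
class Phase:
    name: str
    steps: List[Step]


# Hypothetical instantiation for an integrability analysis; artifact names are illustrative only.
playbook = [
    Phase("Preparation", [
        Step(1, "Collect artifacts",
             inputs=["quality attribute scenarios", "architecture documentation"],
             outputs=["artifact inventory"],
             system_specific=["requirements", "interface specifications"],
             system_independent=["documentation checklists"]),
    ]),
    Phase("Orientation", [
        Step(2, "Identify the mechanisms used to satisfy the requirement",
             system_independent=["tactic and pattern catalogs"]),
        Step(3, "Locate the mechanisms in the architecture"),
        Step(4, "Identify derived decisions and special cases"),
    ]),
    Phase("Evaluation", [
        Step(5, "Assess requirement satisfaction",
             system_independent=["integrability checklists and models"]),
        Step(6, "Assess impact on other quality attribute requirements"),
        Step(7, "Assess the cost/benefit of the architecture approaches"),
    ]),
]

# A simple traversal an analyst might use to track progress through the playbook.
for phase in playbook:
    for step in phase.steps:
        print(f"{phase.name}: Step {step.number} - {step.name}")
```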