Agile Metrics: Seven Categories
More and more, suppliers of software-reliant Department of Defense (DoD) systems are moving away from traditional waterfall development practices in favor of agile methods. As described in previous posts on this blog, agile methods are effective for shortening delivery cycles and managing costs. If the benefits of agile are to be realized effectively for the DoD, however, personnel responsible for overseeing software acquisitions must be fluent in metrics used to monitor these programs. This blog post highlights the results of an effort by researchers at the Carnegie Mellon University Software Engineering Institute to create a reference for personnel who oversee software development acquisition for major systems built by developers applying agile methods. This post also presents seven categories for tracking agile metrics.
An Empirical Approach to Software
Increasingly, the DoD and federal agencies procure software-intensive systems instead of building them with internal resources. However, acquisition programs frequently have difficulty meeting aggressive cost, schedule, and technical objectives. The research reported in this blog is part of our ongoing work to help acquisition professionals in the DoD adopt and use agile software development methods more effectively to overcome these challenges. Our latest research focuses on progress measurement and contractors, which extends our previous research examining selected DoD management and acquisition concerns, including information assurance.
Many program offices in government and military organizations that we support have found that their providers or contractors are adopting agile methods. These providers are delivering software products more rapidly and in smaller increments than is customary in these environments. Program offices often struggle with these techniques, however, because they lack experience with the metrics required to gain insight into progress. For traditional development, by contrast, there is an elaborate infrastructure and a fairly well-understood set of definitions for the measures used to monitor programs.
When we examined what is written, taught, and discussed about agile, we found that the focus tends to be on a team of seven plus or minus two individuals working together in a self-directed setting. These teams maintain a keen focus on value to the user and employ an empirical approach, but most of the discussion centers on the small team.
It is important to note that in Scrum, decisions are made based on observation and experimentation rather than on detailed upfront planning. Empirical process control relies on three main ideas: transparency, inspection, and adaptation.
Use of agile methods in the context of major DoD programs is not unprecedented. Until recently, however, publications and training courses have focused too narrowly on the development team. Our findings show that organizations applying agile methods are doing a good job bridging the gap between small-team-focused measurement and what is needed at an enterprise or program level. One of the challenges to overcome, then, is meeting the needs of large-scale program management without violating the environment necessary for a self-directed team to succeed using agile methods.
In many medium-to-large organizations, measurement is often conducted at the request of another individual; it is typically an obligation imposed on one party to benefit another. Far too often, people asked by team leaders and project managers to provide metrics become defensive. This dynamic may limit the value of agile methods, which are intended to serve as the basis for an empirical approach to software.
This empirical approach involves enacting W. Edwards Deming's plan-do-check-act cycle, but at a much more immediate and individually focused level. It involves more frequent conversations among developers, and those conversations are tightly focused on the product itself.
When viewed in the context of the Agile Manifesto and its 12 principles, the best way to demonstrate progress is to demonstrate capability: an actual working product in lieu of an abstraction on paper. Obviously, the product could not have been built without the abstractions, but in the context of agile methods, the focus is on demonstrable results and data collected by the team for its own use.
For agile software development, one of the most important metrics is delivered business value. These progress measures, while observation-based, do not violate the team spirit. Our primary goal with this work was to help program managers measure progress more effectively. At the same time, we want teams to work in their own environment and use metrics specific to the team, while differentiating from metrics that are used at the program level.
The technical report that we published on this topic--Agile Metrics: Progress Monitoring of Agile Contractors, which I co-authored with Suzanne Miller, Mary Ann Lapham, Eileen Wrubel, and Timothy A. Chick--details three key views of agile team metrics that are typical of most implementations of agile methods:
- Velocity. Simply stated, velocity is the volume of work accomplished by a given team in a specified period of time. Typically, it is measured as story points accomplished per sprint. Agile practitioners sometimes call this measure "yesterday's weather," as if to indicate its sensitivity to local conditions as well as seasonal trends. Indeed, most experts explain that velocity is team-unique, and treating this measure as a parameter in a general estimating model is a mistake. Each team must establish its own velocity for the work at hand.
- Sprint Burn-Down Chart. As detailed in our technical note, this graphical technique provides a means for displaying progress for the development team during a sprint. As items in the backlog of work are completed, the chart displays the rate and amount of progress. This chart is typically provided for viewing on a team's common wall, or electronic dashboard.
- Release Burn-Up Chart. A complement to the sprint burn-down chart, the release burn-up chart is also commonly used. Many cling to the convention that sprints burn down and releases burn up--though no mathematical principle governs this choice. With each completed sprint, the delivered functionality grows, and the release burn-up chart depicts this progress in an intuitively logical fashion. Workflow management tools commonly automate the production of these charts.
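As a rough illustration of the three team-level views above, the sketch below computes velocity, a release burn-up series, and a sprint burn-down series from per-sprint story-point data. All figures (sprint counts, daily completions) are hypothetical examples, not data from the report.

```python
# Hypothetical per-sprint data: story points completed in each finished sprint.
completed_points = [21, 18, 24, 20]

# Velocity: average story points completed per sprint for this team.
velocity = sum(completed_points) / len(completed_points)

# Release burn-up: cumulative functionality delivered after each sprint.
burn_up = []
total = 0
for points in completed_points:
    total += points
    burn_up.append(total)

# Sprint burn-down: points remaining at the end of each day of one sprint,
# given a hypothetical log of points finished per day.
sprint_commitment = 20
daily_completed = [0, 3, 2, 5, 0, 4, 6]  # one entry per sprint day
remaining = []
left = sprint_commitment
for done in daily_completed:
    left -= done
    remaining.append(left)

print(velocity)   # 20.75
print(burn_up)    # [21, 39, 63, 83]
print(remaining)  # [20, 17, 15, 10, 10, 6, 0]
```

The point of the sketch is that all three views derive from the same raw observations the team already records, which is why they impose little extra measurement burden.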
Our research involved interviewing professionals in the field who manage agile contracts and who have successfully worked with agile suppliers in DoD acquisitions.
Based on our interviews with personnel who manage agile contracts, our technical note identified seven successful ways to monitor progress that help programs account for the regulatory requirements that are common in the DoD:
- Software Size is typically represented in story points when agile methods are used. This approach is supported by decomposing functionality, from a user's perspective, into user stories. By tracing these user stories to system capabilities and functions, a hierarchy within the work can be meaningfully communicated, and progress monitoring based on delivered functionality will focus on utility and function rather than proxies like lines of code or function points.
- Effort and Staffing must be tracked because they tend to be the primary cost drivers in knowledge-intensive work. Use of agile methods will not change this fundamental fact, nor will it be necessary to make major changes to the mechanisms used to monitor progress. What does change, however, is the expected pattern of staff utilization. With the steady cadence of an integrated development team, the ebb and flow of labor in specialized staff categories is less prevalent when using agile methods. In general, agile teams are expected to have the full complement of needed skills within the development team--though some specialized skills may be provided by part-time members of the team. Rules of thumb applied in monitoring this element of performance on a contract must be revised. The expectation of a slow ramp-up in staffing during the early phases of a development effort may be problematic, and plans for declining use of development staff during the last half of the program (when testing activities traditionally take over) must be recalibrated. Organizations may establish test teams to perform system testing or regression testing outside the context of the development team.
- Schedule is traditionally viewed as a consequence of the pace of work performed. In agile development, the intent is to fix this variable, and work to maximize performance of the development team within well-defined time boxes. This places important requirements on stakeholders who must communicate the requirements and participate in prioritization of the work to be performed.
- Quality and Customer Satisfaction is an area where agile methods provide greater opportunity for insight than traditional development approaches tend to allow. The focus on frequent delivery of working software engages the customer in looking at the product itself, rather than the intermediate work products like requirements specifications and design documents. A strong focus on verification criteria (frequently called "definition of done") sharpens the understanding of needed functionality, and attributes of the product that are important to the customer.
- Cost and Funding structures can be tailored to leverage the iterative nature of agile methods. Using optional contract funding lines or indefinite delivery indefinite quantity (IDIQ) contract structures can add flexibility in planning and managing the work of the development organization. A more detailed discussion of the considerations for contracting structures to handle this is the subject of an upcoming publication.
- Requirements are often expressed very differently in the context of agile development--in contrast to traditional large-scale waterfall development approaches. A detailed and complete requirements specification document (as defined in DoD parlance) is not typically viewed as a prerequisite to the start of development activities when agile methods are employed. However, the flexibility to clarify, elaborate and re-prioritize requirements, represented as user stories, may prove advantageous for many large programs. The cost of changing requirements is often seen in ripple effects across the series of intermediate work products that must be maintained in traditional approaches. The fast-paced incremental approach that typifies agile development can help reduce the level of rework.
- Delivery and Progress monitoring is the area where perhaps the greatest difference is seen in agile development, compared to traditional approaches. The frequent delivery of working (potentially shippable) software products renders a more direct view of progress than is typically apparent through examination of intermediate work products. Demonstrations of system capabilities allow early opportunities to refine the final product, and to assure that the development team is moving toward the desired technical performance--not just to ask whether they will complete on schedule and within budget.
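To make the software-size and delivery-monitoring ideas above concrete, here is a minimal sketch of rolling completed story points up from user stories to the system capabilities they trace to, so progress can be reported in terms of delivered functionality. The capability names, story IDs, and point values are hypothetical illustrations, not examples from the technical note.

```python
# Hypothetical traceability: each user story maps to a system capability,
# carries a story-point size, and has a done/not-done status.
stories = [
    {"story": "US-101", "capability": "Mission Planning", "points": 5, "done": True},
    {"story": "US-102", "capability": "Mission Planning", "points": 8, "done": False},
    {"story": "US-201", "capability": "Reporting",        "points": 3, "done": True},
    {"story": "US-202", "capability": "Reporting",        "points": 5, "done": True},
]

def progress_by_capability(stories):
    """Return {capability: (completed points, total points)}."""
    rollup = {}
    for s in stories:
        done, total = rollup.get(s["capability"], (0, 0))
        rollup[s["capability"]] = (
            done + (s["points"] if s["done"] else 0),
            total + s["points"],
        )
    return rollup

for cap, (done, total) in progress_by_capability(stories).items():
    print(f"{cap}: {done}/{total} story points delivered")
```

A rollup like this lets a program office ask "how much of the Mission Planning capability has been delivered?" rather than counting lines of code, which is the shift in monitoring emphasis the seven categories describe.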
We continue to learn new and inventive ways of demonstrating progress and diagnosing performance from agile implementers. The value of this approach is that it represents a narrative driven by real-world experience.
Through this research, we have also observed that agile teams don't want to wait for data analysis. Future research needs to focus on earlier analysis of data streams. The graphical representations needed to analyze these data streams are not the traditional ones we have seen. The analysis techniques that need to be applied, as well as the available baselines, take on a different form in this context: more near-term, more immediate feedback and the intelligent use of historical baselines.
Acquisition researchers in the SEI's Software Solutions Division have published several technical notes that address different topics of interest to acquisition professionals who are contemplating or currently using agile methods as part of their acquisition:
- Agile Metrics: Progress Monitoring of Agile Contractors
- Considerations for Using Agile in DoD Acquisition
- Agile Methods: Selected DoD Management and Acquisition Concerns
- A Closer Look at 804: A Summary of Considerations for DoD Program Managers
- DoD Information Assurance and Agile: Challenges and Recommendations Gathered Through Interviews with Agile Program Managers and DoD Accreditation Reviewers
- Parallel Worlds: Agile and Waterfall Differences and Similarities