Is Your Organization Ready for Agile? - Part 4
This blog post is the fourth in a series on Agile adoption in regulated settings, such as the Department of Defense, Internal Revenue Service, and Food and Drug Administration.
Government agencies, including the Departments of Defense, Veterans Affairs, and Treasury, are being asked by their program offices to adopt Agile methods. These organizations have traditionally used a "waterfall" life-cycle model (as epitomized by the engineering "V" charts). Programming teams in these organizations are accustomed to being managed via a series of document-centric technical reviews that focus on the evolution of the artifacts describing the requirements and design of the system, rather than on its evolving implementation, as is more common with Agile methods. Because of these differences, many struggle to adopt Agile practices. For example, acquisition staff often wonder how to fit Agile measurement practices into their progress-tracking systems.
They also find it hard to prepare for technical reviews that must account for both implementation artifacts and the availability of requirements/design artifacts. My ongoing series on the Readiness & Fit Analysis (RFA) approach focuses on helping organizations understand the risks involved when contemplating or embarking on the adoption of new practices, in this case Agile methods. This posting explores project and customer environment, one of many challenging factors to assess when considering Agile adoption readiness.
A Framework for Determining Agile Readiness
The method for using RFA, along with the profile that supports CMMI for Development adoption, is described in Chapter 12 of CMMI Survival Guide: Just Enough Process Improvement. Adopting new practices like those found in SEI's CMMI models involves adoption risk, as does the adoption of many other technologies. I first used RFA in the 1990s to identify adoption risks for software process tools. Since that time, I have used RFA to profile various technologies, including CMMI.
For the last several years, the SEI has researched adoption of Agile methods in U.S. DoD government settings. SEI researchers have adapted the RFA profiling technique to accommodate typical factors related to adopting Agile methods for any setting. We have also focused on other factors more uniquely associated with adopting Agile methods in the highly regulated DoD government acquisition environment.
This blog is the fourth in a series discussing RFA and how the SEI's work extends it to support profiling and adoption risk identification for DoD organizations that are considering or are in the middle of adopting Agile methods. This series of blog posts characterizes six categories to profile for readiness and fit:
- business and acquisition (discussed in the first post)
- organizational climate (discussed in the second post and continued in the third post)
- project and customer environment (discussed in this post)
- practices (discussed in the fifth post)
- system attributes (discussed in the sixth post)
- technology environment (discussed in the seventh post)
The categories and factors continue to evolve as we pilot the analysis in client settings, but these six listed above are the ones we're currently using.
Applying the Readiness & Fit Analysis
Each category of readiness and fit has a set of attributes that can be characterized by a statement representing your expectation if you were observing a successful Agile project or organization operating in relation to that attribute. For example, an attribute from the project and customer environment category is stated as follows:
The rhythm of review and oversight activities is compatible with the Agile/lean methods that are planned for or are already in use.
Comments: Milestone technical reviews are an aspect of acquisition that can either support or hinder an Agile project. Aligning the rhythm of reviews in the larger program with the iterative evolution of products typical of Agile methods reduces the risk that review and other oversight activities become counterproductive. If you were evaluating your organization's "fit" with this attribute, you would compare the timing and entry criteria for the technical and management reviews in your current environment with the rhythm of reviews (typically iteration demonstrations and release demonstrations) common in Agile settings.
Certainty regarding readiness for Agile adoption increases from early use of the RFA method (before initial pilots) to later use (after two or three releases using the newly adopted Agile method). Certainty also changes with respect to the importance of a specific factor to organizational success.
At the beginning of an Agile adoption project, organizations are often uncertain about their current state in terms of adoption factors or the importance of individual factors (such as alignment of oversight practices with our Agile practices) to organizational adoption success. Later in the adoption process, performing an RFA highlights adoption risk areas that were overlooked during early phases of adoption. The RFA also identifies areas where we now have more data to help guide the organization in developing adoption risk mitigation strategies.
For example, we may not initially understand that the approach to cost estimation in a larger organization doesn't easily accommodate certain Agile practices, such as relative estimation. After one or two pilots, however, we are more likely to understand the impact of relative estimation on our results and develop strategies to help connect the Agile estimation practices to those of the larger program. This may no longer be an area of adoption risk and we can move on to dealing with other issues. The key point here is to be prepared to apply RFA principles and techniques at multiple points in your adoption journey.
The remainder of this post is dedicated to the project and customer environment category, which deals with interactions among development team members and interactions between the development team and its customer, users, and management.
Project and Customer Environment
This category covers characteristics and aspects related to project and customer environments, many of which relate specifically to the 12 Agile principles in the Agile Manifesto. These principles are the cornerstone and building blocks for Agile. If some of these building blocks are not present or are weak, then successful adoption and institutionalization of Agile development may take longer to accomplish. A missing building block could even cause Agile efforts to deliver fewer benefits than expected, or, at worst, the adoption could fail.
The following list pairs a tag (a short title that summarizes the statement) with a statement describing a condition or behavior you would expect to find in an organization successfully using engineering and management methods consistent with Agile principles. Here are the factors within the project and customer environment category that we consider when performing an RFA:
Appropriately trained staff. All members of teams performing Agile/lean methods or using work products of Agile/lean methods are appropriately trained or experienced.
The fifth Agile principle states, "Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done." Techniques used within Agile methods differ from those used on traditional projects. Teams need to be trained in the specific Agile method they will be using to reap the maximum benefit.
Co-located teams. Teams performing Agile/lean methods are co-located (physically or virtually).
The sixth Agile principle states, "The most efficient and effective method of conveying information to and within a development team is face-to-face conversation." Distributed teams, which are common today, can also function successfully with modern communications tools. While face-to-face interaction is preferred and should be incorporated into team plans at regular intervals, tools such as video teleconferencing and instant messaging can augment periodic face-to-face interactions.
Teams that can be physically co-located, however, will likely need to transform portions of their facilities into team rooms with no walls or cubicles, to allow space for the team to work together.
Agile/lean competent staff. Teams performing Agile/lean methods possess the competencies (skills, knowledge, process abilities) needed to perform their roles.
Not only do Agile teams (or any team for that matter) need training to perform their jobs, they also need skills, knowledge, and abilities that are often different from those used in the traditional software development environment. Those performing the RFA must assess whether the team has them. If the team is in transition, there will be a learning curve that will impact the project until the team learns new skills. It's also important to note that the entire team--not just developers--must have some understanding of Agile.
Rhythm of oversight compatible with Agile. The rhythm of review and oversight activities is compatible with the Agile/lean methods that are planned for or are already in use.
Within the DoD, most programs follow the typical acquisition lifecycle, which includes major milestones and major periodic reviews. This practice runs counter to the normal rhythm of Agile development: short iterations (2-to-4 weeks), test-driven development, and continuous integration. The difference is more than schedule, however: while it is true that an Agile lifecycle uses the same building blocks as traditional lifecycles (analyze, design, build, test, and deploy), it exercises all of those blocks within every iteration, whereas traditional lifecycles handle each block in isolation. Thus, when a traditional program is ready for a preliminary design review, an Agile program will already have working code in place. On the other hand, it may not have the same level of detail in its requirements as the traditional program that has not been focused on producing working software.
Review goals aligned with Agile. Oversight review goals and activities are aligned with the Agile or lean products and processes in use.
Traditional programs include major periodic reviews, and they use documents, among other purposes, to accomplish oversight. Agile is not devoid of documentation. The first Agile principle states, "Our highest priority is to satisfy the customer through early and continuous delivery of valuable software." Agile thus prioritizes continuous delivery of valuable code while emphasizing just-enough documentation, and the primary measure of progress is working software (the seventh Agile principle). These two styles are at odds and will cause issues if traditional reviews are not appropriately tailored for Agile programs.
Requirements incompleteness acknowledged. Program requirements management processes allow for the reality of incomplete requirements throughout product evolution.
The second Agile principle states, "Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage" (in DoD settings, the customer may be the operational staff and not the acquisition customer). Traditional DoD programs determine all requirements up front, lock them down, and impose heavy processes (and often high costs) on late changes. DoD programs will need to adopt a more flexible view of requirements gathering to reap the full benefits of Agile. SEI researchers are at work on a technical note that deals explicitly with Agile and requirements.
Positive perception of Agile by team. Performers of Agile/lean methods and users of their work products have a positive perception of the methods they are using/going to use.
If the team performing the work does not view the process in a positive light, then their efforts will not be optimal. All stakeholders are members of the "team" for this purpose. This is true for any set of practices that an organization adopts. Positive perception of the practices being adopted is even more important for practices like Agile or lean that are not mainstream in the DoD setting, because you are likely to need "top cover" from management to be able to perform Agile or lean methods in an effective manner.
Appropriate use of cost-size factors. Program size and cost are considered factors to collect data about rather than to create a "desired state" statement from.
Agile practitioners use terms like story points, velocity, burndown charts, and burnup charts when they discuss cost, size, and remaining work. These terms are not readily translated to the more traditional views expressed as earned value. Agile estimation for cost and size uses a relative approach versus the absolute approach used by traditional projects. The differences between the two approaches need to be understood and accommodated when assessing program status. The SEI is working on a technical note addressing progress measurement in Agile programs in general, including discussion of earned value. In 2011, the SEI published the technical note Agile Methods: Selected DoD Management and Acquisition Concerns, which also addresses estimation issues in Agile settings in DoD.
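To make the relative-estimation idea concrete, here is a minimal sketch, using entirely hypothetical numbers, of how a team might compute velocity from completed story points and forecast the iterations remaining for a backlog. Story points are relative sizes assigned by the team, not hours, which is why they don't translate directly into earned-value figures:

```python
import math

# Hypothetical story points completed in the last three iterations.
completed_points = [21, 18, 24]

# Velocity: the team's average rate of progress, in points per iteration.
velocity = sum(completed_points) / len(completed_points)

# Relative size of the remaining backlog (also hypothetical).
backlog_points = 130

# Forecast: how many more iterations to burn down the backlog,
# rounded up because partial iterations still cost a full iteration.
iterations_left = math.ceil(backlog_points / velocity)

print(f"velocity: {velocity:.1f} points/iteration")
print(f"forecast: ~{iterations_left} iterations remaining")
```

Because the points are relative, the forecast is only meaningful within one team's history; comparing velocities across teams, or mapping them directly onto absolute cost baselines, is exactly the mismatch with traditional earned-value reporting described above.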
Management as coaching function. Management is a coaching function (as opposed to traditional command-and-control) that helps to eliminate barriers to progress.
Agile managers take on a coaching function. In doing so, they facilitate, mentor, and champion their teams. The team is self-organizing, and its work during an individual iteration is not directed by the manager but rather coached and mentored. This role is sometimes foreign to managers steeped in the traditional command-and-control style of management. Self-organizing teams are at the heart of Agile, so managers who adapt to and adopt the role of coach are usually more successful in managing Agile projects.
High trust between management and teams. Teams are made up of task-mature individuals operating in high-trust groups.
As mentioned above, the fifth Agile principle states, "Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done." Agile promotes the creation of teams and trusts the individuals to complete the job. The managers coach the team, which is self-organized and has tools and competencies needed to accomplish the work. In many ways this trust environment is contrary to the culture of "trust but verify" one often finds in the DoD development environment. Where we have seen this environment of trust in DoD Agile settings, the project has typically been very successful.
Sustainable development pace. Management emphasizes a consistently sustainable pace of development.
The eighth Agile principle states, "Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely." For this to happen, management needs to encourage and promote this paradigm rather than waiting until integration to discover that a "death march" will be needed to integrate the software and complete its testing on time.
Many other factors influence readiness in the category of project and customer environment. The ones discussed in this posting, however, most closely reflect actual practices in the field. By paying attention to them when considering your readiness and fitness for Agile adoption, you can realize more successful pilots and implementations. Each category in the RFA offers insight into the risks that an organization will face when adopting Agile methods. Identifying these risks is an important first step in planning and executing mitigation strategies to address them.
In the next post in this series, I will delve into two other important, though shorter, RFA categories: technology environment and system attributes. These categories address factors related to management and technical tooling available for Agile development, as well as factors related to the character of the project being undertaken.
I would love to hear your experiences adopting Agile methods, especially those operating in regulated environments. Please leave a comment below or send an email to email@example.com.
Read all the posts in this series on the Readiness & Fit Analysis method.
Read a white paper about the Readiness & Fit Analysis method.
For more information on management and acquisition considerations in using Agile methods in DoD environments, please see our first two technical notes in the Agile Acquisition series:
By Suzanne Miller