
SATURN 2014 Growing Great Architects Session (notes)


Notes by Scott Shipp, edited by Tamara Marshall-Keim

Metrics for Simplifying and Standardizing Enterprise Architecture: An Experience Report for an Oil and Gas Organization
Alexis Ocampo (Ecopetrol), Jens Heidrich (Fraunhofer IESE), Constanza Lampasona (Fraunhofer IESE), and Victor Basili (University of Maryland, Fraunhofer CESE)

Some data about Ecopetrol S.A.:

  • Largest petroleum company in Colombia
  • One of the four largest Latin American oil and gas companies
  • Expected to produce 1 million barrels per day by 2015
  • Among the top 40 oil and gas companies worldwide
How can IT contribute? They have been working on answering this question for several years.

Their IT strategy is based on information. Goal: provide reliable and secure information in real time (within 24 hours). If they can provide reliable information at the time it is needed to make decisions, they will benefit the company.

Standardization and simplification of enterprise platforms is one of five components supporting this overall goal. Example of a well-known method: GQM+Strategies (GQM = Goal-Question-Metric). This approach connects business goals with IT/project goals. The presenters walked through how to use this method to produce a corresponding/supporting IT goal.

How did they approach this challenge? 1. Took a survey.

  • Which ISO 25010 quality characteristic is most important to each developer?
  • Type of software
  • Programming languages
  • Areas addressed
2. Held an onsite workshop. 3. Conducted the GQM.

...but the survey did not yield useful data, so they changed their approach. First, they informally applied the UMD approach and asked, “What do you think should not happen? What causes you the most problems?” Next, they used Quality Model Mapping from the problems identified to ISO 25010 quality attributes. Then, they used a tool to visualize coupling between enterprise architecture components. They needed to understand the difficulty, due to downstream effects, of making a change in one place and how it echoes through the system. Using the visualizations, they were able to see what most needed to be addressed. The visualizations used various metrics, such as provided interfaces not used by other applications and provided interfaces consumed by other applications (a small sketch of one such metric appears below). They also viewed the complexity of change using metrics within applications: cohesion, cyclomatic complexity, etc. This helped them understand the reusability of various pieces within the enterprise architecture.

During this process, they also found some application redundancy and responded by identifying necessary platform migrations. They created visualizations of the platform landscape in 2012 versus the desired platform landscape for 2017. They began tracking metrics like "decreasing interfaces" and moved from 100% to 92% between 2012 and 2013; they should hit 74% in 2015. Other metrics like these are how they are tracking their progress.
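To give a flavor of these metrics, here is a minimal, hypothetical sketch in the GQM spirit (not from the presentation): the goal is simplifying the platform, the question is which provided interfaces other applications actually consume, and the metric is the share of provided interfaces that nothing else uses. The application and interface names below are invented.

    # Hypothetical GQM-style metric: the share of provided interfaces that no
    # other application consumes. Application and interface names are invented.
    provided = {
        "billing":   {"invoice-api", "tariff-api"},
        "wells":     {"sensor-feed", "maintenance-api"},
        "reporting": {"report-api"},
    }
    consumed = {
        "billing":   {"sensor-feed"},                  # billing reads the wells feed
        "reporting": {"invoice-api", "sensor-feed"},
    }

    all_provided = set().union(*provided.values())
    all_consumed = set().union(*consumed.values())
    unused = all_provided - all_consumed

    print(f"Provided interfaces: {len(all_provided)}")
    print(f"Unused by any other application: {sorted(unused)}")
    print(f"Unused share: {len(unused) / len(all_provided):.0%}")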
Combining Architectural Methods to Build a Reference Architecture for Ground Radar Monitoring Systems
Alejandro Bianchi (LIVEWARE IS S.A.), J. Andres Diaz-Pace (UNICEN University), Leonardo Seminara (LIVEWARE IS S.A.), and Gustavo De Souza (INVAP S.E.)

Context for the work:
  • The company builds ground radar monitoring (GRM) systems.
  • Standard communication protocol: ASTERIX, over which the system periodically receives data from the radar.
  • Main purpose is telemetry: analyze telemetry data and trigger alarms (a minimal sketch follows this list).
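To make the telemetry-and-alarms purpose concrete, here is a minimal, hypothetical sketch (not from the presentation) of a monitor that checks periodic radar telemetry readings against thresholds and raises alarms. The field names and limits are invented; a real GRM system would decode ASTERIX messages rather than reading plain dictionaries.

    # Minimal, hypothetical telemetry monitor. Field names and limits are
    # invented; a real GRM system would decode ASTERIX messages instead of
    # reading plain dictionaries.
    THRESHOLDS = {
        "transmitter_temp_c": 85.0,   # alarm above this temperature
        "rotation_rpm_error": 0.5,    # alarm above this rotation-speed error
    }

    def check_telemetry(reading: dict) -> list:
        """Return alarm messages for a single periodic telemetry reading."""
        alarms = []
        for field, limit in THRESHOLDS.items():
            value = reading.get(field)
            if value is not None and value > limit:
                alarms.append(f"ALARM: {field}={value} exceeds limit {limit}")
        return alarms

    # Example: one reading received from the radar.
    print(check_telemetry({"transmitter_temp_c": 91.2, "rotation_rpm_error": 0.1}))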
The operator trusts the software as if it were a mirror of the hardware (the radar). Desired qualities: fidelity, performance, and "diagnosticability" of problems. The organization was a multi-stakeholder landscape: physicists, engineers of various kinds, and a few software engineers. An existing GRM system (brownfield) was already in operation. It had been developed by a contractor whose relationship with the organization had ended, and it had little design documentation. The starting point:
  • The as-is system C&C view was provided in a slide.
  • High maintenance overhead due to business rules that were all written in JavaScript.
  • The solution didn't scale well. The problem went beyond design. They needed a more general solution.
Why an architecture-centric solution?
  • Needed a software family.
  • Wanted to speed up development cycles without sacrificing quality of the design.
  • Needed a modern tool to engage stakeholders.
  • Had a limited number of overwhelmed software developers.
  • Architectural principles provided leverage for product evolution.
Proposed a reference architecture. Key highlights: systematic reuse of domain knowledge, an informative role (knowledge sharing) plus some design prescriptions, and reusable assets.

Technical Approach

1. Identify business goals and quality attributes.
  • QAW (Quality Attribute Workshop)
2. Create the reference architecture.
  • Mine assets from existing application code.
  • ADD (Attribute-Driven Design)
  • Views & Beyond
3. Evaluate the architecture on site.
  • ATAM-oriented (Architecture Tradeoff Analysis Method)
  • Provide technology guidelines aligned with QA drivers.
Tailored the methods as needed. Teams, processes, etc.
  • 3-person architecture team on their side
  • Small team on customer side
  • Used an iterative and incremental strategy that took about 5 months (a requirement from the customer): 3-week iterations with status-sync meetings every 2 weeks to plan, measure, and report "design work"
  • Also doing some prototyping of the new system
Details on the three steps of the technical approach

1. Analysis of quality attributes
  • Identified business goals. Created scenarios.
  • Identified QAs based on business goals: fidelity, performance, usability, "diagnosticability" (a sample scenario sketch follows this list).
  • Created architectural plan with an extension: Identified risks of the architectural plan. Performed risk analysis. Compared with the existing system.
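The presentation did not show scenario details, but the standard SEI six-part scenario form gives a feel for what "created scenarios" means in practice. Below is a small, hypothetical sketch; the concrete values are invented, not taken from the talk.

    from dataclasses import dataclass

    @dataclass
    class QualityAttributeScenario:
        """Six-part quality-attribute scenario in the usual SEI form."""
        quality_attribute: str
        source: str            # who or what generates the stimulus
        stimulus: str          # the condition the system must respond to
        artifact: str          # the part of the system being stimulated
        environment: str       # conditions under which the stimulus arrives
        response: str          # what the system should do
        response_measure: str  # how the response is judged

    # Hypothetical performance scenario for a ground radar monitoring system;
    # the values are illustrative only.
    performance_scenario = QualityAttributeScenario(
        quality_attribute="performance",
        source="radar sensor",
        stimulus="a burst of telemetry messages arrives",
        artifact="telemetry processing pipeline",
        environment="normal operation",
        response="messages are parsed and alarm conditions evaluated",
        response_measure="end-to-end latency under one second per message",
    )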
2. Design of the architecture.
  • ADD iterations
  • 4 scenarios per 3-week iteration
  • Addressed high-priority scenarios first
  • Used scenarios to conduct reviews with the client, identify emerging flaws in the design, and show progress. This provided great ongoing feedback to address flaws before they became problems.
  • Held a review/feedback meeting with the technical liaison after each iteration.
  • Used C&C views and sequence diagrams to drive the ADD process.
3. Architecture evaluation.
  • Had all customer stakeholders present, including some new ones.
  • Walked through each key scenario, but did not follow the standard ATAM method. Did use an ATAM template.
  • Mapped drivers onto the new architecture. Provided a view of the various tradeoffs.
  • Helped facilitate stakeholders' reasoning about tradeoffs.
So … the new system-to-be …
  • had clearly identified QA drivers
  • which were also analyzed
  • provided main deployment options
  • was documented in a 40-page software architecture document (compared to the old system's 10-page document)
Also provided some direction for implementation.

Main lessons learned:
  • Achieved convergence of stakeholders through architecture
  • Received feedback from stakeholders that they should follow this approach on other systems as well!
  • Allowed them to see a long-term vision for the product, rather than only fixes for known, short-term problems.
  • Had little functional information, which was a bottleneck on progress.
  • The architecture reconstruction based on mining existing functionality did not go as planned.
Follow-up:
  • Importance of visual metaphors.
  • Need to create a measurement framework for the approach and its architecting activities.
Presentation link: http://resources.sei.cmu.edu/asset_files/Presentation/2014_017_101_89528.pdf

Teaching Architecture Metamodel-First
George Fairbanks, Google

Fairbanks is the author of the book Just Enough Software Architecture and holds a PhD from CMU. The presentation explains how he approaches teaching one 90-minute "course" to software engineers who may do architecture. “How many of you had to either formally teach architecture or informally mentor others to teach them architecture?” (Most hands went up.) Fairbanks has had that experience and has taught OOAD, design, and architecture since the '90s. This was a frustrating experience because architecture is still different, and harder: it is programming-in-the-large. This talk applies to two kinds of people:
  1. teachers
  2. newcomers to architecture
It's also a peek behind the curtain for newcomers. If the teacher has a hard time teaching, it is understandable that the student will have a hard time learning. This talk is organized by obstacles, with pedagogical strategies and learning points. Teaching architecture is hard: understatement of the year.

Obstacle: Low motivation for people to learn architecture. Strategy to overcome: Fail fast. Headline-writing story. Establish that people are going to do it wrong, so they are more motivated to learn. Failure 1: Show that their diagrams stink. Failure 2: They focus on the problem, not the diagram.

Obstacle: Abstract ideas. Some people seek abstractions, whereas others seek concrete expression. Strategy to overcome: Make the abstract tangible. Two ways: 1. Scope the course to focus on a concrete activity. 2. Make the course sequence heavy on exercises, with a concrete/abstract/concrete progression.

Obstacle: Big investment before big payoff.
  • Architecture ideas form a complex web.
  • Must internalize the ideas before applying them.
  • Generally takes years to master.
Strategy to overcome: Teach a small, common task. Diagramming, for example. Teach them not how to make all diagrams perfectly but how to make some diagrams better. Alternative: Teach from failure examples. Obstacle: Wrong details.   Novices mix abstraction levels and omit critical details. Strategy to overcome: Teach metamodel-first.
  • Just enough theory to succeed at the concrete task. For example, focus on the legend. The details provided there show what you could possibly learn from the diagram itself.
  • Tantalizing glimpse of full cognitive model.
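One way to picture "metamodel-first" is that every box and line a student draws should be an instance of a type named in the legend. The sketch below is a small, hypothetical illustration of such a minimal component-and-connector metamodel; it is not from the talk, and the names are invented.

    from dataclasses import dataclass, field

    # A tiny, hypothetical component-and-connector metamodel (not from the
    # talk): every box or line a student draws should instantiate one of these
    # types, and the diagram legend names exactly these types.

    @dataclass
    class Component:
        name: str
        responsibilities: list = field(default_factory=list)

    @dataclass
    class Connector:
        kind: str              # e.g., "HTTP request/response", "message queue"
        source: Component
        target: Component

    @dataclass
    class Diagram:
        title: str
        legend: str
        components: list = field(default_factory=list)
        connectors: list = field(default_factory=list)

    # Example instance for the library-system exercise mentioned in the talk.
    catalog = Component("Catalog Service", ["look up titles"])
    web_ui = Component("Web UI", ["render search pages"])
    search = Connector("HTTP request/response", source=web_ui, target=catalog)
    diagram = Diagram(
        title="Library system: runtime view",
        legend="boxes = components, arrows = HTTP request/response connectors",
        components=[web_ui, catalog],
        connectors=[search],
    )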
Obstacle: PITS vs. PITL (programming in the small vs. programming in the large). Strategy to overcome: Spend lots of time on exercises. General pedagogy: I do, we do, you do.

Example course outline:
  • Part 1: Student exercise (fail fast)
  • Part 2: Lecture
  • Part 3: Student exercise
See the presentation slides for the full outline. Fairbanks wants students to see the same material differently, as though they were watching American football with an NFL coach sitting next to them, and to compare what they see vs. what he sees. Example exercise described: building a diagram for a library system.

He discussed "sequencing lecture topics" for designing an online course for this material and showed a matrix plotting the importance of an item (y axis) against its conceptual complexity (x axis). He also discussed "lecture topics," including the topics for the first course and references in the book Just Enough Software Architecture.

Reflections:
  • Overall, it is effective at getting people hooked, in a short time, on building better software diagrams.
  • Positive feedback.
  • Always a problem with mission creep: fixing each problem identified requires putting more into the course.
  • Students backslide super fast. Two weeks later, their diagrams exhibit bad habits, such as elements that carry no semantic meaning.
  • But they use legends!
  • Students' summary of the course is poor ("use a legend").
Presentation link: http://resources.sei.cmu.edu/asset_files/Presentation/2014_017_101_89883.pdf

About the Author

Bill Pollak
