SEI Blog

Coordinated Vulnerability Disclosure, Ransomware, Scaling Agile, and Android App Analysis: The Latest Work from the SEI

By Douglas C. Schmidt
Principal Researcher

As part of an ongoing effort to keep you informed about our latest work, this blog post summarizes some recently published SEI reports, podcasts, and webinars highlighting our work in coordinated vulnerability disclosure, scaling Agile methods, automated testing in Agile environments, ransomware, and Android app analysis. These publications represent the latest work of SEI technologists in these areas. One SEI special report analyzes data related to DoD software projects and translates it into information that is frequently sought after across the DoD. This post includes a listing of each publication, its author(s), and links where the publications can be accessed on the SEI website.

The CERT Guide to Coordinated Vulnerability Disclosure
By Allen D. Householder, Garret Wassermann, Art Manion, Christopher King

Security vulnerabilities remain a problem for vendors and deployers of software-based systems alike. Vendors play a key role by providing fixes for vulnerabilities, but they have no monopoly on the ability to discover vulnerabilities in their products and services. Knowledge of those vulnerabilities can increase adversarial advantage if deployers are left without recourse to remediate the risks they pose. Coordinated vulnerability disclosure (CVD) is the process of gathering information from vulnerability finders, coordinating the sharing of that information between relevant stakeholders, and disclosing the existence of software vulnerabilities and their mitigations to various stakeholders including the public. The CERT Coordination Center has been coordinating the disclosure of software vulnerabilities since its inception in 1988. This document is intended to serve as a guide to those who want to initiate, develop, or improve their own CVD capability. In it, the reader will find an overview of key principles underlying the CVD process, a survey of CVD stakeholders and their roles, and a description of CVD process phases, as well as advice concerning operational considerations and problems that may arise in the provision of CVD and related services.
Read the special report.

Department of Defense Software Factbook
By Brad Clark, Christopher Miller, James McCurley, David Zubrow, Rhonda Brown, Mike Zuccher (No Affiliation)

This Department of Defense (DoD) Software Factbook provides an analysis of the most extensive collection of software engineering data owned and maintained by the DoD, the software resources data report (SRDR). The SRDR is the primary source of data on software projects and their performance.

The Software Engineering Institute analyzed the SRDR data and translated it into information that is frequently sought-after across the DoD. Basic facts are provided about software projects, such as averages, ranges, and heuristics for requirements, size, effort, and duration. Factual, quantitatively derived statements provide easily digestible and usable benchmarks.

Findings are also presented by system type or super domain. The analysis in this area focuses on identifying the most and least expensive projects and the best and worst projects within three super domains: real time, engineering, and automated information systems. It also provides insight into the differences between system domains and contains domain-specific heuristics.

Finally, correlations are explored among requirements, size, duration, and effort and the strongest models for predicting change are described. The goal of this work was to determine how well the data could be used to answer common questions related to planning or replanning software projects.
Download the special report.

Five Keys to Effective Agile Test Automation for Government Programs
By Robert V. Binder, Suzanne Miller

In this discussion-focused webinar, Bob Binder and Suzanne Miller will discuss five key questions that are likely to arise in government organizations contemplating the adoption of automated test techniques and tools in an Agile environment. Although the answers to these questions will naturally be context specific, webinar participants will discuss factors that differentiate one context (and therefore, one answer) from another.
View the webinar.

Ransomware: Best Practices for Prevention and Response
By Alexander Volynkin, Angela Horneman

On May 12, 2017, in the course of a day, the WannaCry ransomware attack infected nearly a quarter million computers. WannaCry is the latest in a growing number of ransomware attacks where, instead of stealing data, cyber criminals hold data hostage and demand a ransom payment. WannaCry was perhaps the largest ransomware attack to date, taking over a wide swath of global computers from FedEx in the United States to the systems that power Britain's healthcare system to systems across Asia, according to the New York Times. In this podcast, CERT researchers spell out several best practices for prevention and response to a ransomware attack.
View the podcast.

Scaling Agile Methods
By Eileen Wrubel, Will Hayes

All major defense contractors in the market can tell you about their approaches to implementing the values and principles found in the Agile Manifesto. Published frameworks and methodologies are rapidly maturing, and a wave of associated terminology is part of the modern lexicon. We are seeing consultants feuding on Internet forums as well, each claiming to have the "true" answer for what Agile is and how to make it work in your organization. The challenge now is to scale Agile to work in complex settings with larger teams, larger systems, longer timelines, diverse operating environments, and multiple engineering disciplines. In this podcast, Will Hayes and Eileen Wrubel present five perspectives on scaling Agile from leading thinkers in the field, including Scott Ambler, Steve Messenger, Craig Larman, Jeff Sutherland, and Dean Leffingwell.
View the podcast.

DidFail: Coverage and Precision Enhancement
By Karan Dwivedi (No Affiliation), Hongli Yin (No Affiliation), Pranav Bagree (No Affiliation), Xiaoxiao Tang (No Affiliation), Lori Flynn, William Klieber, William Snavely

This report describes recent enhancements to Droid Intent Data Flow Analysis for Information Leakage (DidFail), the CERT static taint analyzer for sets of Android apps. The enhancements add analytical functionality for content providers, file accesses, and dynamic broadcast receivers. Previously, DidFail did not analyze taint flows involving ContentProvider components; it now analyzes taint flows involving all four types of Android components. The latest version of DidFail tracks taint flow across file access calls more precisely than prior versions of the software did. DidFail was also modified to handle dynamically declared BroadcastReceiver components in a fully automated way by integrating it with a recent version of FlowDroid and fixing remaining unanalyzed taint flows. Finally, a new command-line argument optionally disables static field analysis in order to reduce DidFail's memory usage and analysis time.

These new features make DidFail's taint tracking more precise (for files) and more comprehensive for dynamically registered BroadcastReceiver and ContentProvider components. We implemented the new features and tested them on example apps that we developed and on real-world apps from different categories in the Google Play app store.
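As a rough intuition for what inter-component taint tracking means, here is a toy sketch in the spirit of DidFail. The real tool is a static analyzer built on FlowDroid; this simplified model only illustrates the idea of flagging flows from sensitive sources, through chains of app components, to external sinks. All names below are invented for illustration.

```python
# Hypothetical source and sink APIs (invented for this sketch):
# calls that introduce sensitive data, and calls that send data out.
TAINT_SOURCES = {"getDeviceId"}
TAINT_SINKS = {"sendTextMessage"}

def find_leaks(flows):
    """Given (source_call, component_path, sink_call) tuples, return
    the flows where tainted data reaches an external sink."""
    leaks = []
    for source, path, sink in flows:
        if source in TAINT_SOURCES and sink in TAINT_SINKS:
            leaks.append((source, " -> ".join(path), sink))
    return leaks

# Two candidate flows: one leaks a device ID via a dynamically
# registered receiver; the other writes benign data to a local file.
flows = [
    ("getDeviceId", ["ActivityA", "BroadcastReceiverB"], "sendTextMessage"),
    ("getTimestamp", ["ActivityA"], "logToFile"),
]
print(find_leaks(flows))
```

DidFail's enhancements extend exactly this kind of reasoning to flows that pass through ContentProvider components, file accesses, and dynamically registered BroadcastReceiver components, which earlier versions could not analyze.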
Download the report.

Additional Resources

View the latest SEI research in the SEI library.

View the latest installments in the SEI Podcast Series.

View the latest installments in the SEI Webinar Series.

About the Author

Douglas C. Schmidt

Contact Douglas C. Schmidt
Visit the SEI Digital Library for other publications by Douglas C. Schmidt
View other blog posts by Douglas C. Schmidt
