SEI Blog

The Latest Research in Software Engineering and Cybersecurity

New data sources, ranging from diverse business transactions to social media, high-resolution sensors, and the Internet of Things, are creating a digital tidal wave of big data that must be captured, processed, integrated, analyzed, and archived. Big data systems storing and analyzing petabytes of data are becoming increasingly common in many application areas. These systems represent major, long-term investments requiring considerable financial commitments and massive-scale software and system deployments.

When life- and safety-critical systems fail (and this happens in many domains), the results can be dire, including loss of property and life. These types of systems are increasingly prevalent and can be found in the attitude and control systems of a satellite, the software-reliant systems of a car (such as its cruise control and anti-lock braking system), and medical devices that emit radiation. When developing such systems, software and systems architects must balance the need for stability and safety against stakeholder demands and time-to-market constraints. The Architecture Analysis & Design Language (AADL) helps software and system architects address the challenges of designing life- and safety-critical systems by providing a modeling notation, with well-defined real-time and architectural semantics, that employs textual and graphic representations. This blog posting, part of an ongoing series on AADL, focuses on the initial foundations of AADL.

Agile projects with incremental development lifecycles are showing greater promise in enabling organizations to rapidly field software than waterfall projects. There is a lack of clarity, however, regarding the factors that constitute and contribute to the success of Agile projects. A team of researchers from Carnegie Mellon University's Software Engineering Institute, including Ipek Ozkaya, Robert Nord, and me, interviewed project teams with incremental development lifecycles at five government and commercial organizations. This blog posting summarizes the findings from this study, which sought to identify the key success and failure factors for rapid fielding on these projects.

Exclusively technical approaches to attaining cybersecurity have pressured malware attackers to evolve greater technical sophistication and harden their attacks with increased precision, including socially engineered malware and distributed denial-of-service (DDoS) attacks. A general and simple design for achieving cybersecurity remains elusive, and addressing the problem of malware has become such a monumental task that technological, economic, and social forces must join together to address it. At the Carnegie Mellon University Software Engineering Institute's CERT Division, we are working to address this problem through a joint collaboration with researchers at the Courant Institute of Mathematical Sciences at New York University, led by Dr. Bud Mishra. This blog post describes this research, which aims to understand and seek complex patterns in malicious use cases within the context of security systems and to develop an incentives-based measurement system that would evaluate software and ensure a level of resilience to attack.

The power and speed of computers have increased exponentially in recent years. Modern computer architectures, however, are moving away from single-core and multi-core (homogeneous) central processing units (CPUs) toward many-core (heterogeneous) architectures. This blog post describes research I've undertaken with my colleagues at the Carnegie Mellon University Software Engineering Institute (SEI), including Jonathan Chu and Scott McMillan of the Emerging Technology Center (ETC) and Alex Nicoll, a researcher in the SEI's CERT Division, to create a software library that can exploit the heterogeneous parallel computers of the future and allow developers to create systems that are more efficient in terms of computation and power consumption.
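To make the heterogeneous model concrete, the following is a minimal, generic CUDA sketch, not the SEI library described above, in which the CPU (host) orchestrates work while the GPU (device) performs the data-parallel computation. All names (vectorAdd, h_a, d_a, and so on) are illustrative only.

// Minimal CUDA sketch of heterogeneous CPU/GPU computation (illustrative, not the SEI library).
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Device kernel: each GPU thread adds one pair of elements.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers.
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device (GPU) buffers.
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);

    // Copy inputs to the device, launch the kernel, and copy the result back.
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(d_a, d_b, d_c, n);
    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", h_c[0]);  // expected: 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}

The division of labor shown here, a serial host program delegating massively parallel work to an accelerator, is the kind of pattern a heterogeneous computing library would hide behind a higher-level interface so that developers need not manage memory transfers and kernel launches by hand.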

Department of Defense (DoD) program managers and associated acquisition professionals are increasingly called upon to steward the development of complex, software-reliant combat systems. In today's environment of expanded threats and constrained resources (e.g., sequestration), their focus is on minimizing the cost and schedule of combat-system acquisition, while simultaneously ensuring interoperability and innovation. A promising approach for meeting these challenging goals is Open Systems Architecture (OSA), which combines (1) technical practices designed to reduce the cycle time needed to acquire new systems and insert new technology into legacy systems and (2) business models for creating a more competitive marketplace and a more effective strategy for managing intellectual property rights in DoD acquisition programs. This blog posting expands upon our earlier coverage of how acquisition professionals and system integrators can apply OSA practices to decompose large monolithic business and technical designs into manageable, capability-oriented frameworks that can integrate innovation more rapidly and lower total ownership costs.

As part of an ongoing effort to keep you informed about our latest work, I would like to let you know about some recently published SEI technical reports and notes. Three of these reports highlight the latest work of SEI technologists on insider threat in international contexts, unintentional insider threats, and attributes and mitigation strategies. The last report provides the results of several exploratory research initiatives conducted by SEI staff in fiscal year 2012. This post includes a listing of each report, author(s), and links where the published reports can be accessed on the SEI website.

In our work with the Department of Defense (DoD) and other government agencies, such as the U.S. Department of Veterans Affairs and the U.S. Department of the Treasury, we often encounter organizations that have been asked by their government program office to adopt agile methods. These organizations have traditionally used a "waterfall" life cycle model (as epitomized by the engineering "V" charts) and are accustomed to being managed through a series of document-centric technical reviews. Those reviews focus on the evolution of the artifacts that describe the system's requirements and design rather than on its evolving implementation, as is more common with agile methods.