SEI Blog

The Latest Research in Software Engineering and Cybersecurity

Although software is increasingly important to the success of government programs, little consideration is often given to its impact on key early program decisions. The Carnegie Mellon University Software Engineering Institute (SEI) is conducting a multi-phase research initiative aimed at answering the question: is the probability of a program's success improved by deliberately producing a program acquisition strategy and software architecture that are mutually constrained and aligned?

Code clones are implementation patterns transferred from program to program via copy mechanisms, including cut-and-paste, copy-and-paste, and code reuse. The value of code cloning as a software engineering practice has been the subject of significant debate. In its most basic form, code cloning involves a codelet (a snippet of code) that undergoes various forms of evolution, such as slight modification in response to problems. This kind of reuse quickens the production cycle for augmented functions and data structures: when a programmer copies a codelet from one file into another with slight augmentations, a new clone is created that stems from a founder codelet. Events like these constitute the provenance, or historical record, of all events affecting a codelet. This blog posting describes exploratory research that aims to understand the evolution of source and machine code and, eventually, create a model that can recover relationships between codes, files, or executable formats where the provenance is not known.
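To make the idea concrete, here is a small illustrative sketch (in Python, with hypothetical function and file names that are not drawn from the research itself): a founder codelet is copied into another file and slightly augmented, producing a clone whose copy history a provenance-recovery model would aim to reconstruct.

    # Founder codelet: original implementation, imagined to live in module_a.py.
    def average(values):
        """Return the arithmetic mean of a list of numbers."""
        return sum(values) / len(values)


    # Clone: copied into module_b.py and slightly augmented to handle empty input.
    # The copy-and-modify step creates a new clone whose provenance traces back
    # to the founder codelet above.
    def average_safe(values):
        """Return the arithmetic mean, or 0.0 for an empty list."""
        if not values:
            return 0.0
        return sum(values) / len(values)

Even in this toy case, the clone and its founder share most of their structure, which is the kind of relationship the exploratory research tries to recover when the copy history was never recorded.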

As part of an ongoing effort to keep you informed about our latest work, I would like to let you know about some recently published SEI technical reports and notes. These reports highlight the latest work of SEI technologists in systems of systems integration from an architectural perspective, unintentional insider threat that derives from social engineering, identifying physical security gaps in international mail processing centers and similar facilities, countermeasures used by cloud service providers, the Team Software Process (TSP), and key automation and analysis techniques. This post includes a listing of each report, author(s), and links where the published reports can be accessed on the SEI website.

The process of designing and analyzing software architectures is complex. Architectural design is a minimally constrained search through a vast multi-dimensional space of possibilities. The end result is that architects are seldom confident that they have done the job optimally, or even satisfactorily. Over the past two decades, practitioners and researchers have used architectural patterns to expedite sound software design. Architectural patterns are prepackaged chunks of design that provide proven structural solutions for achieving particular software system quality attributes, such as scalability or modifiability. While the use of patterns has simplified the architectural design process somewhat, key challenges remain. This blog posting explores these challenges and our solutions for achieving system security qualities through the use of patterns.
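To suggest what a pattern packages, the following minimal sketch (Python, with hypothetical class names that are illustrations rather than SEI artifacts) shows an intercepting-validator style arrangement, one familiar structural solution for a security quality attribute: requests pass through a validating layer before they reach the protected component.

    # Minimal sketch of an intercepting-validator style pattern for security.
    # Class and method names are hypothetical, chosen only for illustration.

    class ValidationError(Exception):
        """Raised when a request fails a security check."""


    class OrderService:
        """The protected component; it assumes input has already been validated."""

        def place_order(self, item: str, quantity: int) -> str:
            return f"ordered {quantity} x {item}"


    class ValidatingProxy:
        """Intercepts requests and enforces input checks before delegating."""

        def __init__(self, service: OrderService):
            self._service = service

        def place_order(self, item: str, quantity: int) -> str:
            if not item or any(ch in item for ch in ";<>"):
                raise ValidationError("suspicious item name")
            if quantity <= 0 or quantity > 100:
                raise ValidationError("quantity out of allowed range")
            return self._service.place_order(item, quantity)


    if __name__ == "__main__":
        proxy = ValidatingProxy(OrderService())
        print(proxy.place_order("widget", 3))   # passes validation
        try:
            proxy.place_order("widget; DROP TABLE", 3)
        except ValidationError as err:
            print(f"rejected: {err}")

The value of treating this as a pattern is that the structural decision, checking every request before it reaches the service, can be reused and reasoned about independently of any one application.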

Many types of software systems, including big data applications, lend themselves to highly incremental and iterative development approaches. In essence, system requirements are addressed in small batches, enabling the delivery of functional releases of the system at the end of every increment, typically once a month. The advantages of this approach are many and varied. Perhaps foremost is that it constantly forces the validation of requirements and designs before too much progress is made in inappropriate directions. Ambiguity and change in requirements, as well as uncertainty in design approaches, can be rapidly explored through working software systems, not simply models and documents. Necessary modifications can be carried out efficiently and cost-effectively through refactoring before code becomes too 'baked' and complex to easily change. This posting, the second in a series addressing the software engineering challenges of big data, explores how the nature of building highly scalable, long-lived big data applications influences iterative and incremental design approaches.

Software used in safety-critical systems--such as automotive, avionics, and healthcare applications, where failures could result in serious harm or loss of life--must perform as prescribed. How can software developers and programmers offer assurance that the system will perform as needed, and with what level of confidence? In the first post in this series, I introduced the concept of the assurance case as a means to justify safety, security, or reliability claims by relating evidence to the claim via an argument. In this post I will discuss Baconian probability and eliminative induction, the concepts we use to explore properties of the confidence that the assurance case adequately justifies its claim about the subject system.
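As a rough sketch of the underlying idea (and not the post's own formalism), eliminative induction builds confidence in a claim by identifying reasons the claim could be false, called defeaters, and then eliminating them with evidence and argument; Baconian probability records the result as "k of n defeaters eliminated" rather than as a single number between 0 and 1. The hypothetical Python sketch below illustrates that bookkeeping.

    # Hypothetical illustration of eliminative induction for an assurance case.
    # The defeaters and the "k | n" bookkeeping are simplified for exposition.

    def baconian_confidence(defeaters: dict) -> str:
        """Report confidence as 'eliminated | identified' defeaters."""
        eliminated = sum(1 for removed in defeaters.values() if removed)
        return f"{eliminated} | {len(defeaters)} defeaters eliminated"


    # Reasons for doubting the claim "the software performs as prescribed",
    # marked True once evidence and argument eliminate them.
    defeaters = {
        "requirements may be incomplete": True,
        "unit tests may not cover boundary conditions": True,
        "static analysis may miss memory errors": False,
    }

    print(baconian_confidence(defeaters))   # -> "2 | 3 defeaters eliminated"

One appeal of this style of measure is that it keeps visible exactly which doubts remain unaddressed, which is central to judging how much confidence an assurance case deserves.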

As part of our mission to advance the practice of software engineering and cybersecurity through research and technology transition, our work focuses on ensuring that software-reliant systems are developed and operated with predictable and improved quality, schedule, and cost. To achieve this mission, the SEI conducts research and development activities involving the Department of Defense (DoD), federal agencies, industry, and academia. As we look back on 2013, this blog posting highlights our many R&D accomplishments during the past year.

Hacking the CERT FOE

Occasionally this blog will highlight different posts from the SEI blogosphere. Today we are highlighting a recent post by Will Dormann, a senior member of the technical staff in the SEI's CERT Division, from the CERT/CC Blog. In this post, Dormann describes how he modifies the CERT Failure Observation Engine (FOE) when he encounters apps that "don't play well" with the FOE. The FOE is a software testing tool that finds defects in applications running on the Windows platform.