
SEI Blog

The Latest Research in Software Engineering and Cybersecurity

The Better Buying Power 2.0 initiative is a concerted effort by the United States Department of Defense to achieve greater efficiencies in the development, sustainment, and recompetition of major defense acquisition programs through cost control, elimination of unproductive processes and bureaucracy, and promotion of open competition. This SEI blog posting describes how the Navy is operationalizing Better Buying Power in the context of its Open Systems Architecture and Business Innovation initiatives.

According to a report issued by the Government Accountability Office (GAO) in February 2013, the number of reported cybersecurity incidents that could impact "federal and military operations; critical infrastructure; and the confidentiality, integrity, and availability of sensitive government, private sector, and personal information" increased by 782 percent, from 5,503 in 2006 to 48,562 in 2012. In that report, GAO also stated that while there has been incremental progress in coordinating the federal response to cyber incidents, "challenges remain in sharing information among federal agencies and key private sector entities, including critical infrastructure owners."
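For readers who want to verify the growth figure, the percentage increase follows directly from the two incident counts GAO reported (a quick arithmetic check, not part of the report itself):

```python
# Percentage increase in reported incidents, 2006 to 2012.
incidents_2006, incidents_2012 = 5_503, 48_562
pct_increase = (incidents_2012 - incidents_2006) / incidents_2006 * 100
print(f"{pct_increase:.0f} percent")  # -> 782 percent, matching the GAO figure
```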

In 2012, the White House released its federal digital strategy. What's noteworthy about this release is that the executive office distributed the strategy using Bootstrap, an open source software (OSS) tool developed by Twitter and made freely available to the public via the code hosting site GitHub. This is not the only evidence that we have seen of increased government interest in OSS adoption. Indeed, the 2013 report The Future of Open Source Software revealed that 34 percent of its respondents were government entities using OSS products.

Although software is increasingly important to the success of government programs, little consideration is often given to its impact on key early program decisions. The Carnegie Mellon University Software Engineering Institute (SEI) is conducting a multi-phase research initiative aimed at answering the question: is the probability of a program's success improved by deliberately producing a program acquisition strategy and software architecture that are mutually constrained and aligned?

Code clones are implementation patterns transferred from program to program via copy mechanisms such as cut-and-paste, copy-and-paste, and code reuse. There has been significant debate about the value of code cloning as a software engineering practice. In its most basic form, code cloning involves a codelet (a small snippet of code) that undergoes various forms of evolution, such as slight modification in response to problems. Such reuse quickens the production cycle for augmented functions and data structures. So, if a programmer copies a codelet from one file into another with slight augmentations, a new clone has been created that stems from a founder codelet. Events like these constitute the provenance, or historical record, of all events affecting a codelet object. This blog posting describes exploratory research that aims to understand the evolution of source and machine code and, eventually, to create a model that can recover relationships between codes, files, or executable formats when the provenance is not known.
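To make the founder-and-clone relationship concrete, here is a minimal hypothetical Python example (the function names and scenario are illustrative, not drawn from the research itself):

```python
# Founder codelet: computes the average of a list of sensor readings.
def average_readings(readings):
    total = 0.0
    for value in readings:
        total += value
    return total / len(readings)

# Clone: copied from average_readings, then slightly augmented to skip
# negative values (a typical "copy, then tweak" evolution step).
def average_valid_readings(readings):
    total = 0.0
    count = 0
    for value in readings:
        if value >= 0:  # augmentation added after copying
            total += value
            count += 1
    return total / count if count else 0.0
```

A provenance-recovery model of the kind described above would aim to infer that average_valid_readings descends from the founder codelet even when no version history records the copy.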

As part of an ongoing effort to keep you informed about our latest work, I would like to let you know about some recently published SEI technical reports and notes. These reports highlight the latest work of SEI technologists in systems of systems integration from an architectural perspective, unintentional insider threat that derives from social engineering, identifying physical security gaps in international mail processing centers and similar facilities, countermeasures used by cloud service providers, the Team Software Process (TSP), and key automation and analysis techniques. This post includes a listing of each report, author(s), and links where the published reports can be accessed on the SEI website.

The process of designing and analyzing software architectures is complex. Architectural design is a minimally constrained search through a vast multi-dimensional space of possibilities. The end result is that architects are seldom confident that they have done the job optimally, or even satisfactorily. Over the past two decades, practitioners and researchers have used architectural patterns to expedite sound software design. Architectural patterns are prepackaged chunks of design that provide proven structural solutions for achieving particular software system quality attributes, such as scalability or modifiability. While use of patterns has simplified the architectural design process somewhat, key challenges remain. This blog posting explores these challenges and our solutions for achieving system security qualities through the use of patterns.
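The specific security patterns from this work are not reproduced here, but the following minimal Python sketch illustrates the general idea of a pattern as a prepackaged chunk of design, using a hypothetical intercepting-validator style wrapper (all class and function names are assumptions for illustration):

```python
# Sketch of an intercepting-validator style security pattern: every
# request passes through a chain of validators before reaching the
# business logic, so input checking is a reusable structural element
# rather than ad hoc code scattered through handlers.

class ValidationError(Exception):
    pass

def reject_empty(request: dict) -> None:
    if not request.get("payload"):
        raise ValidationError("empty payload")

def reject_oversize(request: dict) -> None:
    if len(request.get("payload", "")) > 1024:
        raise ValidationError("payload too large")

class SecureService:
    """Wraps a handler with a fixed chain of input validators."""

    def __init__(self, handler, validators):
        self.handler = handler
        self.validators = validators

    def handle(self, request: dict):
        for validate in self.validators:
            validate(request)          # raises on invalid input
        return self.handler(request)   # reached only if all checks pass

# Usage: the handler itself stays free of validation logic.
service = SecureService(lambda req: req["payload"].upper(),
                        [reject_empty, reject_oversize])
print(service.handle({"payload": "hello"}))  # -> HELLO
```

Because the validation chain is a fixed structural element of the design, a reviewer can reason about it once rather than auditing every handler, which is the kind of proven structural benefit a pattern is meant to package.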

Many types of software systems, including big data applications, lend themselves to highly incremental and iterative development approaches. In essence, system requirements are addressed in small batches, enabling the delivery of functional releases of the system at the end of every increment, typically once a month. The advantages of this approach are many and varied. Perhaps foremost is the fact that it constantly forces the validation of requirements and designs before too much progress is made in inappropriate directions. Ambiguity and change in requirements, as well as uncertainty in design approaches, can be rapidly explored through working software systems, not simply models and documents. Necessary modifications can be carried out efficiently and cost-effectively through refactoring before code becomes too 'baked' and complex to easily change. This posting, the second in a series addressing the software engineering challenges of big data, explores how the nature of building highly scalable, long-lived big data applications influences iterative and incremental design approaches.
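As a hypothetical illustration of the kind of refactoring that keeps change cheap (none of the names below come from the posting itself), this Python sketch extracts a storage seam so that a later increment can swap data stores without disturbing the surrounding code:

```python
from typing import Protocol

# Handlers depend on this seam rather than a concrete data store, so a
# later increment can replace the store before the code becomes too
# 'baked' to change easily.
class EventStore(Protocol):
    def append(self, event: dict) -> None: ...

class InMemoryStore:
    """Stand-in store for early increments; a later increment might use
    a distributed database behind the same interface."""
    def __init__(self) -> None:
        self.events: list[dict] = []

    def append(self, event: dict) -> None:
        self.events.append(event)

def ingest(event: dict, store: EventStore) -> None:
    store.append(event)  # touches only the interface, not the store

store = InMemoryStore()
ingest({"sensor": "a1", "value": 3}, store)
print(store.events)  # -> [{'sensor': 'a1', 'value': 3}]
```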