The Latest Research in Software Engineering and Cybersecurity

As part of an ongoing effort to keep you informed about our latest work, this blog post summarizes some recently published books, SEI technical reports, and webinars in cybersecurity engineering, performance and dependability, cyber risk and resilience management, cyber intelligence, secure coding, and the latest requirements for chief information security officers (CISOs).

These publications highlight the latest work of SEI technologists in these areas. This post includes a listing of each publication, author(s), and links where they can be accessed on the SEI website.

Late last month, Internet users across the eastern seaboard of the United States had trouble accessing popular websites, such as Reddit, Netflix, and the New York Times. As reported in Wired Magazine, the disruption was the result of multiple distributed denial of service (DDoS) attacks against a single organization: Dyn, a New Hampshire-based Internet infrastructure company.

DDoS attacks can be extremely disruptive, and they are on the rise. The Verisign Distributed Denial of Service Trends Report states that DDoS attack activity increased 85 percent in each of the last two years, with 32 percent of those attacks in the fourth quarter of 2015 targeting IT services, cloud computing, and software-as-a-service companies. In this blog post, I provide an overview of DDoS attacks and best practices for mitigating and responding to them based on our cumulative experience in this field.

This post was co-authored by Nancy Mead.

Cyber threat modeling, the creation of an abstraction of a system to identify possible threats, is a required activity for DoD acquisition. Identifying potential threats to a system, cyber or otherwise, is increasingly important in today's environment. According to a 2015 Government Accountability Office report, the number of information security incidents reported by federal agencies to the U.S. Computer Emergency Readiness Team (US-CERT) increased by 1,121 percent, from 5,503 in fiscal year 2006 to 67,168 in fiscal year 2014. Yet our experience has been that cyber threat modeling is often conducted informally, with few standards, and important threat scenarios are consequently overlooked.

Given the dynamic cyber threat environment in which DoD systems operate, we have embarked on research aimed at making cyber threat modeling more rigorous, routine, and automated. This blog post evaluates three popular methods of cyber threat modeling and discusses how this evaluation will help us develop a model that fuses the best qualities of each.

Over the past six months, we have developed new security-focused modeling tools that capture vulnerabilities and their propagation paths in an architecture. Recent reports (such as the remote attack surface analysis of automotive systems) show that security is no longer only a matter of code but is also tightly related to the software architecture. These new tools are our contribution toward improving system and software analysis. We hope they will advance other work on security modeling and analysis and be useful to security researchers and analysts. This post explains the motivation for our work, the tools that are available, and how to use them.
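
To give a rough feel for the kind of analysis such tools enable (this is a minimal sketch of the general idea, not the SEI tools' actual models or APIs, and the component names are hypothetical), the example below represents an architecture as a directed graph of components and computes which components a vulnerability could reach along connection paths.

```python
from collections import deque

# Hypothetical architecture: a directed edge from A to B means data flows
# from A to B, so a compromise of A may propagate toward B.
architecture = {
    "telematics_unit": ["can_gateway"],
    "can_gateway": ["brake_controller", "engine_controller"],
    "infotainment": ["can_gateway"],
    "brake_controller": [],
    "engine_controller": [],
}

def propagation_paths(graph, vulnerable_component):
    """Breadth-first search for every component reachable from a vulnerable one."""
    reached, frontier = set(), deque([vulnerable_component])
    while frontier:
        current = frontier.popleft()
        for downstream in graph.get(current, []):
            if downstream not in reached:
                reached.add(downstream)
                frontier.append(downstream)
    return reached

# Example: a remotely exposed telematics unit as the entry point.
print(propagation_paths(architecture, "telematics_unit"))
# e.g. {'can_gateway', 'brake_controller', 'engine_controller'}
```

A graph traversal like this is only the simplest form of propagation analysis; architecture-level modeling tools attach richer annotations (vulnerability types, mitigations, trust boundaries) to components and connections.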

The exponential increase in cybercrime is a perfect example of how rapidly change is happening in cyberspace and why operational security is a critical need. In the 1990s, computer crime was usually nothing more than simple trespass. Twenty-five years later, computer crime has become a vast criminal enterprise with profits estimated at $1 trillion annually. One of the primary contributors to this astonishing success is the vulnerability of software to exploitation through defects. How pervasive is the problem of vulnerability? The average cost of a data breach is $4 million, up 29 percent since 2013, according to Ponemon Institute and IBM data. Ponemon also concluded that there is a 26 percent probability that an enterprise will be hit by one or more data breaches of 10,000 records over the next two years. Increased system complexity, pervasive interconnectivity, and widely distributed access have increased the challenges for building and acquiring operationally secure capabilities. This blog post introduces a set of seven principles that address the challenges of acquiring, building, deploying, and sustaining software systems to achieve a desired level of confidence for software assurance.

As part of an ongoing effort to keep you informed about our latest work, this blog post summarizes some recently published SEI technical reports, white papers, and webinars in resilience, effective cyber workforce development, secure coding, data science, insider threat, and scheduling. These publications highlight the latest work of SEI technologists in these areas. This post includes a listing of each publication, author(s), and links where they can be accessed on the SEI website.

This post was co-authored by Sagar Chaki.

In 2011, the U.S. Government maintained a fleet of approximately 8,000 unmanned aerial systems (UAS), commonly referred to as "drones," a number that continues to grow. "No weapon system has had a more profound impact on the United States' ability to provide persistence on the battlefield than the UAVs," according to a 2012 Defense Science Board report. Making sure government and privately owned drones share international air space safely and effectively is a top priority for government officials. Distributed Adaptive Real-Time (DART) systems are key to many areas of Department of Defense (DoD) capability, including the safe execution of autonomous, multi-UAS missions with civilian benefits. DART systems promise to revolutionize several such areas of mutual civilian-DoD interest, including robotics, transportation, energy, and health care. To fully realize the potential of DART systems, however, the software controlling them must be engineered for high assurance and certified to operate safely and effectively. In short, these systems must satisfy guaranteed, highly critical safety requirements (e.g., collision avoidance) while adapting smartly to achieve application requirements, such as protection coverage, in dynamic and uncertain environments. This blog post describes our architecture and approach to engineering high-assurance software for DART systems.
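
To illustrate the tension between guaranteed safety and smart adaptation, here is a toy sketch of a runtime-enforcement pattern (our own illustrative example, not the DART architecture or its actual interfaces): an adaptive planner proposes maneuvers, and a simple safety monitor vetoes any proposal that violates a minimum-separation constraint, falling back to a conservative action. All names and thresholds are hypothetical.

```python
MIN_SEPARATION_METERS = 50.0  # illustrative safety threshold, not a real requirement

def adaptive_planner(target_separation):
    """Stand-in for mission-level adaptation: tries to fly a tighter formation."""
    return {"maneuver": "close_formation", "resulting_separation": target_separation}

def safety_monitor(proposal):
    """Veto any maneuver whose predicted separation violates the safety constraint."""
    if proposal["resulting_separation"] >= MIN_SEPARATION_METERS:
        return proposal
    # Fall back to a conservative, pre-verified action when the proposal is unsafe.
    return {"maneuver": "hold_safe_distance",
            "resulting_separation": MIN_SEPARATION_METERS}

print(safety_monitor(adaptive_planner(30.0)))   # unsafe proposal is overridden
print(safety_monitor(adaptive_planner(120.0)))  # safe proposal passes through
```

The point of such a separation is that only the small, conservative monitor needs the strongest assurance and certification evidence, while the adaptive logic is free to optimize mission goals.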

Network flow plays a vital role in the future of network security and analysis. With more devices connecting to the Internet, networks are larger and faster than ever before. Therefore, capturing and analyzing packet capture data (pcap) on a large network is often prohibitively expensive. Cisco developed NetFlow 20 years ago to reduce the amount of information collected from a communication by aggregating packets with the same IP addresses, transport ports, and protocol (also known as the 5-tuple) into a compact record. This blog post explains why NetFlow is still important in an age in which the common wisdom is that more data is always better. Moreover, NetFlow will become even more important in the next few years as communications become more opaque with the development of new protocols that encrypt payloads by default.
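
To make the aggregation idea concrete, the hedged sketch below groups hypothetical packet observations by the 5-tuple (source IP, destination IP, source port, destination port, protocol) and accumulates packet and byte counters into compact flow records. The field layout is illustrative only; it is not the actual NetFlow record format.

```python
from collections import defaultdict

# Each "packet" here is a hypothetical observation: the 5-tuple plus a byte count.
packets = [
    ("10.0.0.1", "192.0.2.7", 51000, 443, "TCP", 1500),
    ("10.0.0.1", "192.0.2.7", 51000, 443, "TCP", 900),
    ("10.0.0.2", "198.51.100.3", 53124, 53, "UDP", 80),
]

# Aggregate packets that share the same 5-tuple into one compact flow record,
# keeping only counters rather than full payloads (the core idea behind NetFlow).
flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
for src_ip, dst_ip, src_port, dst_port, proto, nbytes in packets:
    key = (src_ip, dst_ip, src_port, dst_port, proto)
    flows[key]["packets"] += 1
    flows[key]["bytes"] += nbytes

for key, counters in flows.items():
    print(key, counters)
```

Compared with full packet capture, records like these discard payload content entirely, which is what keeps flow collection affordable at scale and keeps it useful even when payloads are encrypted.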