Many system and software developers and testers, especially those who have worked primarily in business information systems, assume that systems--even buggy systems--behave deterministically. In other words, they assume that a system or software application will always behave in exactly the same way when given identical inputs under identical conditions. This assumption, however, is not always true. While it fails most often in cyber-physical systems, both new and older technologies have introduced various sources of non-determinism, with significant ramifications for testing. This blog post, the first in a series, explores the challenges of testing in a non-deterministic world.
As we have done each year since the blog's inception in 2011, this blog post presents the 10 most-visited posts of 2016 in descending order, ending with the most popular post. While the majority of our most popular posts were published in the last 12 months, a few, such as Don Firesmith's 2013 posts about software testing, continue to be popular with readers.
Software with timers and clocks (STACs) exchanges clock values to set timers and perform computations. STACs are key elements of the safety-critical systems that make up the infrastructure of our daily lives. They are used in particular to control systems that interact (and must be synchronized) with the physical world. Examples include avionics systems, medical devices, cars, cell phones, and other devices that rely on software not only to produce the right output, but also to produce it at the correct time. An airbag, for example, must deploy as intended, but just as importantly, it must deploy at the right time. When STACs fail to operate as intended in the safety-critical systems that rely on them, the result can be significant harm or loss of life. Within the Department of Defense (DoD), STACs are used widely, from real-time thread schedulers to controllers for missiles, fighter planes, and aircraft carriers. This blog post presents exploratory research on formally verifying safety properties of sequential and concurrent STACs at the source-code level.
The field of robotics has grown and changed tremendously in the last 15 years, due in large part to improvements in sensors and computational power. These sensors give robots an awareness of their environment, including conditions such as light, touch, navigation, location, distance, proximity, sound, temperature, and humidity. The increasing ability of robots to sense their environments makes them an invaluable resource in a growing number of situations, from underwater exploration to hospital and airport assistance to space walks. One challenge, however, is that uncertainty persists among users about what a robot senses, what it predicts about its own state and the states of other objects and people in its environment, and what outcomes it expects from the actions it takes. In this blog post, I describe research that aims to help robots explain their behaviors in plain English and offer greater insight into their decision making.
DDoS attacks can be extremely disruptive, and they are on the rise. The Verisign Distributed Denial of Service Trends Report states that DDoS attack activity increased 85 percent in each of the last two years, with 32 percent of those attacks in the fourth quarter of 2015 targeting IT services, cloud computing, and software-as-a-service companies. In this blog post, I provide an overview of DDoS attacks and best practices for mitigating and responding to them, based on our cumulative experience in this field.
Cyber threat modeling, the creation of an abstraction of a system to identify possible threats, is a required activity for DoD acquisition. Identifying potential threats to a system, cyber or otherwise, is increasingly important in today's environment. The number of information security incidents reported by federal agencies to the U.S. Computer Emergency Readiness Team (US-CERT) increased by 1,121 percent, from 5,503 in fiscal year 2006 to 67,168 in fiscal year 2014, according to a 2015 Government Accountability Office report. Yet our experience has been that threat modeling is often conducted informally, with few standards. Consequently, important threat scenarios are often overlooked.
Given the dynamic cyber threat environment in which DoD systems operate, we have embarked on research aimed at making cyber threat modeling more rigorous, routine, and automated. This blog post evaluates three popular methods of cyber threat modeling and discusses how this evaluation will help develop a model that fuses the best qualities of each.
Over the past six months, we have developed new security-focused modeling tools that capture vulnerabilities and their propagation paths in an architecture. Recent reports (such as the remote attack surface analysis of automotive systems) show that security is no longer only a matter of code but is also tightly tied to the software architecture. These new tools are our contribution toward improving system and software analysis. We hope they will advance other work on security modeling and analysis and prove useful to security researchers and analysts. This post explains the motivation for our work, the tools available, and how to use them.
The threat of insiders causing physical harm to fellow employees or themselves at an organization is real. In 2015 and 2016 alone, there were shootings in the U.S. by current or former employees in various workplaces, including at a television...