The future of autonomy in the military could include unmanned cargo delivery; micro-autonomous air and ground systems to enhance platoon, squad, and soldier situational awareness; and manned-unmanned teaming in both air and ground maneuvers, according to a 2016 presentation by Robert Sadowski, chief roboticist for the U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC), which researches and develops advanced technologies for ground systems. One day, robot medics may even carry wounded soldiers out of battle. The system behind these feats is ROS-M, the militarized version of the Robot Operating System (ROS), an open-source set of software libraries and tools for building robot applications. In this post, I describe the work of SEI researchers to create an environment within ROS-M for developing unmanned systems that spurs innovation and reduces development time.
The field of robotics has grown and changed tremendously in the last 15 years, due in large part to improvements in sensors and computational power. These sensors give robots an awareness of their environment, including conditions such as light, touch, location, distance, proximity, sound, temperature, and humidity. The increasing ability of robots to sense their environments makes them an invaluable resource in a growing number of situations, from underwater exploration to hospital and airport assistance to space walks. One challenge, however, is that users remain uncertain about what a robot senses; what it predicts about its own state and the states of other objects and people in its environment; and what outcomes it expects from the actions it takes. In this blog post, I describe research that aims to help robots explain their behaviors in plain English and offer greater insight into their decision making.