AI Engineering Report Highlights Needs and Challenges to Focus Emerging Discipline
October 29, 2020—A report released this week by the SEI highlights three areas of focus for the growing artificial intelligence (AI) engineering movement: robust and secure AI, scalable AI, and human-centered AI. The report is based on ideas shared in a 2019 workshop the SEI convened to identify challenges and opportunities for AI engineering for defense and national security.
The report defines and presents needs and challenges for each theme. Robust and secure AI systems must work as expected and be resilient to threats, including attacks related to adversarial machine learning. Scalable AI systems must be able to operate under different conditions related to size, speed, and complexity. Human-centered AI systems must reflect organizational and socio-technical considerations, from ethics to interpretability.
The needs listed in the report include
- tools for testing and monitoring AI system robustness and mitigating threats to it
- available, scalable, and adaptive AI computing infrastructure
- mechanisms and frameworks to enforce ethical principles
The day-long community of interest workshop, facilitated by the SEI in October 2019, laid a foundation for identifying challenges and opportunities for future initiatives in AI engineering for defense and national security. The 26 workshop participants were thought leaders in defense, national security, industry, and academia from organizations including the Defense Advanced Research Projects Agency (DARPA), the Office of Naval Research, IBM Research, and MIT Lincoln Laboratory.
"Bringing together a diverse group of participants was critical to understanding both the challenges and opportunities for AI engineering," said Matt Gaston, the director of the SEI Emerging Technology Center and lead for AI engineering at the SEI. "And the themes that emerged were common across sectors."
The workshop’s overall conclusion was, “Where possible, the [Department of Defense] and related organizations should identify opportunities to build, share, evolve, and mature processes, practices, tools, and technologies for reliably engineering AI systems.” This conclusion aligns with the Defense Innovation Board’s own 2019 recommendations to “cultivate and grow the field of AI engineering.”
The recently released AI Engineering for Defense and National Security: A Report from the October 2019 Community of Interest Workshop details the nine capabilities and tool types that the AI engineering discipline will need in order to create robust and secure, scalable, and human-centered AI systems for national defense and security applications.
Download the report at https://resources.sei.cmu.edu/library/asset-view.cfm?assetid=648541.