CMU SEI Research Review 2022 Day 1 Artifacts
• Collection
Publisher
Software Engineering Institute
Abstract
Watch and download presentations from Day 1 of the 2022 SEI Research Review event. At the 2022 Research Review, our researchers detail how they are forging a new path for software engineering by executing the SEI’s technical strategy to deliver tangible results. They highlight methods, prototypes, and tools aimed at the most important problems facing the Department of Defense (DoD), industry, and academia, including artificial intelligence (AI) engineering, computing at the tactical edge, threat hunting, continuous integration/continuous delivery, and machine learning trustworthiness. Learn how our researchers' work in areas such as model-based systems engineering, DevSecOps, automated design conformance, software/cyber/AI integration, and AI network defense—to name a few—has produced value for the U.S. DoD and advanced the state of the practice.
Day 1 included the following sessions:
- Welcome | Dr. Tom Longstaff and Dr. Paul Nielsen
- Keynote: Dr. William Streilein, Chief Technology Officer for the Chief Digital and AI Office, Department of Defense | Dr. William (Bill) W. Streilein
- Knowing When You Don't Know: Quantifying and Reasoning about Uncertainty in Machine Learning Models | Dr. Eric Heim
- Portable High-performance Inference on the Tactical Edge (PHITE) | Dr. Scott McMillan
- Automating Mismatch Detection and Testing in ML Systems | Dr. Grace Lewis
- A Prototype Software Framework for Digital Content Forgery Detection | Dr. Shannon Gallagher
- AI Evaluation Methodology for Defensive Cyber Operator Tools | Dr. Shing-hon Lau
Collection Items
Knowing When You Don't Know: Quantifying and Reasoning about Uncertainty in Machine Learning Models
• Presentation
By Eric Heim
This project focuses on detecting model uncertainty and mitigating its effects on the quality of model inference.
PHITE: Portable High-Performance Inference at the Tactical Edge
• Presentation
By Scott McMillan
This project applies performance engineering processes to the analysis of existing open source ML frameworks for embedded systems to inform the development and optimization of a portable software library.
Automating Mismatch Detection and Testing in ML Systems
• Presentation
By Grace Lewis
This project improves the formalization of the detection of machine learning (ML) mismatch.
A Machine Learning Pipeline for Deepfake Detection
• Presentation
By Shannon Gallagher
The aim of this project is to develop a deepfake detection prototype framework with at least 85% accuracy.
AI Evaluation Methodology for Defensive Cyber Operator Tools
• Presentation
By Shing-hon Lau
The objective of this project is to develop a methodology for evaluating the capabilities of an AI defense using publicly available information about defensive network capabilities.
Part of a Collection
CMU SEI Research Review 2022