by Birk Torpmann-Hagen, UiT The Arctic University of Norway
On April 7th, at 11:00 AM in room FC6 029 of the DCC-FCUP, Birk Torpmann-Hagen will give a talk entitled "Runtime Assurance of Deep Neural Networks."
The talk is organized by the DCC-FCUP.
Short Bio
Birk Torpmann-Hagen is a final-year doctoral student at UiT The Arctic University of Norway, in collaboration with SimulaMet. His thesis explores methods for detecting and characterizing distributional shift, and how these methods can be leveraged to produce credible and accurate runtime performance estimates. He spent five semesters as a teaching assistant in courses covering AI, image analysis, and mechatronics, and interned at the Norwegian Defence Research Establishment (FFI) and Domos, an Oslo-based tech startup. His primary research interests are generalization in deep learning, malware in neural networks, and distributional shift detection.
Title
Runtime Assurance of Deep Neural Networks
Abstract
Despite achieving excellent performance on benchmarks, deep neural networks are known to fail in deployment scenarios. This phenomenon can largely be attributed to the sensitivity of neural networks to minor, even imperceptible, changes in the nature of the input data, often referred to as distributional shifts. These shifts are common in real-world scenarios but are rarely accounted for in evaluations, often resulting in inflated performance metrics that misrepresent the network's true performance. Effective and responsible deployment thus requires accounting for the incidence of distributional shifts. This presentation summarizes my research to this end, encompassing the development of a comprehensive toolset for the runtime verification, evaluation, and risk assessment of deep neural networks.
