Uncertainty Quantification


Visualization of uncertainty for classification © Philipp Oberdiek/TU Dortmund
Scatter plot visualization of an embedding space with class clusters shown in different colors © Philipp Oberdiek/TU Dortmund

While advances in deep neural network research have had a major impact on many industries in recent years, deploying these models in the real world still poses challenges in many applications. Especially in safety-critical areas such as medicine, autonomous driving, or surveillance, special care must be taken to detect misclassifications and scenarios the neural network was not designed for.
In the field of uncertainty quantification, we address these requirements by assigning confidence scores to the outputs of neural networks, which can be used to judge the trustworthiness of their predictions.
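
As a minimal sketch of the general idea, not of the specific methods developed in this research, the Python example below derives two widely used confidence scores from a classifier's raw logits: the maximum softmax probability (higher means more confident) and the predictive entropy (higher means more uncertain). The logits in the example are made-up values for illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def confidence_scores(logits):
    """Return two common scores per sample:
    maximum softmax probability (confidence) and predictive entropy (uncertainty)."""
    probs = softmax(logits)
    max_prob = probs.max(axis=-1)
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=-1)
    return max_prob, entropy

# Hypothetical logits for three samples over four classes.
logits = np.array([[8.0, 0.1, 0.2, 0.1],   # confidently classified
                   [1.1, 1.0, 0.9, 1.0],   # ambiguous between classes
                   [0.0, 0.0, 0.0, 0.0]])  # completely uninformative
max_prob, entropy = confidence_scores(logits)
print(max_prob)  # high for the first sample, low for the others
print(entropy)   # low for the first sample, high for the others
```

Scores like these can be thresholded to flag predictions that should be rejected or passed to a human expert; more elaborate uncertainty estimates refine this basic recipe.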