# A General Framework for Uncertainty Estimation in Deep Learning

```bibtex
@article{Loquercio2020AGF,
  title   = {A General Framework for Uncertainty Estimation in Deep Learning},
  author  = {Antonio Loquercio and Mattia Segu and Davide Scaramuzza},
  journal = {IEEE Robotics and Automation Letters},
  year    = {2020},
  volume  = {5},
  pages   = {3153--3160}
}
```

Neural network predictions are unreliable when the input sample is out of the training distribution or corrupted by noise. Detecting such failures automatically is fundamental for integrating deep learning algorithms into robotics. Current approaches to uncertainty estimation for neural networks require changes to the network and optimization process, typically ignore prior knowledge about the data, and tend to make over-simplifying assumptions that underestimate uncertainty. To…

#### Supplemental Code


This repository provides the implementation of the framework for equipping deep learning models with total uncertainty estimates, as described in "A General Framework for Uncertainty Estimation in Deep Learning" (Loquercio, Segù, Scaramuzza; RA-L 2020).
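The paper's framework combines Monte-Carlo dropout (for model/epistemic uncertainty) with assumed-density-filtering-style propagation of sensor noise (for data/aleatoric uncertainty). The following is a minimal numpy sketch of that combination under simplifying assumptions: a single toy linear layer with made-up weights, not the repository's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 1)) * 0.3   # toy weights, for illustration only

def adf_dropout_pass(x_mean, x_var, drop_p=0.2):
    """One stochastic pass: dropout samples model uncertainty, while the
    input variance is propagated in closed form through the linear layer
    (assumed-density-filtering style; exact only for the linear case)."""
    mask = (rng.random(x_mean.shape) > drop_p) / (1.0 - drop_p)
    m, v = x_mean * mask, x_var * mask**2
    return m @ W, v @ W**2          # output mean and output variance

x_mean = rng.normal(size=(4, 8))
x_var = np.full_like(x_mean, 0.05)  # assumed sensor-noise variance

T = 100                             # number of Monte-Carlo dropout samples
outs = [adf_dropout_pass(x_mean, x_var) for _ in range(T)]
means = np.stack([m for m, _ in outs])
varis = np.stack([v for _, v in outs])

pred = means.mean(0)
epistemic = means.var(0)            # spread across dropout samples
aleatoric = varis.mean(0)           # propagated input noise
total = epistemic + aleatoric       # the "total uncertainty" of the paper
```

The total estimate is simply the sum of the two variance components, which is what makes the framework applicable to already-trained networks.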


#### 72 Citations

Trust Your Robots! Predictive Uncertainty Estimation of Neural Networks with Sparse Gaussian Processes

- Computer Science
- ArXiv
- 2021

It is argued that the practical and principled combination of DNNs with sparse Gaussian Processes can pave the way towards reliable and fast robot learning systems with uncertainty awareness.

On the Practicality of Deterministic Epistemic Uncertainty

- Computer Science
- ArXiv
- 2021

It is found that, while DUMs scale to realistic vision tasks and perform well on OOD detection, the practicality of current methods is undermined by poor calibration under realistic distributional shifts.

A Survey of Uncertainty in Deep Neural Networks

- Computer Science, Mathematics
- ArXiv
- 2021

A comprehensive introduction to the most crucial sources of uncertainty in neural networks is given, along with their separation into reducible model uncertainty and irreducible data uncertainty.

Robust Neural Regression via Uncertainty Learning

- Computer Science, Mathematics
- 2021 International Joint Conference on Neural Networks (IJCNN)
- 2021

This work proposes a simple solution by extending the time-tested iterative reweighted least square (IRLS) in generalised linear regression, using two sub-networks to parametrise the prediction and uncertainty estimation, enabling easy handling of complex inputs and nonlinear response.

Sparsity Increases Uncertainty Estimation in Deep Ensemble

- Computer Science
- Comput.
- 2021

It is empirically shown that the ensemble members' disagreement increases with pruning (making models sparser by zeroing irrelevant parameters), which helps in making more robust predictions, and that an energy-efficient compressed deep ensemble is appropriate for memory-intensive and uncertainty-aware tasks.

Bayesian Deep Learning Hyperparameter Search for Robust Function Mapping to Polynomials with Noise

- Computer Science, Mathematics
- ArXiv
- 2021

This paper studies whether an appropriate neural architecture and ensemble configuration can be found to detect a signal of any n-th order polynomial contaminated with noise, and suggests the possible existence of an optimal network depth as well as an optimal number of ensemble members for prediction skill and uncertainty quantification.

Uncertainty Estimation for Data-Driven Visual Odometry

- Computer Science
- IEEE Transactions on Robotics
- 2020

This work proposes uncertainty-aware VO (UA-VO), a novel deep neural network (DNN) architecture that computes relative pose predictions by processing sequences of images and, at the same time, provides uncertainty measures for those estimates.

Uncertainty Estimation for Deep Learning-Based Segmentation of Roads in Synthetic Aperture Radar Imagery

- Computer Science
- Remote. Sens.
- 2021

A deep learning model for road extraction in SAR is created and used to compare standard model outputs against the most popular methods for uncertainty estimation, MCD and DE, finding that these methods are not effective indicators of segmentation quality when uncertainty is measured over all outputs, but are effective when uncertainty is measured from the set of road predictions only.

Bayesian deep learning of affordances from RGB images

- Computer Science
- ArXiv
- 2021

A Bayesian deep learning method is proposed to predict the affordances available in the environment directly from RGB images, based on a multiscale CNN that combines local and global information from the object and the full image.

Uncertainty-Aware Self-Supervised Learning of Spatial Perception Tasks

- Computer Science
- IEEE Robotics and Automation Letters
- 2021

Quantitative results show that the general self-supervised approach to learning spatial perception tasks from onboard sensor readings, such as estimating the pose of an object relative to the robot, works well, and that explicitly accounting for uncertainty yields statistically significant performance improvements.

#### References

Showing 1-10 of 48 references

Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles

- Computer Science, Mathematics
- NIPS
- 2017

This work proposes an alternative to Bayesian NNs that is simple to implement, readily parallelizable, requires very little hyperparameter tuning, and yields high quality predictive uncertainty estimates.
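The deep-ensembles recipe can be sketched in a few lines: each member outputs a Gaussian (mean, variance), and the ensemble is treated as a uniform mixture of those Gaussians. In the sketch below, `member_predict` is a hypothetical stand-in for a trained network with a variance head, not a real model.

```python
import numpy as np

rng = np.random.default_rng(1)

def member_predict(x, seed):
    """Hypothetical ensemble member emitting a Gaussian (mu, sigma^2);
    in practice each member is an independently trained network."""
    r = np.random.default_rng(seed)
    w = r.normal(size=x.shape[-1])
    mu = x @ w
    var = np.exp(r.normal(scale=0.1, size=mu.shape))  # positive variances
    return mu, var

x = rng.normal(size=(5, 3))
M = 5                                  # ensemble size
mus, vars_ = zip(*(member_predict(x, m) for m in range(M)))
mus, vars_ = np.stack(mus), np.stack(vars_)

ens_mu = mus.mean(0)
# Uniform-mixture variance: E[sigma^2] + Var[mu] -- data noise plus
# member disagreement.
ens_var = vars_.mean(0) + mus.var(0)
```

The disagreement term `mus.var(0)` is what grows on out-of-distribution inputs, which is why ensembles are a popular epistemic-uncertainty baseline.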

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

- Mathematics, Computer Science
- ICML
- 2016

A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
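In practice, this result means keeping dropout active at test time and averaging stochastic forward passes; the spread of the samples approximates the posterior predictive. A toy classification sketch with assumed random weights (not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(6, 3))            # toy 3-class linear classifier

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_probs(x, T=300, p=0.5):
    """Average class probabilities over T test-time dropout samples."""
    acc = np.zeros((x.shape[0], 3))
    for _ in range(T):
        mask = (rng.random(x.shape) > p) / (1.0 - p)   # inverted dropout
        acc += softmax((x * mask) @ W)
    return acc / T

x = rng.normal(size=(2, 6))
probs = mc_dropout_probs(x)
# Predictive entropy of the averaged distribution as an uncertainty score.
entropy = -(probs * np.log(probs + 1e-12)).sum(-1)
```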

What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?

- Computer Science
- NIPS
- 2017

A Bayesian deep learning framework combining input-dependent aleatoric uncertainty together with epistemic uncertainty is presented, which makes the loss more robust to noisy data, also giving new state-of-the-art results on segmentation and depth regression benchmarks.
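The aleatoric half of that framework is typically trained with a Gaussian negative log-likelihood over a predicted mean and log-variance. A small sketch of that loss; the target and prediction values below are purely illustrative.

```python
import numpy as np

def heteroscedastic_nll(y, mu, log_var):
    """Gaussian NLL with a learned per-point variance head. Predicting
    log-variance keeps the loss numerically stable, and the exp(-log_var)
    weight automatically down-weights noisy targets."""
    return np.mean(0.5 * np.exp(-log_var) * (y - mu) ** 2 + 0.5 * log_var)

y = np.array([1.0, 2.0, 3.0])
mu = np.array([1.1, 1.8, 3.5])
loss = heteroscedastic_nll(y, mu, np.zeros(3))  # log_var=0 -> 0.5*MSE
```

With `log_var = 0` the loss reduces to half the mean squared error; during training the network trades the residual term against the `0.5 * log_var` penalty, raising the predicted variance only where the data is genuinely noisy.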

Concrete Dropout

- Computer Science, Mathematics
- NIPS
- 2017

This work proposes a new dropout variant which gives improved performance and better calibrated uncertainties, and uses a continuous relaxation of dropout’s discrete masks to allow for automatic tuning of the dropout probability in large models, and as a result faster experimentation cycles.

Deep Reinforcement Learning in a Handful of Trials using Probabilistic Dynamics Models

- Computer Science, Mathematics
- NeurIPS
- 2018

This paper proposes a new algorithm called probabilistic ensembles with trajectory sampling (PETS) that combines uncertainty-aware deep network dynamics models with sampling-based uncertainty propagation, which matches the asymptotic performance of model-free algorithms on several challenging benchmark tasks, while requiring significantly fewer samples.

Lightweight Probabilistic Deep Networks

- Computer Science, Mathematics
- 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
- 2018

This paper proposes probabilistic output layers for classification and regression that require only minimal changes to existing networks and shows that activation uncertainties can be propagated in a practical fashion through the entire network, again with minor changes.

Probabilistic Backpropagation for Scalable Learning of Bayesian Neural Networks

- Computer Science, Mathematics
- ICML
- 2015

This work presents a novel scalable method for learning Bayesian neural networks, called probabilistic backpropagation (PBP), which works by computing a forward propagation of probabilities through the network and then doing a backward computation of gradients.

Assumed Density Filtering Methods for Learning Bayesian Neural Networks

- Computer Science
- AAAI
- 2016

This paper rigorously compares the recently proposed assumed density filtering based methods for learning Bayesian neural networks, expectation backpropagation (EBP) and probabilistic backpropagation (PBP), and develops several extensions, including a version of EBP for continuous regression problems and a PBP variant for binary classification.

Self-Supervised Deep Reinforcement Learning with Generalized Computation Graphs for Robot Navigation

- Computer Science, Mathematics
- 2018 IEEE International Conference on Robotics and Automation (ICRA)
- 2018

A generalized computation graph is proposed that subsumes value-based model-free methods and model-based methods, and is instantiated to form a navigation model that learns from raw images and is sample efficient, outperforming single-step and double-step double Q-learning.

Risk versus Uncertainty in Deep Learning: Bayes, Bootstrap and the Dangers of Dropout

- Computer Science
- 2016

The implication of these results is clear: combine deep learning with Bayesian inference for the best decisions from data.