

Poster

Make Me a BNN: A Simple Strategy for Estimating Bayesian Uncertainty from Pre-trained Models

Gianni Franchi · Olivier Laurent · Maxence Leguéry · Andrei Bursuc · Andrea Pilzer · Angela Yao

Arch 4A-E Poster #249
[ Project Page ] [ Paper PDF ] [ Slides ] [ Poster ]
Thu 20 Jun 10:30 a.m. PDT — noon PDT

Abstract:

Deep Neural Networks (DNNs) are powerful tools for various computer vision tasks, yet they often struggle with reliable uncertainty quantification, a critical requirement for real-world applications. Bayesian Neural Networks (BNNs) are equipped for uncertainty estimation but cannot scale to large DNNs, where they are highly unstable to train. To address this challenge, we introduce the Adaptable Bayesian Neural Network (ABNN), a simple and scalable strategy to seamlessly transform DNNs into BNNs in a post-hoc manner with minimal computational and training overhead. ABNN preserves the main predictive properties of DNNs while enhancing their uncertainty quantification abilities through simple BNN adaptation layers (attached to normalization layers) and a few fine-tuning steps on pre-trained models. We conduct extensive experiments across multiple datasets for image classification and semantic segmentation tasks, and our results demonstrate that ABNN achieves state-of-the-art performance without the computational budget typically associated with ensemble methods.
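The abstract describes attaching lightweight Bayesian adaptation layers to the normalization layers of a pre-trained network and fine-tuning them for only a few steps. Below is a minimal sketch of that general idea in PyTorch; the module name `BayesianAdaptationNorm`, the multiplicative Gaussian noise, and the `noise_std` hyperparameter are illustrative assumptions, not the authors' implementation.

```python
# Minimal illustrative sketch (not the authors' code): wrap each pre-trained
# normalization layer with multiplicative Gaussian noise so that every forward
# pass samples a slightly different network, in the spirit of a post-hoc BNN.
import torch
import torch.nn as nn


class BayesianAdaptationNorm(nn.Module):
    """Hypothetical adaptation layer: perturbs the output of a frozen
    BatchNorm2d with multiplicative Gaussian noise."""

    def __init__(self, pretrained_norm: nn.BatchNorm2d, noise_std: float = 0.1):
        super().__init__()
        self.norm = pretrained_norm   # statistics and affine weights from the pre-trained model
        self.noise_std = noise_std    # assumed hyperparameter for illustration

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.norm(x)
        eps = torch.randn_like(out) * self.noise_std
        return out * (1.0 + eps)      # fresh noise sample on every call


def convert_to_abnn(model: nn.Module, noise_std: float = 0.1) -> nn.Module:
    """Recursively replace BatchNorm2d modules with the noisy wrapper;
    only these wrappers would then be fine-tuned for a few steps."""
    for name, child in model.named_children():
        if isinstance(child, nn.BatchNorm2d):
            setattr(model, name, BayesianAdaptationNorm(child, noise_std))
        else:
            convert_to_abnn(child, noise_std)
    return model


if __name__ == "__main__":
    from torchvision.models import resnet18

    model = convert_to_abnn(resnet18(weights=None)).eval()
    x = torch.randn(2, 3, 224, 224)
    # Average several stochastic forward passes to form a predictive distribution.
    with torch.no_grad():
        probs = torch.stack(
            [model(x).softmax(dim=-1) for _ in range(5)]
        ).mean(dim=0)
    print(probs.shape)  # (2, 1000)
```

At inference time, averaging several such stochastic forward passes yields a predictive distribution whose spread can be read as an uncertainty estimate, without training or storing an ensemble of full models.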
