
Deep-UQ Documentation

Purpose: Deep-UQ helps you train predictive models that know when they are uncertain, so decisions can use both predictions and confidence.

Unified uncertainty quantification toolkit in PyTorch. Build and compare Deep Ensembles, Bayes by Backprop, Laplace, SGLD, MC Dropout, and Gaussian Process methods in one package.


Developer First

Clear wrappers, examples, notebooks, and API-level primitives for direct integration into existing PyTorch code.

Method Breadth

Deep Bayesian VI, Laplace backends, MCMC sampling, stochastic dropout, and exact/sparse Gaussian processes.

Reproducible Workflows

Tutorial-driven structure with runnable scripts under examples/ and notebooks under notebooks/.

Why This Library Exists

Deep-UQ is built for engineers and researchers who need uncertainty-aware modeling without stitching together multiple libraries. It provides one package with shared workflows for:

  • model training,
  • posterior or predictive uncertainty estimation,
  • method-to-method comparison,
  • tutorials and examples that run from the same codebase.

What You Get

  • Six UQ method families behind one consistent interface.
  • A regression-first deep ensemble baseline for deterministic backbones.
  • Consistent regression/classification uncertainty outputs through UQResult.
  • Native Laplace backends for diag, fisher_diag, lowrank_diag, block_diag, kron, and full.
  • Full Gaussian Process suite: exact, sparse, classification, heteroscedastic, multitask, spectral, and deep-kernel variants.
  • Reproducible tutorials, examples, and benchmark scripts.

Model Architectures

Deep-UQ now documents predictive backbones separately from uncertainty methods. Use the architecture inventory to see which models are available for 1D, 2D, and 3D tasks and which UQ methods pair naturally with them.

Method Families

The website is the canonical home for the method guides below. Each family section gives a compact comparison table, then links to the full method page, API reference, and tutorial guide.

Each comparison table lists the main interface classes and links to the method guide, API reference, and tutorials.

Deep Ensembles

Deep ensembles are the main multi-model uncertainty baseline for deterministic backbones. They are especially useful for convolutional surrogates, where last-layer Laplace is not a natural fit and MC Dropout is the main lightweight alternative.

Read more: Deep Ensembles method guide

| Method | Main Interface | Learn More |
| --- | --- | --- |
| Deep Ensemble Regressor | DeepEnsembleRegressor, DeepEnsembleWrapper | Guide / API / ADR Tutorial |
| Heteroscedastic Deep Ensemble Regressor | HeteroscedasticDeepEnsembleRegressor | Guide / ADR + Noise Tutorial |
| Deep Ensemble Classifier | DeepEnsembleClassifier | Guide / Elasticity Classification |
| Multi-Output Deep Ensemble Regressor | MultiOutputDeepEnsembleRegressor | Guide / Elastic Bar Tutorial |
| Heteroscedastic Multi-Output Deep Ensemble | HeteroscedasticMultiOutputDeepEnsembleRegressor | Guide / Transport2D Tutorial |
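The arithmetic behind every variant above is the same: average the member predictions for the mean, and treat the across-member variance as epistemic uncertainty (heteroscedastic variants add a predicted noise variance on top). A minimal NumPy sketch of that reduction, illustrative only and not the Deep-UQ API:

```python
import numpy as np

def ensemble_uq(member_preds):
    """member_preds: (n_members, n_points). Returns the mean prediction and
    epistemic variance, measured as disagreement across members."""
    member_preds = np.asarray(member_preds, dtype=float)
    return member_preds.mean(axis=0), member_preds.var(axis=0)

# Three hypothetical ensemble members predicting at two test points
mean, epistemic_var = ensemble_uq([[1.0, 2.0], [1.2, 2.2], [0.8, 1.8]])
```

Where members predict the same value, the epistemic variance collapses to zero; spread between members shows up directly as uncertainty.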

Variational Inference

The VI family covers end-to-end Bayesian neural networks, including plain Bayes by Backprop, heteroscedastic regression variants, multi-output heads, and last-layer VI for scalable Bayesian heads on deterministic feature extractors.

Read more: Variational Inference method guide

| Method | Main Interface | Learn More |
| --- | --- | --- |
| Bayes by Backprop | BayesianLinear, BayesByBackpropMLP, vi_elbo_step, predict_vi_uq | Guide / API / Tutorial |
| Heteroscedastic Bayes by Backprop | HeteroscedasticBayesByBackpropRegressor, predict_vi_uq | Guide / ADR1D Tutorial |
| Multi-Output Bayes by Backprop | MultiOutputBayesByBackpropRegressor, predict_vi_uq | Guide / Elastic Bar Tutorial |
| Heteroscedastic Multi-Output Bayes by Backprop | HeteroscedasticMultiOutputBayesByBackpropRegressor, predict_vi_uq | Guide / Transport2D Tutorial |
| Last-Layer Variational Inference | LastLayerVariationalInference, predict_vi_uq | Guide / Heat2D Classification Tutorial |
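Under the hood, Bayes by Backprop keeps a Gaussian q(w) = N(mu, sigma^2) per weight, samples weights with the reparameterization trick, and penalizes the KL divergence to the prior inside the ELBO. A self-contained NumPy sketch of those two ingredients; the function names here are illustrative, not the library's:

```python
import numpy as np

def sample_weights(mu, rho, rng):
    """Reparameterization trick: w = mu + softplus(rho) * eps, eps ~ N(0, 1)."""
    sigma = np.log1p(np.exp(rho))            # softplus keeps sigma positive
    return mu + sigma * rng.standard_normal(mu.shape)

def kl_gaussian(mu, sigma, prior_sigma=1.0):
    """KL( N(mu, sigma^2) || N(0, prior_sigma^2) ), summed over weights."""
    return np.sum(np.log(prior_sigma / sigma)
                  + (sigma**2 + mu**2) / (2.0 * prior_sigma**2) - 0.5)
```

Because the noise eps is independent of (mu, rho), gradients of a loss on the sampled weights flow back to the variational parameters, which is what makes the ELBO trainable by ordinary backprop.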

Laplace Approximation

Laplace methods wrap a trained MAP model with a Gaussian posterior defined by a chosen curvature structure. They are a good fit when you want strong post-hoc uncertainty with less retraining cost than full Bayesian neural nets.

Read more: Laplace method guide

| Method | Main Interface | Learn More |
| --- | --- | --- |
| Diagonal Laplace | LaplaceWrapper(hessian_structure="diag") | Guide / API / Tutorial |
| Fisher-Diagonal Laplace | LaplaceWrapper(hessian_structure="fisher_diag") | Guide / API / Tutorial |
| Low-Rank + Diagonal Laplace | LaplaceWrapper(hessian_structure="lowrank_diag") | Guide / API / Tutorial |
| Block-Diagonal Laplace | LaplaceWrapper(hessian_structure="block_diag") | Guide / API / Tutorial |
| Kronecker-Factored Laplace | LaplaceWrapper(hessian_structure="kron") | Guide / API / Tutorial |
| Full-Hessian Laplace | LaplaceWrapper(hessian_structure="full") | Guide / API / Tutorial |
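For the diagonal structure, the math reduces to per-parameter arithmetic: the posterior variance of each weight is the inverse of its diagonal Hessian entry plus the prior precision, and a linearization around the MAP weights turns that into predictive variance. A toy NumPy sketch of that arithmetic, illustrative only:

```python
import numpy as np

def diag_laplace_posterior_var(hessian_diag, prior_precision=1.0):
    """Per-parameter posterior variance: 1 / (H_ii + prior precision)."""
    return 1.0 / (np.asarray(hessian_diag) + prior_precision)

def linearized_predictive_var(jacobian, posterior_var):
    """Epistemic variance of f(x) under a linearization at the MAP:
    Var[f] ~= sum_i J_i^2 * Var[w_i] for a diagonal posterior."""
    return np.sum(np.asarray(jacobian)**2 * posterior_var)

post_var = diag_laplace_posterior_var([3.0, 1.0], prior_precision=1.0)
f_var = linearized_predictive_var([2.0, 1.0], post_var)
```

The richer structures (kron, block_diag, full) replace the elementwise inverse with structured matrix inverses, trading memory and compute for a better curvature model.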

MCMC / SGLD

SGLD is the package's posterior-sampling method for deep networks. Use it when you want sampled parameter trajectories and Monte Carlo predictive uncertainty.

Read more: MCMC / SGLD method guide

| Method | Main Interface | Learn More |
| --- | --- | --- |
| Stochastic Gradient Langevin Dynamics | SGLDOptimizer, collect_posterior_samples, predict_with_samples_uq | Guide / API / Tutorial |
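Each SGLD update is an ordinary (mini-batch) gradient step on the log posterior plus Gaussian noise scaled by the step size; iterating it yields approximate posterior samples. A minimal NumPy sketch targeting a standard normal posterior, illustrative only and not the SGLDOptimizer internals:

```python
import numpy as np

def sgld_step(theta, grad_log_post, step_size, rng):
    """One SGLD update: half-step along the gradient of the log posterior,
    plus injected Gaussian noise with variance equal to the step size."""
    theta = np.asarray(theta, dtype=float)
    noise = rng.standard_normal(theta.shape) * np.sqrt(step_size)
    return theta + 0.5 * step_size * np.asarray(grad_log_post) + noise

# Target a standard normal posterior: grad log p(theta) = -theta
rng = np.random.default_rng(0)
theta, samples = np.zeros(1), []
for t in range(20000):
    theta = sgld_step(theta, -theta, 0.01, rng)
    if t >= 2000:                 # discard burn-in before collecting
        samples.append(theta[0])
```

The collected samples approximate draws from the posterior; predictive uncertainty then comes from running the model under each sampled parameter vector and aggregating, which is what the sample-based predict helpers do.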

MC Dropout

MC Dropout is the fastest neural-network UQ baseline in the package. Use it when you want uncertainty estimates with minimal changes to an existing dropout-enabled model.

Read more: MC Dropout method guide

| Method | Main Interface | Learn More |
| --- | --- | --- |
| MC Dropout | MCDropoutWrapper, predict_uq | Guide / API / Tutorial |
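The mechanism is simply to leave dropout active at test time, run several stochastic forward passes, and read predictive mean and variance off the resulting sample. A toy NumPy sketch with a hand-built dropout "model"; the names here are illustrative, not the wrapper's internals:

```python
import numpy as np

def mc_dropout_uq(forward_stochastic, x, n_samples=100, rng=None):
    """Keep dropout active at test time: run n_samples stochastic forward
    passes and return the predictive mean and variance across them."""
    rng = rng if rng is not None else np.random.default_rng()
    preds = np.stack([forward_stochastic(x, rng) for _ in range(n_samples)])
    return preds.mean(axis=0), preds.var(axis=0)

# Toy stochastic "model": one hidden layer with inverted dropout (p = 0.5)
w_in = np.array([[1.0, -1.0], [0.5, 2.0]])
w_out = np.array([1.0, 1.0])
p = 0.5

def forward(x, rng):
    h = x @ w_in
    mask = rng.random(h.shape) > p            # Bernoulli keep-mask
    return ((h * mask) / (1 - p)) @ w_out     # scaling preserves the expectation

mean, var = mc_dropout_uq(forward, np.array([1.0, 1.0]),
                          n_samples=4000, rng=np.random.default_rng(1))
```

Because each pass drops different units, the spread across passes is nonzero even for a fixed input, and that spread is the MC Dropout uncertainty estimate.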

Gaussian Processes

The GP family covers the broadest set of structured nonparametric models in the package: exact and sparse regression, GP classification, heteroscedastic noise, multi-task coupling, spectral kernels, and deep kernel learning.

Read more: Gaussian Processes method guide

| Method | Main Interface | Learn More |
| --- | --- | --- |
| Exact GP Regression | GaussianProcessRegressor | Guide / API / Tutorial |
| Sparse Variational GP | SparseGaussianProcessRegressor | Guide / API / Tutorial |
| GP Classifier | GaussianProcessClassifier | Guide / API / Tutorial |
| OvR GP Classifier | OneVsRestGaussianProcessClassifier | Guide / API / Tutorial |
| Heteroscedastic GP | HeteroscedasticGaussianProcessRegressor | Guide / API / Tutorial |
| Multi-task ICM GP | MultiTaskGaussianProcessRegressor | Guide / API / Tutorial |
| Spectral Mixture GP | SpectralMixtureGaussianProcessRegressor | Guide / API / Tutorial |
| Deep Kernel GP | DeepKernelGaussianProcessRegressor | Guide / API / Tutorial |
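Exact GP regression has a closed-form posterior: with kernel matrix K over training inputs and cross-covariances k*, the predictive mean is k*^T (K + s^2 I)^{-1} y and the variance is k** - k*^T (K + s^2 I)^{-1} k*. A compact NumPy sketch for 1-D inputs with an RBF kernel, illustrative only:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel on 1-D inputs (unit prior variance)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise_var=1e-2):
    """Closed-form exact GP regression posterior mean and variance."""
    K = rbf(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = rbf(x_train, x_test)
    L = np.linalg.cholesky(K)                          # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha                                # k*^T (K + s^2 I)^-1 y
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)                   # k** = 1 for this kernel
    return mean, var

x_train = np.array([0.0, 1.0, 2.0])
mean, var = gp_posterior(x_train, np.sin(x_train), np.array([0.5, 10.0]))
```

Note the characteristic behavior: variance shrinks near training points and reverts to the prior far from them. Sparse, multitask, and deep-kernel variants change how K is built or approximated, not this posterior formula.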

Choosing a Method

| Goal | Recommended Start | Why |
| --- | --- | --- |
| Fast UQ baseline for deep nets | MC Dropout | Minimal training changes, simple inference |
| Better local posterior around MAP | Laplace (diag or kron) | Strong uncertainty quality relative to cost |
| Full Bayesian weight-posterior approximation | VI (Bayes by Backprop) | End-to-end posterior learning |
| Posterior sampling perspective | SGLD | Direct sample-based uncertainty |
| Calibration-oriented nonparametric baseline | Exact/Sparse GP | Strong uncertainty behavior with kernel priors |

Unified Output (UQResult)

All major methods provide a standardized uncertainty output with fields:

  • mean
  • epistemic_var
  • aleatoric_var
  • total_var
  • probs, probs_var (classification)
  • metadata
Example usage with the Laplace wrapper:

```python
from deepuq.methods import LaplaceWrapper

la = LaplaceWrapper(model, likelihood="regression", hessian_structure="diag")
la.fit(train_loader)
uq = la.predict_uq(x_test, n_samples=100)
print(uq.mean.shape, uq.total_var.shape)
```
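For regression outputs, the variance fields are typically related by the law of total variance, i.e. total_var = epistemic_var + aleatoric_var. A tiny sketch of that decomposition (the decomposition is an assumption stated here for orientation, not a documented invariant):

```python
import numpy as np

# Hypothetical per-point variances from any regression UQ method
epistemic_var = np.array([0.04, 0.10])   # model (parameter) uncertainty
aleatoric_var = np.array([0.01, 0.05])   # irreducible data-noise uncertainty
total_var = epistemic_var + aleatoric_var
```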

Quick Install

pip install uqdeepnn

Benchmarks

Deep-UQ includes a multi-dataset benchmark runner for regression metrics and runtime comparisons:

python benchmarks/run_benchmarks.py --preset quick

Outputs:

  • benchmarks/results/results.csv
  • benchmarks/results/summary.md

Start Here