Deep-UQ Documentation
Purpose: Deep-UQ helps you train predictive models that know when they are uncertain, so decisions can use both predictions and confidence.
Unified uncertainty quantification toolkit in PyTorch. Build and compare Bayes by Backprop, Laplace, SGLD, MC Dropout, and Gaussian Process methods in one package.
Developer First
Clear wrappers, examples, notebooks, and API-level primitives for direct integration into existing PyTorch code.
Method Breadth
Deep Bayesian VI, Laplace backends, MCMC sampling, stochastic dropout, and exact/sparse Gaussian processes.
Reproducible Workflows
Tutorial-driven structure with runnable scripts under examples/ and notebooks under notebooks/.
Why This Library Exists
Deep-UQ is built for engineers and researchers who need uncertainty-aware modeling without stitching together multiple libraries. It provides one package with shared workflows for:
- model training,
- posterior or predictive uncertainty estimation,
- method-to-method comparison,
- tutorials and examples that run from the same codebase.
What You Get
- Five UQ families behind one shared interface.
- Consistent regression/classification uncertainty outputs through UQResult.
- Native Laplace backends for diag, fisher_diag, lowrank_diag, block_diag, kron, and full.
- Full Gaussian Process suite: exact, sparse, classification, heteroscedastic, multitask, spectral, and deep-kernel variants.
- Reproducible tutorials, examples, and benchmark scripts.
Method Summary
| Method Family | Method Name | Implemented Variants | Main Wrapper / Class | Tutorial |
|---|---|---|---|---|
| Variational Inference | Bayes by Backprop | Mean-field VI for neural nets | BayesianLinear, vi_elbo_step, predict_vi_uq | notebooks/BayesByBackprop_Tutorial.ipynb |
| Laplace Approximation | Laplace | diag, fisher_diag, lowrank_diag, block_diag, kron, full | LaplaceWrapper, predict_uq | notebooks/laplace/Laplace_HessianComparison_Tutorial.ipynb |
| MCMC | Stochastic Gradient Langevin Dynamics | Posterior sampling with SGLD | SGLDOptimizer, predict_with_samples_uq | notebooks/SGLD_Tutorial.ipynb |
| MC Dropout | MC Dropout | Stochastic dropout inference | MCDropoutWrapper, predict_uq | notebooks/MC_Dropout_Tutorial.ipynb |
| Gaussian Processes | GaussianProcessRegressor, SparseGaussianProcessRegressor, GaussianProcessClassifier, OneVsRestGaussianProcessClassifier, HeteroscedasticGaussianProcessRegressor, MultiTaskGaussianProcessRegressor, SpectralMixtureGaussianProcessRegressor, DeepKernelGaussianProcessRegressor | Exact, sparse, classification (binary + OvR), heteroscedastic, multi-task ICM, spectral mixture, deep kernel, and kernel composition (RBF, Matérn, RQ, Periodic, Linear) | GP kernel classes + predict_uq | notebooks/gp/GP_Model_Comparison.ipynb (see all under notebooks/gp/) |
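The GP row above covers many variants, but at its core an exact GP regressor computes a closed-form posterior mean and variance. The following is a minimal from-scratch NumPy sketch of that computation with an RBF kernel; it illustrates the math only and does not use Deep-UQ's GaussianProcessRegressor API.

```python
# Exact GP regression sketch (illustrative, not Deep-UQ's implementation).
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(a, b) = s^2 * exp(-(a-b)^2 / (2 l^2))."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Closed-form GP posterior mean and pointwise variance at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha                              # posterior mean
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)        # posterior covariance
    return mean, np.diag(cov)

x = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * x)
mean, var = gp_posterior(x, y, np.array([0.5]))
# mean is close to sin(pi) = 0; var shrinks near the training data.
```

Kernel composition (sums and products of RBF, Matérn, etc., as listed in the table) changes only how K is built; the posterior formulas stay the same.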
Choosing a Method
| Goal | Recommended Start | Why |
|---|---|---|
| Fast UQ baseline for deep nets | MC Dropout | Minimal training changes, simple inference |
| Better local posterior around MAP | Laplace (diag or kron) | Good uncertainty quality relative to compute cost |
| Full Bayesian weight posterior approximation | VI (Bayes by Backprop) | End-to-end posterior learning |
| Posterior sampling perspective | SGLD | Direct sample-based uncertainty |
| Calibration-oriented nonparametric baseline | Exact/Sparse GP | Strong uncertainty behavior with kernel priors |
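As the table suggests, MC Dropout is the lightest-weight starting point: keep dropout active at prediction time and treat the spread across stochastic forward passes as epistemic uncertainty. Below is a toy NumPy sketch of that idea; the single-layer "network" and every name in it are illustrative, not Deep-UQ's MCDropoutWrapper.

```python
# MC Dropout sketch (illustrative): T stochastic passes, variance = uncertainty.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 1))   # fixed "trained" weights of a toy linear layer
x = np.ones((1, 16))           # one input example

def forward(x, p=0.5):
    """One stochastic pass: drop units with prob p (inverted dropout scaling)."""
    mask = (rng.random(x.shape) >= p) / (1 - p)
    return (x * mask) @ W

preds = np.stack([forward(x) for _ in range(500)])  # T = 500 stochastic passes
mean = preds.mean(axis=0)           # predictive mean over passes
epistemic_var = preds.var(axis=0)   # spread across passes = model uncertainty
```

In a real PyTorch model the same effect comes from leaving dropout layers in training mode during inference while the rest of the network stays in eval mode.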
Unified Output (UQResult)
All major methods provide a standardized uncertainty output with these fields:
- mean
- epistemic_var
- aleatoric_var
- total_var
- probs, probs_var (classification only)
- metadata
```python
from deepuq.methods import LaplaceWrapper

la = LaplaceWrapper(model, likelihood="regression", hessian_structure="diag")
la.fit(train_loader)
uq = la.predict_uq(x_test, n_samples=100)
print(uq.mean.shape, uq.total_var.shape)
```
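For intuition, a UQResult-style container can be sketched as a small dataclass in which total variance is the sum of the epistemic and aleatoric parts. The field names follow the list above, but this class is hypothetical and not Deep-UQ's actual implementation.

```python
# Hypothetical UQResult-like container (illustrative only).
from dataclasses import dataclass, field
import numpy as np

@dataclass
class UQResultSketch:
    mean: np.ndarray
    epistemic_var: np.ndarray   # uncertainty from the model / posterior
    aleatoric_var: np.ndarray   # irreducible observation noise
    metadata: dict = field(default_factory=dict)

    @property
    def total_var(self) -> np.ndarray:
        # Standard decomposition: total = epistemic + aleatoric.
        return self.epistemic_var + self.aleatoric_var

res = UQResultSketch(
    mean=np.array([0.2]),
    epistemic_var=np.array([0.03]),
    aleatoric_var=np.array([0.05]),
)
print(res.total_var)  # → [0.08]
```

This decomposition is what lets the five method families report comparable numbers: each method estimates the two variance components differently, but the output shape is shared.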
Quick Install
Benchmarks
Deep-UQ includes a multi-dataset benchmark runner for regression metrics and runtime comparisons.
Outputs:
- benchmarks/results/results.csv
- benchmarks/results/summary.md