Bayesian Machine Learning
This is a collection of notebooks on Bayesian methods for machine learning, such as Bayesian regression, Gaussian processes, Bayesian optimization, and variational inference in Bayesian neural networks. The goal of this project is to create an educational resource that makes Bayesian machine learning methods accessible and understandable to a wide audience.

Each of the following notebooks covers a single topic and contains an introduction, the mathematical basics, and a simple implementation.
- Bayesian regression with linear basis function models
  Introduction to Bayesian linear regression. Implementation with plain NumPy and scikit-learn. See also the PyMC3 implementation.
- Gaussian processes
  Introduction to Gaussian processes for regression. Implementation with plain NumPy/SciPy as well as with scikit-learn and GPy.
- Gaussian processes for classification
  Introduction to Gaussian processes for classification. Implementation with plain NumPy/SciPy as well as with scikit-learn.
- Sparse Gaussian processes
  Introduction to sparse Gaussian processes using a variational approach. Example implementation with JAX.
- Bayesian optimization
  Introduction to Bayesian optimization. Implementation with plain NumPy/SciPy as well as with the libraries scikit-optimize and GPyOpt. Hyperparameter tuning as application example.
- Variational inference in Bayesian neural networks
  Demonstrates how to implement a Bayesian neural network and variational inference over its weights. Example implementation with Keras.
- Reliable uncertainty estimates for neural network predictions
  Uses noise contrastive priors for Bayesian neural networks to obtain more reliable uncertainty estimates for out-of-distribution (OOD) data. Implemented with TensorFlow 2 and TensorFlow Probability.
- Latent variable models, part 1: Gaussian mixture models and the EM algorithm
  Introduction to the expectation-maximization (EM) algorithm and its application to Gaussian mixture models. Implementation with plain NumPy/SciPy and scikit-learn. See also the PyMC3 implementation.
- Latent variable models, part 2: Stochastic variational inference and variational autoencoders
  Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation with TensorFlow 2.x.
- Deep feature consistent variational autoencoder
  Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example implementation with Keras.
- Conditional generation via Bayesian optimization in latent space
  Describes an approach for conditionally generating outputs with desired properties by doing Bayesian optimization in a latent space learned by a variational autoencoder. Example application implemented with Keras and GPyOpt.
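To give a flavor of the "plain NumPy" style used throughout the notebooks, here is a minimal sketch of Bayesian linear regression: with a Gaussian prior over the weights and Gaussian noise, the weight posterior is available in closed form. The function and variable names (`posterior`, `m_N`, `S_N`, `alpha`, `beta`) are illustrative choices, not necessarily those used in the notebook.

```python
import numpy as np

def posterior(Phi, t, alpha, beta):
    """Closed-form weight posterior for Bayesian linear regression.

    Prior:      w ~ N(0, alpha^-1 I)
    Likelihood: t ~ N(Phi w, beta^-1 I)
    Returns the posterior mean m_N and covariance S_N.
    """
    S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

# Noisy samples from the line y = 0.5 * x
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
t = 0.5 * x + rng.normal(scale=0.1, size=x.shape)

# Design matrix with a bias and a linear basis function
Phi = np.stack([np.ones_like(x), x], axis=1)
m_N, S_N = posterior(Phi, t, alpha=2.0, beta=100.0)
```

The posterior mean `m_N` recovers slope and intercept close to the generating values, and `S_N` quantifies the remaining uncertainty about them.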
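The Gaussian process regression notebook can likewise be sketched in a few lines of NumPy: condition a GP with an RBF kernel on training data and compute the posterior predictive mean and covariance. Function names and hyperparameter choices here are illustrative, not taken from the notebook.

```python
import numpy as np

def rbf(X1, X2, length=1.0, sigma_f=1.0):
    """RBF (squared exponential) kernel between two sets of points."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return sigma_f**2 * np.exp(-0.5 / length**2 * sq)

def gp_posterior(X_s, X_train, y_train, noise=1e-2):
    """GP posterior predictive mean and covariance at test inputs X_s."""
    K = rbf(X_train, X_train) + noise**2 * np.eye(len(X_train))
    K_s = rbf(X_train, X_s)
    K_ss = rbf(X_s, X_s)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    cov = K_ss - K_s.T @ K_inv @ K_s
    return mu, cov

X_train = np.array([[-2.0], [0.0], [2.0]])
y_train = np.sin(X_train).ravel()

# At the training inputs, the posterior mean nearly interpolates the
# observations and the posterior variance is close to the noise level.
mu, cov = gp_posterior(X_train, X_train, y_train)
```

In practice one would use a Cholesky factorization instead of `np.linalg.inv` for numerical stability; the notebooks discuss such details.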
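At the core of the Bayesian optimization notebook is an acquisition function built on a surrogate model's predictions. Expected improvement (EI) is one common choice; a minimal sketch for maximization, with illustrative parameter names, looks like this:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best, xi=0.01):
    """Expected improvement over the best observed value f_best.

    mu, sigma: surrogate (e.g. GP) predictive mean and standard
    deviation at candidate points; xi trades off exploration vs.
    exploitation. Assumes maximization.
    """
    sigma = np.maximum(sigma, 1e-9)  # avoid division by zero
    imp = mu - f_best - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

# Two candidates with equal uncertainty: the one with the higher
# predicted mean has the higher expected improvement.
ei = expected_improvement(np.array([0.0, 1.0]), np.array([1.0, 1.0]), f_best=0.5)
```

Bayesian optimization then proceeds by repeatedly evaluating the objective where the acquisition function is maximal and refitting the surrogate.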