**New default branch:** branch `master` is no longer the default branch in this repository. Please use branch `dev` instead.
# Bayesian machine learning notebooks

This repository is a collection of notebooks about Bayesian machine learning. The following links display the notebooks via nbviewer to ensure proper rendering of formulas. *Update:* PyMC3 and PyMC4 implementations are now available for some notebooks, with more planned.
- Latent variable models, part 1: Gaussian mixture models and the EM algorithm. Introduction to the expectation-maximization (EM) algorithm and its application to Gaussian mixture models. Example implementation with plain NumPy/SciPy and scikit-learn for comparison (see also PyMC3 implementation).
- Latent variable models, part 2: Stochastic variational inference and variational autoencoders. Introduction to stochastic variational inference with a variational autoencoder as application example. Implementation with TensorFlow 2.x.
- Variational inference in Bayesian neural networks. Demonstrates how to implement and train a Bayesian neural network using a variational inference approach. Example implementation with Keras (see also PyMC4 implementation).
- Bayesian regression with linear basis function models. Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy as well as usage of scikit-learn for comparison (see also PyMC3 and PyMC4 implementations).
- Gaussian processes. Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with the scikit-learn and GPy libraries.
- Bayesian optimization. Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the scikit-optimize and GPyOpt libraries. Hyperparameter tuning serves as an application example.
- Deep feature consistent variational autoencoder. Describes how a perceptual loss can improve the quality of images generated by a variational autoencoder. Example implementation with Keras.
- Conditional generation via Bayesian optimization in latent space. Describes an approach for conditionally generating outputs with desired properties by performing Bayesian optimization in the latent space of a variational autoencoder. Example application implemented with Keras and GPyOpt.