Variational Bayesian Monte Carlo (VBMC) - v1.0.12

What is it?

VBMC is an approximate inference method designed to fit and evaluate computational models with a limited budget of potentially noisy likelihood evaluations (e.g., for computationally expensive models). Specifically, VBMC simultaneously computes:

- an approximate posterior distribution of the model parameters;
- an approximate lower bound on the log model evidence (log marginal likelihood), a metric used for Bayesian model selection.

Extensive benchmarks on both artificial test problems and a large number of real model-fitting problems from computational and cognitive neuroscience show that VBMC generally — and often vastly — outperforms alternative methods for sample-efficient Bayesian inference [1,2].

VBMC runs with virtually no tuning and is very easy to set up for your problem (especially if you are already familiar with BADS, our model-fitting algorithm based on Bayesian optimization).

Should I use VBMC?

VBMC is effective when:

- the model log-likelihood is a black box (e.g., gradients are unavailable);
- the likelihood is at least moderately expensive to compute, so that the budget of evaluations is limited;
- the model has a moderate number of continuous parameters (up to about ten);
- optionally, the log-likelihood evaluations may be noisy (e.g., estimated via simulation).

Conversely, if your model can be written analytically, you should exploit the powerful machinery of probabilistic programming frameworks such as Stan or PyMC3.

Installation

Download the latest version of VBMC as a ZIP file.

Quick start

The VBMC interface is similar to that of MATLAB optimizers. The basic usage is:

[VP,ELBO,ELBO_SD] = vbmc(FUN,X0,LB,UB,PLB,PUB);

with input parameters:

- FUN, a function handle to your model's target log density, that is the unnormalized log posterior (log likelihood plus log prior);
- X0, the starting point of the inference;
- LB and UB, hard lower and upper bounds for the parameters (can be -Inf and Inf);
- PLB and PUB, plausible lower and upper bounds, that is a box that ideally brackets a region of high posterior density.

The output parameters are:

- VP, the computed variational posterior, an approximation of the true posterior;
- ELBO, the variational evidence lower bound, an estimated lower bound on the log model evidence;
- ELBO_SD, the standard deviation of the ELBO estimate.

The variational posterior vp can be manipulated with functions such as vbmc_moments (compute posterior mean and covariance), vbmc_pdf (evaluate the posterior density), vbmc_rnd (draw random samples), vbmc_kldiv (compute the Kullback-Leibler divergence between two posteriors), and vbmc_mtv (compute the marginal total variation distance between two posteriors); see also this question.
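
Putting it all together, here is a minimal sketch of a full run on a toy problem. The target below (an unnormalized Gaussian log density) is only a stand-in for your model's log posterior; all names other than the toolbox functions are ours:

fun = @(x) -0.5 * sum((x ./ [1 2]).^2, 2);   % toy log joint, up to a constant

x0  = [0 0];          % starting point of the inference
lb  = [-Inf -Inf];    % hard bounds (can be infinite)
ub  = [Inf Inf];
plb = [-3 -6];        % plausible bounds: a box that ideally brackets
pub = [3 6];          %   a region of high posterior density

[vp, elbo, elbo_sd] = vbmc(fun, x0, lb, ub, plb, pub);

Xs = vbmc_rnd(vp, 1e5);            % draw samples from the variational posterior
[mu, sigma] = vbmc_moments(vp);    % posterior mean and covariance
fprintf('ELBO = %.2f +/- %.2f\n', elbo, elbo_sd);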

Next steps

To learn more, work through the VBMC tutorial, which walks through basic usage and several worked examples, and consult the FAQ.

For BADS users

If you already use Bayesian Adaptive Direct Search (BADS) to fit your models, setting up VBMC on your problem should be particularly simple; see here.
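
As an illustration, a hypothetical migration could look like the sketch below (loglike, logprior, and the bounds are toy stand-ins, not from the original page). The bounds carry over unchanged; the main difference is the target function:

% Toy stand-ins: replace with your own model and bounds.
loglike  = @(x) -0.5 * sum(x.^2);  % log likelihood
logprior = @(x) -log(10^2);        % uniform prior over the [-5,5]^2 box
x0 = [0 0]; lb = [-5 -5]; ub = [5 5]; plb = [-2 -2]; pub = [2 2];

% BADS *minimizes* an objective, typically the negative log likelihood:
xbest = bads(@(x) -loglike(x), x0, lb, ub, plb, pub);

% VBMC reuses the same bounds, but takes the log *joint* (log likelihood
% plus log prior) and returns a full posterior instead of a point estimate:
[vp, elbo, elbo_sd] = vbmc(@(x) loglike(x) + logprior(x), x0, lb, ub, plb, pub);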

How does it work?

VBMC combines two machine learning techniques in a novel way:

- variational inference, a framework to perform approximate Bayesian inference by optimizing over a family of candidate posterior distributions;
- Bayesian quadrature, a technique to estimate integrals (here, expectations over the posterior) from a small number of function evaluations.

VBMC iteratively builds an approximation of the true, expensive target posterior via a Gaussian process (GP), and it matches a variational distribution — an expressive mixture of Gaussians — to the GP.

This matching process entails optimization of the evidence lower bound (ELBO), that is, a lower bound on the log marginal likelihood (LML), also known as the log model evidence. Crucially, we estimate the ELBO via Bayesian quadrature, which is fast and does not require further evaluations of the true target posterior.
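
For reference, the standard identity behind the ELBO (notation ours, not from the original page; here θ denotes the model parameters, D the data, and q the variational posterior):

\mathrm{ELBO}(q) = \mathbb{E}_{q}\!\left[\log \frac{p(\mathcal{D} \mid \theta)\, p(\theta)}{q(\theta)}\right] = \log p(\mathcal{D}) - \mathrm{KL}\!\left(q(\theta) \,\big\|\, p(\theta \mid \mathcal{D})\right) \le \log p(\mathcal{D})

Since the KL divergence is nonnegative, maximizing the ELBO simultaneously tightens the bound on the LML and drives q toward the true posterior; in VBMC, the expectation over q is computed via Bayesian quadrature on the GP surrogate, so no extra evaluations of the true target are needed.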

In each iteration, VBMC uses active sampling to select which points to evaluate next in order to explore the posterior landscape and reduce uncertainty in the approximation.

VBMC demo

In the figure above, we show an example VBMC run on a "banana" function. The left panel shows the ground-truth target posterior density. The middle panel shows VBMC at work (contour plots of the variational posterior) across iterations. Red crosses are the centers of the Gaussian mixture used as variational posterior, whereas dots are points in the training set (black: previously sampled points; blue: points sampled in the current iteration). The right panel plots the estimated ELBO against the true log marginal likelihood (LML).

In the figure below, we show another example VBMC run on a "lumpy" distribution.

Another VBMC demo

See the VBMC paper for more details [1].

VBMC with noisy likelihoods

VBMC v1.0 (June 2020) introduced support for noisy models [2]. See the presentations section below for recorded talks that discuss the new version of VBMC. To run VBMC on a noisy problem, first ensure that your target function fun returns:

- as first output, a noisy estimate of the target log density (e.g., a noisy evaluation of the log likelihood, plus the log prior);
- as second output, an estimate of the standard deviation (SD) of that noisy log-density evaluation.

Noisy evaluations of the log-likelihood often arise from simulation-based models, for which a direct expression of the (log) likelihood is not available. We recommend Inverse Binomial Sampling (IBS) as a method that conveniently computes both an unbiased estimate of the log-likelihood and an estimate of its variability entirely through simulation; however, VBMC is compatible with any estimation technique.
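
For illustration, a noisy target with the required two outputs could look like the toy sketch below (in practice the estimate and its SD would come from your simulator or from IBS; noisy_target and the numbers are ours):

function [ll, ll_sd] = noisy_target(x)
% Toy noisy target: returns a noisy log-density estimate and its SD.
ll_exact = -0.5 * sum(x.^2);      % stand-in for the true log density
ll_sd = 1;                        % (estimated) SD of the evaluation
ll = ll_exact + ll_sd * randn;    % noisy evaluation
end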

Once you have set up fun as above, run VBMC by specifying that the target function is noisy:

OPTIONS.SpecifyTargetNoise = true;
[VP,ELBO,ELBO_SD] = vbmc(FUN,X0,LB,UB,PLB,PUB,OPTIONS);

For more information, see the VBMC FAQ and Example 6 in the VBMC tutorial.

In the figure below, we show the difference in performance between the original VBMC (old) and VBMC v1.0 (new) when dealing with noisy target evaluations.

VBMC2020 demo

Troubleshooting

The VBMC toolbox is under active development. It has been extensively tested in several benchmarks and published papers but, as with any approximate inference technique, you should double-check your results. See the FAQ for more information on diagnostics.

If you have trouble doing something with VBMC:

- consult the FAQ, which covers many common questions;
- check whether your problem has already been discussed in the GitHub issues;
- if not, open a new issue or contact us.

If you find a bug, or anything that needs correction, please let us know.

Presentations

Work related to VBMC has been presented at seminars in Oxford (UK), Bristol (UK), NYU (NY, USA), Helsinki (Finland), Brown University (RI, USA), NTNU (Trondheim, Norway), etc., and at several conferences. Recent presentations cover both VBMC papers (2018, 2020) and related work on simulator-based inference, with titles such as "Practical sample-efficient Bayesian inference for models with and without likelihoods".

References

  1. Acerbi, L. (2018). Variational Bayesian Monte Carlo. In Advances in Neural Information Processing Systems 31: 8222-8232. (paper + supplement on arXiv, NeurIPS Proceedings)
  2. Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In Advances in Neural Information Processing Systems 33: 8211-8222. (paper + supplement on arXiv, NeurIPS Proceedings)

Please cite both references if you use VBMC in your work: the 2018 paper introduced the framework, and the 2020 paper added a number of major improvements, including but not limited to support for noisy likelihoods. You can cite VBMC along the lines of:

We estimated approximate posterior distributions and approximate lower bounds to the model evidence of our models using Variational Bayesian Monte Carlo (VBMC; Acerbi, 2018, 2020). VBMC combines variational inference and active-sampling Bayesian quadrature to perform approximate Bayesian inference in a sample-efficient manner.

Besides formal citations, you can demonstrate your appreciation for VBMC in the following ways:

- star the VBMC repository on GitHub;
- spread the word about VBMC with colleagues who might find it useful;
- tell us about your model-fitting problems and successes.

You may also want to check out Bayesian Adaptive Direct Search (BADS), our method for fast Bayesian optimization.

Additional references

  1. Acerbi, L. (2019). An Exploration of Acquisition and Mean Functions in Variational Bayesian Monte Carlo. In Proc. Machine Learning Research 96: 1-10. 1st Symposium on Advances in Approximate Bayesian Inference, Montréal, Canada. (paper in PMLR)

BibTeX

@article{acerbi2018variational,
  title={{V}ariational {B}ayesian {M}onte {C}arlo},
  author={Acerbi, Luigi},
  journal={Advances in Neural Information Processing Systems},
  volume={31},
  pages={8222--8232},
  year={2018}
}

@article{acerbi2020variational,
  title={{V}ariational {B}ayesian {M}onte {C}arlo with noisy likelihoods},
  author={Acerbi, Luigi},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  pages={8211--8222},
  year={2020}
}

@article{acerbi2019exploration,
  title={An Exploration of Acquisition and Mean Functions in {V}ariational {B}ayesian {M}onte {C}arlo},
  author={Acerbi, Luigi},
  journal={Proceedings of Machine Learning Research},
  volume={96},
  pages={1--10},
  year={2019}
}

Acknowledgments

The Python port of VBMC was supported by the Academy of Finland Flagship programme: Finnish Centre for Artificial Intelligence FCAI.

License

VBMC is released under the terms of the BSD 3-clause license (BSD new).