Inverse binomial sampling (IBS)

This repository contains MATLAB implementations and examples of IBS. For Python implementations, see PyIBS.

What is it?

Inverse binomial sampling (IBS) is a technique for obtaining unbiased, efficient estimates of the log-likelihood of a model via simulation [1].

The typical scenario is one in which you have a simulator, that is, a model from which you can randomly draw synthetic observations (for a given parameter vector), but whose log-likelihood you cannot evaluate analytically or numerically. In other words, IBS affords likelihood-based inference for models without explicit likelihood functions (also known as implicit models).
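In code, the core of IBS is simple: for each trial, draw from the simulator until a draw matches the observed response; if the match occurs on draw K, the unbiased estimate of that trial's log-likelihood is -(1 + 1/2 + ... + 1/(K-1)), which is 0 when K = 1. The sketch below illustrates this in Python (this repository is MATLAB; see PyIBS for a full Python implementation), using a hypothetical `ibs_log_likelihood_trial` helper and a toy Bernoulli simulator:

```python
import random

def ibs_log_likelihood_trial(simulate, observed, rng):
    """Unbiased IBS estimate of log p(observed) for a single trial.

    Draw from the simulator until a draw matches the observed
    response; if the match occurs on draw K, return
    -sum_{j=1}^{K-1} 1/j (which is 0 when K = 1).
    """
    k = 1
    while simulate(rng) != observed:
        k += 1
    return -sum(1.0 / j for j in range(1, k))

# Toy simulator: a Bernoulli response that is True with probability 0.5,
# so the true log-likelihood of observing True is log(0.5) ~ -0.693.
# Averaging many independent IBS estimates should approach this value.
rng = random.Random(1)
estimates = [ibs_log_likelihood_trial(lambda r: r.random() < 0.5, True, rng)
             for _ in range(20000)]
mean_est = sum(estimates) / len(estimates)
```

Averaging many such estimates for this toy simulator gives a value close to log(0.5) ≈ -0.693, illustrating unbiasedness; ibslike.m packages the same idea with many practical refinements (vectorized trials, repeats, variance estimates).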

Quickstart

The main function is ibslike.m, which computes an IBS estimate of the negative log-likelihood for a given simulator model and dataset. We recommend starting with the tutorial in ibs_example.m, which contains a full walkthrough with working example usages of IBS.

IBS is commonly used as part of an algorithm for maximum-likelihood estimation or Bayesian inference.
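Because each IBS call returns a noisy (though unbiased) log-likelihood estimate, the surrounding inference algorithm must tolerate stochastic objective values, which is why the paper recommends optimizers such as BADS. As a toy illustration (a Python sketch with hypothetical names, not this repository's MATLAB interface), the code below fits a Bernoulli model by evaluating an IBS negative log-likelihood, averaged over repeats to reduce variance, on a coarse parameter grid:

```python
import random

def ibs_nll(theta, data, rng, reps=20):
    """IBS estimate of the negative log-likelihood of a Bernoulli model
    (response True with probability theta), averaged over `reps`
    independent repeats -- averaging reduces variance while keeping
    the estimate unbiased."""
    total = 0.0
    for _ in range(reps):
        for obs in data:
            k = 1  # number of simulator draws until a match
            while (rng.random() < theta) != obs:
                k += 1
            total += sum(1.0 / j for j in range(1, k))
    return total / reps

rng = random.Random(0)
data = [rng.random() < 0.7 for _ in range(200)]  # ground truth: theta = 0.7

# A crude grid search stands in for a noise-tolerant optimizer
# (for real problems, use BADS or similar, as the paper recommends).
grid = [0.2, 0.45, 0.7, 0.95]
best_theta = min(grid, key=lambda th: ibs_nll(th, data, rng))
```

With enough data and repeats, the grid point nearest the true parameter attains the lowest estimated negative log-likelihood despite the noise in each evaluation.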

For practical recommendations and other questions, check out the FAQ on the IBS wiki.

Code

We describe below the files in this repository:

The code used to produce results in the paper [1] is available in the development repository here.

References

  1. van Opheusden*, B., Acerbi*, L. & Ma, W.J. (2020). Unbiased and efficient log-likelihood estimation with inverse binomial sampling. PLoS Computational Biology 16(12): e1008483. (* equal contribution) (link)

You can cite IBS in your work with something along the lines of:

We estimated the log-likelihood using inverse binomial sampling (IBS; van Opheusden, Acerbi & Ma, 2020), a technique that produces unbiased and efficient estimates of the log-likelihood via simulation.

If you use IBS in conjunction with Bayesian Adaptive Direct Search, as recommended in the paper, you could add:

We obtained maximum-likelihood estimates of the model parameters via Bayesian Adaptive Direct Search (BADS; Acerbi & Ma, 2017), a hybrid Bayesian optimization algorithm which affords stochastic objective evaluations.

and cite the appropriate paper:

  1. Acerbi, L. & Ma, W. J. (2017). Practical Bayesian optimization for model fitting with Bayesian Adaptive Direct Search. In Advances in Neural Information Processing Systems 30: 1834-1844.

Similarly, if you use IBS in combination with Variational Bayesian Monte Carlo, you should cite these papers:

  1. Acerbi, L. (2018). Variational Bayesian Monte Carlo. In Advances in Neural Information Processing Systems 31: 8222-8232.
  2. Acerbi, L. (2020). Variational Bayesian Monte Carlo with Noisy Likelihoods. In Advances in Neural Information Processing Systems 33: 8211-8222.

Besides formal citations, you can also demonstrate your appreciation for our work in other ways.

BibTeX

@article{vanOpheusden2020unbiased,
  title = {Unbiased and Efficient Log-Likelihood Estimation with Inverse Binomial Sampling},
  author = {van Opheusden, Bas and Acerbi, Luigi and Ma, Wei Ji},
  year = {2020},
  journal = {PLOS Computational Biology},
  volume = {16},
  number = {12},
  pages = {e1008483},
  publisher = {{Public Library of Science}},
  issn = {1553-7358},
  doi = {10.1371/journal.pcbi.1008483},
}

License

The IBS code is released under the terms of the MIT License.