# CLUB
This repository contains the source code for our ICML 2020 paper: *CLUB: A Contrastive Log-ratio Upper Bound of Mutual Information*.
CLUB is a sample-based estimator of mutual information (MI). It not only provides a reliable MI upper bound estimate, but also serves as a learning critic to effectively minimize the correlation between representations in deep models.
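Concretely, with a variational network $q_\theta(y \mid x)$ approximating the conditional $p(y \mid x)$, the sampled variational form of CLUB estimates the bound from sample pairs $\{(x_i, y_i)\}_{i=1}^{N} \sim p(x, y)$ as

$$
\hat{I}_{\mathrm{vCLUB}}(x; y) = \frac{1}{N}\sum_{i=1}^{N}\log q_\theta(y_i \mid x_i) - \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N}\log q_\theta(y_j \mid x_i).
$$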
## Mutual Information Estimation
We provide toy simulations in `mi_estimation.ipynb` to show the estimation performance of CLUB and other MI estimators. The code in this section is written in PyTorch (a recent release is assumed).
The implementation of our CLUB estimator, along with the baselines (NWJ, MINE, InfoNCE, VUB, L1Out), is in `mi_estimators.py`. VUB and L1Out are implemented in the variational forms proposed in our paper.
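For reference, here is a minimal sketch of a CLUB estimator with a diagonal-Gaussian variational family, in the spirit of the implementation in `mi_estimators.py`; the class name `CLUBSketch` and its layer sizes are illustrative, not the exact API of that file.

```python
import torch
import torch.nn as nn

class CLUBSketch(nn.Module):
    """Sketch of a CLUB estimator with a diagonal Gaussian q(y|x)."""

    def __init__(self, x_dim, y_dim, hidden_size=64):
        super().__init__()
        # MLPs producing the mean and log-variance of q(y|x).
        self.p_mu = nn.Sequential(nn.Linear(x_dim, hidden_size), nn.ReLU(),
                                  nn.Linear(hidden_size, y_dim))
        self.p_logvar = nn.Sequential(nn.Linear(x_dim, hidden_size), nn.ReLU(),
                                      nn.Linear(hidden_size, y_dim), nn.Tanh())

    def loglikeli(self, x, y):
        # Log-likelihood of y under q(y|x), up to constants; maximizing
        # this fits the variational net to the true conditional p(y|x).
        mu, logvar = self.p_mu(x), self.p_logvar(x)
        return (-(mu - y) ** 2 / logvar.exp() - logvar).sum(dim=1).mean()

    def forward(self, x, y):
        # CLUB bound: E_{p(x,y)}[log q(y|x)] - E_{p(x)p(y)}[log q(y|x)],
        # with the marginal term averaged over all cross pairs in the batch.
        mu, logvar = self.p_mu(x), self.p_logvar(x)
        var = logvar.exp()
        positive = (-(mu - y) ** 2 / (2 * var)).sum(dim=1)             # log q(y_i | x_i)
        diffs = mu.unsqueeze(1) - y.unsqueeze(0)                       # [B, B, y_dim]
        negative = (-diffs ** 2 / (2 * var.unsqueeze(1))).sum(dim=2)   # log q(y_j | x_i)
        return (positive - negative.mean(dim=1)).mean()
```

Keeping `loglikeli` maximized during training keeps $q_\theta(y \mid x)$ close to $p(y \mid x)$, which is what makes the value returned by `forward` behave as an upper bound.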
Follow the steps in `mi_estimation.ipynb` to demonstrate the MI estimation performance of the different MI estimators.
## Mutual Information Minimization
We test the MI minimization performance of our CLUB estimator on two real-world tasks: Information Bottleneck (IB) and Domain Adaptation (DA). Instructions to reproduce the IB and DA results are provided in the folders `MI_IB` and `MI_DA`, respectively.

In addition, we provide another toy example in `mi_minimization.ipynb` to visualize how our MI minimization algorithm works under a multivariate Gaussian setup.
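As a rough illustration of how the minimization is wired up (not the exact code in the notebook or the task folders), the loop below alternates between fitting the variational net $q_\theta(y \mid x)$ and minimizing the resulting CLUB estimate with respect to an encoder; the encoder and the random data here are placeholders.

```python
import torch

x_dim, y_dim, batch_size = 10, 10, 128
encoder = torch.nn.Linear(x_dim, y_dim)   # stand-in for the model being decorrelated
mi_net = CLUBSketch(x_dim, y_dim)         # the sketch class from above
enc_opt = torch.optim.Adam(encoder.parameters(), lr=1e-4)
mi_opt = torch.optim.Adam(mi_net.parameters(), lr=1e-3)

for step in range(2000):
    x = torch.randn(batch_size, x_dim)    # placeholder batch

    # Step 1: fit q(y|x) by maximizing log-likelihood on the current pairs.
    mi_opt.zero_grad()
    (-mi_net.loglikeli(x, encoder(x).detach())).backward()
    mi_opt.step()

    # Step 2: minimize the CLUB upper bound w.r.t. the encoder
    # (in practice this term is added to the main task loss with a weight).
    enc_opt.zero_grad()
    mi_net(x, encoder(x)).backward()
    enc_opt.step()
```

Detaching the encoder output in the first step keeps the variational update from back-propagating into the model being trained.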
## Citation
Please cite our ICML 2020 paper if you find the code useful.
```bibtex
@inproceedings{cheng2020club,
  title={{CLUB}: A Contrastive Log-ratio Upper Bound of Mutual Information},
  author={Cheng, Pengyu and Hao, Weituo and Dai, Shuyang and Liu, Jiachang and Gan, Zhe and Carin, Lawrence},
  booktitle={International Conference on Machine Learning},
  pages={1779--1788},
  year={2020},
  organization={PMLR}
}
```