News

An extension of this work has been released at https://github.com/Yunfan-Li/Twin-Contrastive-Learning. It significantly improves the clustering performance and supports multi-GPU training.

Contrastive Clustering (CC)

This is the code for the paper "Contrastive Clustering" (AAAI 2021).

<div align=center><img src="Figures/Figure1.png" width = "30%"/></div> <div align=center><img src="Figures/Figure2.png" width = "70%"/></div>

Dependencies
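
As a minimal sketch of the environment (package names are assumptions inferred from the usage below; consult the repository for the exact pinned versions), the code needs PyTorch and torchvision for training and data loading, PyYAML for the configuration file, and NumPy/SciPy/scikit-learn for the clustering metrics:

pip install torch torchvision pyyaml numpy scipy scikit-learn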

Usage

Configuration

The training and test options can both be edited in the configuration file "config/config.yaml".
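
A hedged sketch of the kind of options the file holds (all key names here are illustrative except model_path, which the Test section below relies on; consult the actual file for the full list):

```yaml
dataset: "CIFAR-10"   # which dataset to train on (illustrative key name)
batch_size: 256       # illustrative key name
epochs: 1000          # illustrative key name
model_path: "save"    # where checkpoints are written to and loaded from
```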

Training

After setting the configuration, start training by simply running

python train.py
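
Under the hood, training jointly optimizes an instance-level and a cluster-level contrastive loss over two augmented views of each image. As a hedged illustration of the instance-level part (an NT-Xent-style formulation; the temperature and details are assumptions, not necessarily the repository's exact implementation):

```python
import torch
import torch.nn.functional as F

def instance_contrastive_loss(z_i, z_j, temperature=0.5):
    """NT-Xent-style loss for two batches of augmented-view features (n, d)."""
    n = z_i.size(0)
    z = F.normalize(torch.cat([z_i, z_j], dim=0), dim=1)
    sim = z @ z.t() / temperature       # (2n, 2n) cosine similarities
    sim.fill_diagonal_(float("-inf"))   # a sample is never its own positive
    # The positive for view k is the other view of the same image, at offset n.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets.to(sim.device))
```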

Since the training strategy for STL-10 differs slightly from the others (its unlabeled split is used only for the instance-level contrastive head (ICH), while the train and test splits are used for both the ICH and the cluster-level contrastive head (CCH)), to start training on STL-10, run

python train_STL10.py
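
A hedged sketch of the epoch structure this implies (the model interface, loader names, and loss callables are illustrative, not the repository's actual API):

```python
def train_epoch_stl10(model, unlabeled_loader, labeled_loader,
                      instance_loss, cluster_loss, optimizer):
    # Unlabeled split: optimize the instance-level head (ICH) only.
    for x_i, x_j in unlabeled_loader:     # two augmented views of each image
        z_i, z_j, _, _ = model(x_i, x_j)  # cluster assignments go unused here
        loss = instance_loss(z_i, z_j)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    # Train/test splits: optimize both heads (ICH and CCH) jointly.
    for x_i, x_j in labeled_loader:
        z_i, z_j, c_i, c_j = model(x_i, x_j)
        loss = instance_loss(z_i, z_j) + cluster_loss(c_i, c_j)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```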

Test

Once training completes, the model is saved to the "model_path" specified in the configuration file. To test the trained model, run

python cluster.py
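
The paper reports clustering accuracy (ACC), NMI, and ARI. As a hedged illustration of how the accuracy metric is typically computed (not necessarily the repository's exact implementation), predicted cluster indices are matched to ground-truth classes with the Hungarian algorithm:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def clustering_accuracy(y_true, y_pred):
    """Best-match accuracy between predicted clusters and ground-truth labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = max(y_pred.max(), y_true.max()) + 1
    # Contingency matrix: rows are predicted clusters, columns are true classes.
    w = np.zeros((n, n), dtype=np.int64)
    for p, t in zip(y_pred, y_true):
        w[p, t] += 1
    # Hungarian matching finds the cluster-to-class mapping with maximal agreement.
    rows, cols = linear_sum_assignment(w.max() - w)
    return w[rows, cols].sum() / y_pred.size
```

NMI and ARI come directly from sklearn.metrics (normalized_mutual_info_score and adjusted_rand_score).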

For reference, we uploaded the pretrained model, which achieves the performance reported in the paper, to the "save" folder.

Dataset

CIFAR-10, CIFAR-100, and STL-10 are automatically downloaded by torchvision. Tiny-ImageNet can be downloaded from http://cs231n.stanford.edu/tiny-imagenet-200.zip. For ImageNet-10 and ImageNet-dogs, we provide their descriptions in the "dataset" folder.
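
For example, a hedged sketch of the automatic download (the root directory here is illustrative; the repository's data loaders may configure this differently):

```python
from torchvision import datasets

# CIFAR-10 is fetched to ./datasets on first use; later runs reuse the files.
train_set = datasets.CIFAR10(root="./datasets", train=True, download=True)
test_set = datasets.CIFAR10(root="./datasets", train=False, download=True)
```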

Citation

If you find CC useful in your research, please consider citing:

@inproceedings{li2021contrastive,
  title={Contrastive clustering},
  author={Li, Yunfan and Hu, Peng and Liu, Zitao and Peng, Dezhong and Zhou, Joey Tianyi and Peng, Xi},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={35},
  number={10},
  pages={8547--8555},
  year={2021}
}