
Self-PU: Self Boosted and Calibrated Positive-Unlabeled Training

[ICML 2020] Xuxi Chen*, Wuyang Chen*, Tianlong Chen, Ye Yuan, Chen Gong, Kewei Chen, Zhangyang Wang

Overview

We propose the Self-PU framework, which introduces self-paced, self-calibrated, and self-supervised learning to the positive-unlabeled (PU) learning field.

Method

Figure: overview of the Self-PU framework.
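
As a rough illustration of the self-paced component: the model periodically selects the unlabeled examples it is most confident about and treats them as pseudo-labeled. The sketch below is our own minimal rendering of that idea (all names are hypothetical; see train.py for the actual implementation):

import torch

def self_paced_select(model, unlabeled_loader, top_k, device="cpu"):
    # Score every unlabeled example with the current model and keep the
    # top-k most confident ones, pseudo-labeled by the model itself.
    model.eval()
    xs, probs = [], []
    with torch.no_grad():
        for x, _ in unlabeled_loader:
            p = torch.sigmoid(model(x.to(device)).squeeze(-1)).cpu()
            xs.append(x)
            probs.append(p)
    xs, probs = torch.cat(xs), torch.cat(probs)
    conf = (probs - 0.5).abs()                    # distance from the decision boundary
    idx = conf.topk(min(top_k, conf.numel())).indices
    return xs[idx], (probs[idx] > 0.5).float()    # selected inputs, pseudo labels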

Set-up

Environment

conda install pytorch==0.4.1 cuda92 torchvision -c pytorch
conda install matplotlib scikit-learn tqdm
pip install opencv-python
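
After installing, a quick sanity check (our addition, not part of the repo) confirms the versions and that CUDA is visible:

import torch, torchvision
print(torch.__version__)           # expected: 0.4.1
print(torchvision.__version__)
print(torch.cuda.is_available())   # True on a working CUDA 9.2 setup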

Preparing Data

Download the CIFAR-10 dataset and extract it into cifar/.
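
If you would rather fetch the data programmatically, torchvision can download CIFAR-10 into cifar/ for you (that this matches the directory layout the training scripts expect is an assumption on our part):

from torchvision import datasets

# Downloads cifar-10-python.tar.gz and extracts cifar-10-batches-py/ under cifar/.
datasets.CIFAR10(root="cifar", train=True, download=True)
datasets.CIFAR10(root="cifar", train=False, download=True)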

Evaluation

Pretrained Model

MNIST: Google Drive, Accuracy: 94.45%

CIFAR-10: Google Drive, Accuracy: 90.05%

Evaluation Code

MNIST:

python evaluation.py --model mnist.pth.tar 

CIFAR-10:

python evaluation.py --model cifar.pth.tar --datapath cifar --dataset cifar
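
If you want to score a checkpoint outside evaluation.py, a generic test loop looks like the sketch below; the checkpoint key and the 0/1 label convention are assumptions, so adapt them to the repo's definitions:

import torch

def accuracy(model, test_loader, device="cpu"):
    # Fraction of test examples whose predicted class matches the label,
    # assuming one logit per example (positive logit => positive class).
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in test_loader:
            pred = (model(x.to(device)).squeeze(-1) > 0).float().cpu()
            correct += (pred == y.float()).sum().item()
            total += y.numel()
    return correct / total

# checkpoint = torch.load("mnist.pth.tar", map_location="cpu")
# model.load_state_dict(checkpoint["state_dict"])  # key name is an assumption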

Training

Baseline

MNIST

python train.py --self-paced False --mean-teacher False 

CIFAR-10

python train.py --self-paced False --mean-teacher False --dataset cifar --datapath cifar
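
With both flags off, the script reduces to plain PU training. A common objective for this setting (and, to our understanding, the baseline Self-PU builds on) is the non-negative PU risk of Kiryo et al. (2017); a minimal sketch with the sigmoid surrogate loss:

import torch

def nnpu_loss(logits_p, logits_u, prior):
    # Non-negative PU risk with the sigmoid surrogate loss.
    # logits_p: outputs on labeled positives; logits_u: outputs on unlabeled data.
    # prior: pi_p, the assumed class prior of the positive class.
    risk_p_pos = torch.sigmoid(-logits_p).mean()   # positives classified as positive
    risk_p_neg = torch.sigmoid(logits_p).mean()    # positives classified as negative
    risk_u_neg = torch.sigmoid(logits_u).mean()    # unlabeled classified as negative
    neg_risk = risk_u_neg - prior * risk_p_neg
    # Clamping the negative-class risk at zero makes the estimator "non-negative".
    return prior * risk_p_pos + torch.clamp(neg_risk, min=0.0)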

Self-PU (without self-calibration)

Training with self-calibration is expensive. A cheaper alternative omits it:

MNIST

python train_2s2t.py --soft-label

CIFAR-10

python train_2s2t.py --dataset cifar --datapath cifar --soft-label
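
The 2s2t scripts appear to train student/teacher pairs in a mean-teacher style, where each teacher is an exponential moving average (EMA) of its student. A minimal EMA update, as an illustration rather than the repo's exact code:

def ema_update(teacher, student, alpha=0.999):
    # teacher, student: torch.nn.Module instances with identical architectures.
    # Teacher weights track an exponential moving average of the student's.
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.data.mul_(alpha).add_(s.data * (1.0 - alpha))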

Self-PU

MNIST

python train_2s2t_mix.py --soft-label

CIFAR-10

python train_2s2t_mix.py --dataset cifar --datapath cifar --soft-label
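
We read --soft-label as training the students against the teachers' soft (probabilistic) targets instead of hard pseudo labels. A generic consistency term of that kind (our sketch, not the repo's exact loss):

import torch
import torch.nn.functional as F

def soft_consistency(student_logits, teacher_logits):
    # Penalize disagreement between the student's probabilities and the
    # teacher's soft targets; the teacher is not updated through this loss.
    p_student = torch.sigmoid(student_logits)
    p_teacher = torch.sigmoid(teacher_logits).detach()
    return F.mse_loss(p_student, p_teacher)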

Reproduced Results

Seed    Accuracy on MNIST    Accuracy on CIFAR-10
3       93.87%               89.68%
13      94.68%               90.15%
23      94.44%               89.38%
33      93.84%               89.69%
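
The rows differ only in the random seed. To reproduce a row, fix all RNGs before training; we have not verified whether the training scripts expose a seed flag, so you may need to call something like this at startup:

import random
import numpy as np
import torch

def set_seed(seed):
    # Fix every RNG that affects training for a reproducible run.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)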