CVPR 2022: RSCFed: Random Sampling Consensus Federated Semi-supervised Learning

Introduction

This is the official PyTorch implementation of the CVPR 2022 paper "RSCFed: Random Sampling Consensus Federated Semi-supervised Learning".

Figure: RSCFed pipeline.

Preparation

  1. Create conda environment:

     conda create -n RSCFed python=3.8
     conda activate RSCFed
    
  2. Install dependencies:

     pip install -r requirements.txt
     
    

The SVHN and CIFAR-100 datasets will be downloaded automatically once training starts.

Run the code

  1. Train a model for each dataset. To reproduce the reported results on the SVHN dataset:
python train_main.py --dataset=SVHN \
	--model=simple-cnn \
	--unsup_num=9 \
	--batch_size=64 \
	--lambda_u=0.02 \
	--opt=sgd \
	--base_lr=0.03 \
	--unsup_lr=0.021 \
	--max_grad_norm=5 \
	--resume \
	--from_labeled \
	--rounds=1000 \
	--meta_round=3 \
	--meta_client_num=5 \
	--w_mul_times=6 \
	--sup_scale=100 \
	--dist_scale=1e4

For the CIFAR-100 dataset:

python train_main.py --dataset=cifar100 \
	--model=simple-cnn \
	--unsup_num=9 \
	--batch_size=64 \
	--lambda_u=0.02 \
	--opt=sgd \
	--base_lr=0.03 \
	--unsup_lr=0.021 \
	--max_grad_norm=5 \
	--resume \
	--from_labeled \
	--rounds=1000 \
	--meta_round=3 \
	--meta_client_num=5 \
	--w_mul_times=6 \
	--sup_scale=100 \
	--dist_scale=1e4

For the ISIC 2018 dataset, please find the warm-up model here. To reproduce the reported result:

python train_main.py --dataset=skin \
	--model=resnet18 \
	--unsup_num=9 \
	--batch_size=12 \
	--lambda_u=0.02 \
	--opt=sgd \
	--base_lr=2e-3 \
	--unsup_lr=1e-3 \
	--max_grad_norm=5 \
	--rounds=800 \
	--meta_round=3 \
	--meta_client_num=5 \
	--w_mul_times=200 \
	--pre_sz=250 \
	--input_sz=224 \
	--dist_scale=0.01 \
	--sup_scale=0.01 \
	--resume \
	--from_labeled

To reproduce all the reported results, modify the path to the warm-up model accordingly. Warm-up models are trained only on the labeled clients.
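Conceptually, the warm-up stage is ordinary supervised training restricted to the labeled clients, run before any federated rounds begin. A minimal toy sketch of that idea, using a linear model and synthetic data (entirely illustrative; this is not the repository's code):

```python
import numpy as np

def warm_up(labeled_clients, lr=0.1, epochs=100):
    """Toy warm-up: fit a shared linear model w by gradient descent,
    iterating only over labeled clients' (X, y) data, as a stand-in
    for the supervised pre-training stage."""
    dim = labeled_clients[0][0].shape[1]
    w = np.zeros(dim)
    for _ in range(epochs):
        for X, y in labeled_clients:  # unlabeled clients never appear here
            grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
            w -= lr * grad
    return w

# Two synthetic "labeled clients" sharing one ground-truth model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = [(X, X @ true_w) for X in (rng.normal(size=(32, 2)) for _ in range(2))]
w = warm_up(clients)
```

The resulting `w` would then serve as the initialization (the "warm-up model") that `--resume --from_labeled` loads before federated training.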

Parameters

| Parameter | Description |
| --- | --- |
| `dataset` | dataset used |
| `model` | backbone architecture |
| `unsup_num` | number of unlabeled clients |
| `batch_size` | batch size |
| `lambda_u` | ratio of the loss on unlabeled clients |
| `opt` | optimizer |
| `base_lr` | learning rate on labeled clients |
| `unsup_lr` | learning rate on unlabeled clients |
| `max_grad_norm` | maximum gradient norm (gradient clipping) |
| `resume` | resume training from a checkpoint |
| `from_labeled` | whether to resume from a warm-up model |
| `rounds` | maximum number of global communication rounds |
| `meta_round` | number of sub-consensus models |
| `meta_client_num` | number of clients in each subset |
| `w_mul_times` | scaling factor for labeled clients |
| `sup_scale` | scaling weight for labeled clients when computing model distance |
| `dist_scale` | scaling weight when computing model distance |

Evaluation

For the SVHN and CIFAR-100 datasets, the best models are placed in final_model. For the ISIC 2018 dataset, please find the best model here.

Use the following command to generate the claimed results:

python test.py --dataset=SVHN \
	--batch_size=5 \
	--model=simple-cnn

For CIFAR-100:

python test.py --dataset=cifar100 \
	--batch_size=5 \
	--model=simple-cnn

For ISIC 2018:

python test.py --dataset=skin \
	--batch_size=5 \
	--model=resnet18 \
	--pre_sz=250 \
	--input_sz=224

For other datasets, modify the model file path and the "dataset" and "model" arguments accordingly.

Citation

If this code is useful for your research, please consider citing:

@inproceedings{liang2022rscfed,
title={RSCFed: Random Sampling Consensus Federated Semi-supervised Learning},
author={Liang, Xiaoxiao and Lin, Yiqun and Fu, Huazhu and Zhu, Lei and Li, Xiaomeng},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
year={2022}
}