Introduction

This is the implementation of our paper An Upload-Efficient Scheme for Transferring Knowledge From a Server-Side Pre-trained Generator to Clients in Heterogeneous Federated Learning (accepted by CVPR 2024).

Keywords: pre-trained generative model, knowledge transfer, federated learning, data heterogeneity, model heterogeneity

Takeaway: We introduce FedKTL, a Federated Knowledge Transfer Loop (KTL) that (1) transfers common knowledge from a server-side pre-trained generator to small client models, regardless of the generator's pre-training datasets, and (2) shares task-specific knowledge among clients through federated learning.
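
To make the loop concrete, here is a minimal, runnable sketch of one FedKTL-style communication round. All names and dimensions below (client_model, adapter, the toy generator, and so on) are illustrative placeholders rather than the repository's actual API; see ./system for the real implementation.

import torch
import torch.nn as nn

torch.manual_seed(0)
C, FEAT_DIM, LATENT_DIM = 3, 16, 8           # classes, client feature dim, generator latent dim

client_model = nn.Sequential(nn.Linear(32, FEAT_DIM), nn.ReLU())  # a small, client-specific model
generator = nn.Linear(LATENT_DIM, 32)        # stand-in for the frozen server-side generator
for p in generator.parameters():
    p.requires_grad_(False)                  # the pre-trained generator is never updated
adapter = nn.Linear(FEAT_DIM, LATENT_DIM)    # small server-side mapping into the latent space

# 1) Each client uploads only C per-class feature prototypes (a tiny payload
#    compared with uploading model weights).
x, y = torch.randn(30, 32), torch.randint(0, C, (30,))
feats = client_model(x)
protos = torch.stack([feats[y == k].mean(0) for k in range(C)])

# 2) The server maps the prototypes into the generator's latent space and
#    returns image-latent pairs produced by the frozen generator.
latents = adapter(protos.detach())
images = generator(latents).detach()

# 3) Clients use the received pairs as an extra supervised task, pulling
#    local features toward the shared latents through a small projection head.
proj = nn.Linear(FEAT_DIM, LATENT_DIM)
opt = torch.optim.SGD(list(client_model.parameters()) + list(proj.parameters()), lr=0.1)
loss = nn.functional.mse_loss(proj(client_model(images)), latents.detach())
loss.backward()
opt.step()
print(f"transfer-task loss: {loss.item():.4f}")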

Figure: An example of our FedKTL for a 3-class classification task. Rounded and slender rectangles denote models and representations, respectively; dash-dotted and solid borders denote updating and frozen components, respectively; the segmented circle represents the ETF classifier.
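
The ETF (simplex equiangular tight frame) classifier in the figure can be constructed in closed form: its class vectors are unit-norm and pairwise maximally separated, with cosine similarity -1/(C-1). Below is a minimal NumPy sketch of that construction; the feature dimension and seed are illustrative, not the paper's configuration.

import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Return a (feat_dim, num_classes) simplex-ETF weight matrix.

    Columns are unit vectors whose pairwise cosine similarity is
    -1 / (num_classes - 1), the maximal equiangular separation.
    """
    assert feat_dim >= num_classes
    rng = np.random.default_rng(seed)
    # Random orthonormal basis P of shape (feat_dim, num_classes).
    p, _ = np.linalg.qr(rng.standard_normal((feat_dim, num_classes)))
    c = num_classes
    return np.sqrt(c / (c - 1)) * p @ (np.eye(c) - np.ones((c, c)) / c)

W = simplex_etf(num_classes=3, feat_dim=16)
# Off-diagonal cosine similarities are -1/(C-1) = -0.5 for C = 3.
print(np.round(W.T @ W, 3))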

Citation

@inproceedings{zhang2024upload,
  title={An Upload-Efficient Scheme for Transferring Knowledge From a Server-Side Pre-trained Generator to Clients in Heterogeneous Federated Learning},
  author={Zhang, Jianqing and Liu, Yang and Hua, Yang and Cao, Jian},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2024}
}

Datasets and Environments

Due to file size limitations, we only upload the statistics (config.json) of the CIFAR-10 dataset in the practical setting ($\beta=0.1$). Please refer to our repositories PFLlib and HtFLlib to generate all the datasets and create the required Python environment.
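
For context, the practical setting partitions class labels across clients with a Dirichlet distribution, where a smaller $\beta$ (e.g. 0.1) concentrates each class on a few clients and thus yields stronger heterogeneity. The helper below is an illustrative sketch of that partition idea, not PFLlib's actual generator script.

import numpy as np

def dirichlet_partition(labels, num_clients=20, beta=0.1, seed=0):
    """Split sample indices across clients with a Dirichlet(beta) label skew."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_idx = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = rng.permutation(np.where(labels == c)[0])
        # Proportion of class c assigned to each client.
        props = rng.dirichlet(np.full(num_clients, beta))
        cuts = (np.cumsum(props) * len(idx)).astype(int)[:-1]
        for cid, part in enumerate(np.split(idx, cuts)):
            client_idx[cid].extend(part.tolist())
    return client_idx

# Example: 10-class toy labels standing in for CIFAR-10 targets.
parts = dirichlet_partition(np.random.randint(0, 10, 50000), num_clients=20, beta=0.1)
print([len(p) for p in parts[:5]])  # highly unbalanced client sizes at beta = 0.1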

System

Training and Evaluation

All code is stored in ./system. To train and evaluate, run the following commands:

cd ./system
sh run_me.sh