<img src="https://img.shields.io/badge/Contributions-Welcome-278ea5" alt="Contributions welcome"/>

PyTorch Implementation of Federated Learning Baselines

PyTorch-Federated-Learning provides various federated learning baselines implemented with the PyTorch framework. The codebase follows a client-server architecture and is designed to be straightforward to read and extend.
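
For orientation, the sketch below shows what one client-server communication round looks like in a FedAvg-style baseline. This is a minimal illustration, not the repository's actual API: the helper names `client_update` and `server_aggregate` are hypothetical.

```python
# Minimal sketch of one FedAvg-style communication round (hypothetical
# helpers): each client trains a local copy of the global model, and the
# server averages the resulting weights element-wise.
import copy
import torch

def client_update(model, loader, epochs=1, lr=0.01):
    """Train a local copy of the global model on one client's data."""
    local = copy.deepcopy(model)
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    local.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(local(x), y).backward()
            opt.step()
    return local.state_dict()

def server_aggregate(global_model, client_states):
    """Replace global weights with the mean of the client weights."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in client_states]).mean(dim=0)
    global_model.load_state_dict(avg)
```

In the actual baselines, the aggregation rule on the server side is what mainly distinguishes one method from another.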

If you find this repository useful, please let me know with a star :star:. Thank you!


Installation

Install requirements

Run `pip install -r requirements.txt` to install the required packages.

Federated Dataset Preprocessing

This preprocessing step divides the entire dataset among a specified number of clients to simulate a federated setting. Based on the number of classes present in each local dataset, the data are split into non-IID partitions with label distribution skew.
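
As an illustration, a label-distribution-skew split can be implemented along the following lines. This is a minimal sketch assuming a flat array of labels; the repository's own preprocessing may differ in details.

```python
# Sketch of a label-distribution-skew split (illustrative, not the
# repository's actual implementation): each client receives samples
# from at most `classes_per_client` label classes.
import numpy as np

def split_noniid(labels, num_clients, classes_per_client, seed=0):
    """Partition sample indices so each client sees a limited set of classes."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    all_classes = np.unique(labels)
    # Assign each client a random subset of classes.
    client_classes = [rng.choice(all_classes, classes_per_client, replace=False)
                      for _ in range(num_clients)]
    for c in all_classes:
        idx = rng.permutation(np.where(labels == c)[0])
        owners = [i for i in range(num_clients) if c in client_classes[i]]
        if not owners:  # ensure every class is assigned to someone
            owners = [int(rng.integers(num_clients))]
        # Split this class's samples evenly among the clients that own it.
        for owner, shard in zip(owners, np.array_split(idx, len(owners))):
            client_indices[owner].extend(shard.tolist())
    return client_indices
```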

Execute the Federated Learning Baselines

Test Run

Hyperparameters are defined in a YAML file, e.g. `./config/test_config.yaml`. Run the training with this configuration:

`python fl_main.py --config "./config/test_config.yaml"`
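
For context, an entry point such as fl_main.py typically consumes the YAML file along the following lines. This is a sketch of the loading pattern only; the actual hyperparameter keys are defined in `test_config.yaml` itself.

```python
# Sketch of how a YAML config is typically loaded by an entry point
# like fl_main.py (the hyperparameter names are defined in the YAML
# file, not here).
import argparse
import yaml  # provided by the PyYAML package

parser = argparse.ArgumentParser()
parser.add_argument("--config", type=str, default="./config/test_config.yaml")
args = parser.parse_args()

with open(args.config) as f:
    cfg = yaml.safe_load(f)  # returns a plain dict of hyperparameters

print(cfg)  # e.g. number of clients, communication rounds, learning rate
```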

Evaluation Procedures

Please run `python postprocessing/eval_main.py -rr 'results'` to plot the test accuracy and training loss over training epochs or communication rounds. Note that the curve labels in the figure are the names of the result files.
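
To illustrate what such an evaluation plot looks like, here is a hedged sketch assuming each result file is a JSON record with a per-round `test_accuracy` list; the actual file format consumed by eval_main.py may differ.

```python
# Sketch of an accuracy-vs-round plot, one curve per result file
# (assumption: results stored as JSON with a "test_accuracy" list).
import glob
import json
import os
import matplotlib.pyplot as plt

for path in glob.glob("results/*.json"):
    with open(path) as f:
        record = json.load(f)
    # Legend label is the result file name, as noted above.
    plt.plot(record["test_accuracy"], label=os.path.basename(path))

plt.xlabel("Communication round")
plt.ylabel("Test accuracy")
plt.legend()
plt.show()
```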

Citation

If you use this repository, please consider citing our recent work on FedBEVT and ResFed:

@ARTICLE{song2023fedbevt,
  author={Song, Rui and Xu, Runsheng and Festag, Andreas and Ma, Jiaqi and Knoll, Alois},
  journal={IEEE Transactions on Intelligent Vehicles},
  title={FedBEVT: Federated Learning Bird's Eye View Perception Transformer in Road Traffic Systems},
  year={2023},
  pages={1-12},
  doi={10.1109/TIV.2023.3310674}}

@ARTICLE{song2022resfed,
  author={Song, Rui and Zhou, Liguo and Lyu, Lingjuan and Festag, Andreas and Knoll, Alois},
  journal={IEEE Internet of Things Journal},
  title={ResFed: Communication Efficient Federated Learning With Deep Compressed Residuals},
  year={2023},
  pages={1-15},
  doi={10.1109/JIOT.2023.3324079}}