
FedLAP-DP: Federated Learning by Sharing Differentially Private Loss Approximations


Note: this repo currently runs experiments sequentially. We are working on a parallel implementation based on the Flower framework.

FedLAP-DP: Federated Learning by Sharing Differentially Private Loss Approximations (PoPETS 2024)
Hui-Po Wang, Dingfan Chen, Raouf Kerkouche, Mario Fritz
A novel federated optimization approach overcoming privacy and non-IID challenges via gradient approximation

[Paper (PoPETS 2024)]


Quick Start

Follow the instructions below to quickly reproduce the main results of our work.

  1. Modify the --data-path argument in main.py and main_baseline.py accordingly (or specify it on the command line; see the example after these steps)

  2. Set up the environment

pip install -r requirements.txt
  3. Run the following scripts
bash script-fed/baseline-nondp.sh
bash script-fed/fedlap-nondp.sh
bash script-fed/baseline-dp.sh
bash script-fed/fedlap-dp.sh
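
If you prefer not to edit the files, the dataset location can also be passed on the command line. The invocation below is a sketch that assumes main.py accepts --data-path together with the -cfg option used later in this README; /path/to/datasets is a placeholder.

# override the dataset root without editing main.py (path is a placeholder)
python -W ignore main.py -cfg settings-fed/fedlap/cifar10-fedlap-median.yaml --data-path /path/to/datasets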

More Experiments

The ablation studies in the paper can be reproduced with the following commands, which are also provided in the scripts (e.g., script-fed/fedlap-nondp.sh).

python -W ignore main.py -cfg settings-fed/fedlap/cifar10-fedlap-median.yaml
python -W ignore main.py -cfg settings-fed/fedlap/cifar10-fedlap-max.yaml
python -W ignore main.py -cfg settings-fed/fedlap/cifar10-fedlap-fixed.yaml
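
To run all three ablation configs back to back, a short shell loop over the settings above also works (a convenience sketch, not part of the original scripts):

# run the median, max, and fixed variants sequentially
for variant in median max fixed; do
    python -W ignore main.py -cfg settings-fed/fedlap/cifar10-fedlap-${variant}.yaml
done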

Installation

We provide the environment setup we tested in requirements.txt. This repo may work with other package versions, but opacus must be 0.15.0.
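
If installing the full requirements file is not an option in your environment, at least pin opacus to the stated version (a minimal sketch; all other packages still follow requirements.txt):

# the repo requires exactly opacus 0.15.0 (see note above)
pip install opacus==0.15.0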


Citation

@article{wang2023fedlap,
  title={FedLAP-DP: Federated Learning by Sharing Differentially Private Loss Approximations},
  author={Wang, Hui-Po and Chen, Dingfan and Kerkouche, Raouf and Fritz, Mario},
  journal={Proc. Priv. Enhancing Technol.},
  year={2024}
}

Acknowledgement

This project is partially built upon PrivateSet and Dataset Condensation.