# Amplifying Membership Exposure via Data Poisoning

This folder contains the implementation of *Amplifying Membership Exposure via Data Poisoning*.
## Requirements

To install requirements:

```shell
pip install -r requirements.txt
```
## Dataset

- MNIST (part of TensorFlow).
- CIFAR-10 (part of TensorFlow).
- STL-10. Please download the dataset from the official link and uncompress the downloaded file (you will get the `stl10_binary` folder). Then move the `stl10_binary` folder into `dataset`.
- CelebA. Please download `img_align_celeba` and `list_attr_celeba.csv` from this link. Then, create a subfolder `celeba` under `dataset`, and move the file `list_attr_celeba.csv` and the folder `img_align_celeba` into `celeba`.
- PatchCamelyon. Please download `camelyonpatch_level_2_split_test_x.h5` and `camelyonpatch_level_2_split_test_y.h5` from this link. Then, create a subfolder `patchcamelyon` under `dataset`, and move the two files into `patchcamelyon`.
The structure of the `dataset` folder should be:

```
dataset
+-- stl10_binary
|   +-- train_X.bin
|   +-- train_Y.bin
|   +-- test_X.bin
|   +-- test_Y.bin
|   ...
+-- celeba
|   +-- img_align_celeba
|   +-- list_attr_celeba.csv
+-- patchcamelyon
|   +-- camelyonpatch_level_2_split_test_x.h5
|   +-- camelyonpatch_level_2_split_test_y.h5
```
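After placing the files, a quick sanity check can confirm the layout. The sketch below is illustrative and not part of the repository; the expected paths simply mirror the tree above (MNIST and CIFAR-10 are loaded through TensorFlow and need no files here):

```python
from pathlib import Path

# Manually downloaded entries expected under dataset/ (see the tree above).
EXPECTED = [
    "stl10_binary/train_X.bin",
    "stl10_binary/train_Y.bin",
    "stl10_binary/test_X.bin",
    "stl10_binary/test_Y.bin",
    "celeba/img_align_celeba",
    "celeba/list_attr_celeba.csv",
    "patchcamelyon/camelyonpatch_level_2_split_test_x.h5",
    "patchcamelyon/camelyonpatch_level_2_split_test_y.h5",
]

def missing_files(root="dataset"):
    """Return the expected entries that are not yet under `root`."""
    base = Path(root)
    return [p for p in EXPECTED if not (base / p).exists()]

# Report anything still missing before running the attacks.
for p in missing_files():
    print("missing:", p)
```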
## Basic Usage

The file `poisoning_attack.py` runs our poisoning attacks.

Example:

```shell
python poisoning_attack.py --target_class 0 \
                           --dataset cifar10 \
                           --encoder xception \
                           --seed_amount 1000 \
                           --attack_type clean_label \
                           --device_no 0
```
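To attack several target classes in turn, the invocation above can be wrapped in a small driver. This is a hypothetical sketch, not part of the repository; the flags are taken from the example command, and the class count assumes CIFAR-10:

```python
import subprocess  # only needed if you actually launch the runs

def build_cmd(target_class, dataset="cifar10", encoder="xception",
              seed_amount=1000, attack_type="clean_label", device_no=0):
    """Assemble one poisoning_attack.py command line (flags as in the example)."""
    return ["python", "poisoning_attack.py",
            "--target_class", str(target_class),
            "--dataset", dataset,
            "--encoder", encoder,
            "--seed_amount", str(seed_amount),
            "--attack_type", attack_type,
            "--device_no", str(device_no)]

# Preview a sweep over all ten CIFAR-10 classes on GPU 0.
for c in range(10):
    print(" ".join(build_cmd(c)))
    # subprocess.run(build_cmd(c), check=True)  # uncomment to launch
```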
## Example Attack

We provide example attacks in `attack_example.sh`. Run it directly:

```shell
./attack_example.sh
```
## Evaluation

To evaluate our poisoning attacks, first run:

```shell
./poisoning_models.sh
```

to generate the poisoning datasets and poisoned models. Then, run:

```shell
./evaluate_attack.sh
```

to get the evaluation results.
## Citation

```bibtex
@inproceedings{CSSWZ22,
  author    = {Yufei Chen and Chao Shen and Yun Shen and Cong Wang and Yang Zhang},
  title     = {{Amplifying Membership Exposure via Data Poisoning}},
  booktitle = {{Annual Conference on Neural Information Processing Systems (NeurIPS)}},
  publisher = {NeurIPS},
  year      = {2022}
}
```