SI-NI-FGSM

This repository contains code to reproduce results from the paper:

Nesterov Accelerated Gradient and Scale Invariance for Adversarial Attacks (ICLR 2020)

OpenReview: https://openreview.net/forum?id=SJlHwkBYDH
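For orientation, the core idea of the paper is to combine a Nesterov-style lookahead with gradients averaged over scaled copies of the input (scale invariance) inside an iterative FGSM loop. The sketch below is a minimal NumPy illustration of that update rule, not the repository's implementation; the function name `si_ni_fgsm` and the `grad_fn` callback are assumptions made for this example.

```python
import numpy as np

def si_ni_fgsm(x, grad_fn, eps=0.1, steps=10, mu=1.0, m=5):
    """Illustrative SI-NI-FGSM update (not the repo's code).

    x       -- clean input (NumPy array)
    grad_fn -- returns the gradient of the loss w.r.t. its input
    eps     -- L-infinity perturbation budget
    mu      -- momentum decay factor
    m       -- number of scaled copies x / 2**i
    """
    alpha = eps / steps              # per-iteration step size
    g = np.zeros_like(x)             # accumulated momentum gradient
    x_adv = x.copy()
    for _ in range(steps):
        # Nesterov lookahead: evaluate gradients at the anticipated point
        x_nes = x_adv + alpha * mu * g
        # Scale invariance: average gradients over m scaled copies
        grad = sum(grad_fn(x_nes / 2 ** i) for i in range(m)) / m
        # Momentum accumulation with L1-normalized gradient
        g = mu * g + grad / (np.abs(grad).sum() + 1e-12)
        # Sign step, projected back into the eps-ball around x
        x_adv = np.clip(x_adv + alpha * np.sign(g), x - eps, x + eps)
    return x_adv
```

With a toy loss whose gradient is the input itself (`grad_fn = lambda z: z`), each component of `x_adv` moves toward the boundary of the eps-ball, which is the expected behavior of a sign-based attack.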

REQUIREMENTS

EXPERIMENTS

The code consists of five Python scripts. Download the data and pretrained models before running the code, and place them in dev_data/ and models/, respectively.

Running the code

Example usage

After cloning the repository, you can run the provided attack code to generate adversarial examples and then evaluate the attack success rate.

python si_ni_fgsm.py
python simple_eval.py
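The evaluation step measures how often the attack fools the model. A common way to compute this is the fraction of originally correctly classified samples whose adversarial counterpart is misclassified; the sketch below is one such metric, written for this example rather than taken from simple_eval.py (the function name and array-based interface are assumptions).

```python
import numpy as np

def attack_success_rate(clean_preds, adv_preds, labels):
    """Fraction of originally-correct samples that the attack fools.

    clean_preds -- model predictions on clean inputs
    adv_preds   -- model predictions on adversarial inputs
    labels      -- ground-truth labels
    """
    correct = clean_preds == labels            # samples the model got right
    fooled = correct & (adv_preds != labels)   # ...that the attack flips
    return fooled.sum() / max(correct.sum(), 1)
```

Only samples the model classifies correctly in the first place count toward the denominator, so a weak clean-accuracy model does not inflate the reported success rate.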

Acknowledgments

The code builds on: Momentum Attack