
DualAttentionAttack

This paper was accepted by CVPR 2021 (Oral).

This paper proposes a dual attention suppression attack approach, which exploits both the model attention and the human attention. Specifically, we distract the model attention to obtain stronger attack ability, and moreover, we evade the human attention to help improve the naturalness.
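As a rough illustration of how these two terms could be combined, below is a minimal PyTorch-style sketch. It is not the repository's implementation: the attention map source, the object mask, and the naturalness terms used here (content distance plus total variation) are simplifying assumptions for illustration only.

```python
import torch
import torch.nn.functional as F


def model_attention_loss(attention_map, obj_mask):
    # Distract the model: penalize attention that falls on the object region,
    # pushing the detector's focus away from the target. (Illustrative only.)
    return (attention_map * obj_mask).sum() / obj_mask.sum().clamp(min=1.0)


def human_attention_loss(texture, seed_content, tv_weight=0.1):
    # Evade human attention: keep the texture close to a seed content image
    # and spatially smooth, so it looks natural rather than noisy.
    content_term = F.mse_loss(texture, seed_content)
    tv_term = (texture[..., 1:, :] - texture[..., :-1, :]).abs().mean() \
            + (texture[..., :, 1:] - texture[..., :, :-1]).abs().mean()
    return content_term + tv_weight * tv_term


def dual_attention_loss(attention_map, obj_mask, texture, seed_content, lam=1.0):
    # Weighted sum of the two terms; the weighting scheme is an assumption.
    return model_attention_loss(attention_map, obj_mask) \
         + lam * human_attention_loss(texture, seed_content)
```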

Framework

<img src="media/framework.png" width="80%">

Running

Before running

You need:

Training

python train.py --datapath=[path to dataset] --content=[path to seed content] --canny=[path to edge mask]
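For example, assuming the dataset sits under a local data/ directory and the seed content image and its edge mask are prepared beforehand (all paths below are placeholders, adjust them to your setup):

python train.py --datapath=data/ --content=contents/seed.png --canny=contents/seed_edge.png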

Results will be stored in src/logs/, including:

Testing

python test.py --texture=[path to texture]
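For example, pointing --texture at a texture file produced during training (the path below is a placeholder):

python test.py --texture=src/logs/texture.npy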

Results will be stored in src/acc.txt.