FastFlow

An unofficial PyTorch implementation of FastFlow: Unsupervised Anomaly Detection and Localization via 2D Normalizing Flows (Jiawei Yu et al.).
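The core idea can be sketched as follows: the 2D normalizing flow maps feature maps to a latent tensor whose likelihood under a standard-normal base distribution yields a per-location anomaly score. This is a minimal NumPy sketch of that scoring step, not this repo's actual code; the shapes and function name are assumptions for illustration.

```python
import numpy as np

def anomaly_map(z, log_det_jac):
    """Per-location anomaly scores from a 2D normalizing flow's output.

    Sketch (not this repo's implementation): `z` is the latent tensor the
    flow produces, shape (C, H, W); `log_det_jac` is the per-location
    log|det J| of the flow, shape (H, W). Under a standard-normal base
    distribution, log p = sum_c -0.5 * (z_c**2 + log(2*pi)) + log|det J|.
    Anomalous regions get low likelihood, so the score is the negative
    log-likelihood.
    """
    log_prob = -0.5 * (z ** 2 + np.log(2.0 * np.pi)).sum(axis=0) + log_det_jac
    return -log_prob  # shape (H, W); higher = more anomalous
```

In practice the (H, W) map is upsampled to the input resolution and, for image-level AUROC, reduced to a single score (e.g. its maximum).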

Since the paper does not give all implementation details, its results are difficult to reproduce exactly. This repo achieves AUROC scores very close to those reported, but several details remain unclear and required guesswork.

Many thanks to the community for the inspiring discussions. Feel free to comment, open issues, or submit PRs.

Installation

pip install -r requirements.txt

Data

We use MVTec-AD to verify the performance.

The dataset is organized in the following structure:

mvtec-ad
|- bottle
|  |- train
|  |- test
|  |- ground_truth
|- cable
|  |- train
|  |- test
|  |- ground_truth
...
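The layout above can be sanity-checked with a small helper. This is a sketch for verifying your local MVTec-AD copy, not part of this repo's CLI; the function name is an assumption.

```python
import os

def list_categories(root):
    """Return the dataset categories under `root` that follow the
    train/test/ground_truth layout shown above.

    Hypothetical helper for checking an MVTec-AD directory before
    training; categories missing any of the three subfolders are skipped.
    """
    categories = []
    for name in sorted(os.listdir(root)):
        cat_dir = os.path.join(root, name)
        if all(os.path.isdir(os.path.join(cat_dir, sub))
               for sub in ("train", "test", "ground_truth")):
            categories.append(name)
    return categories
```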

Train and eval

Take ResNet18 as an example:

# train
python main.py -cfg configs/resnet18.yaml --data path/to/mvtec-ad -cat [category]
# a folder named _fastflow_experiment_checkpoints will be created automatically to save checkpoints

# eval
python main.py -cfg configs/resnet18.yaml --data path/to/mvtec-ad -cat [category] --eval -ckpt _fastflow_experiment_checkpoints/exp[index]/[epoch#].pt

Performance

Since the training process is not stable, both the performance of the last (500th) epoch and of the best epoch are reported.

| AUROC (last/best) | wide-resnet-50 | resnet18 | DeiT | CaiT |
| --- | --- | --- | --- | --- |
| bottle | 0.987/0.989 | 0.975/0.979 | 0.931/0.959 | 0.926/0.976 |
| cable | 0.950/0.978 | 0.942/0.962 | 0.976/0.979 | 0.975/0.981 |
| capsule | 0.987/0.989 | 0.979/0.985 | 0.982/0.988 | 0.987/0.990 |
| carpet | 0.988/0.989 | 0.986/0.986 | 0.991/0.994 | 0.981/0.993 |
| grid | 0.991/0.993 | 0.973/0.985 | 0.965/0.980 | 0.968/0.970 |
| hazelnut | 0.957/0.984 | 0.922/0.963 | 0.982/0.990 | 0.981/0.992 |
| leather | 0.995/0.996 | 0.991/0.996 | 0.991/0.994 | 0.994/0.996 |
| metal nut | 0.968/0.986 | 0.950/0.966 | 0.980/0.988 | 0.977/0.984 |
| pill | 0.968/0.977 | 0.955/0.968 | 0.977/0.989 | 0.984/0.990 |
| screw | 0.969/0.987 | 0.952/0.957 | 0.990/0.990 | 0.991/0.993 |
| tile | 0.955/0.971 | 0.916/0.951 | 0.966/0.966 | 0.946/0.972 |
| toothbrush | 0.985/0.986 | 0.967/0.978 | 0.983/0.988 | 0.989/0.992 |
| transistor | 0.956/0.975 | 0.970/0.975 | 0.959/0.970 | 0.967/0.969 |
| wood | 0.948/0.964 | 0.894/0.954 | 0.960/0.963 | 0.950/0.959 |
| zipper | 0.980/0.987 | 0.969/0.979 | 0.966/0.974 | 0.972/0.984 |
| MEAN | 0.972/0.983 | 0.956/0.972 | 0.973/0.981 | 0.973/0.983 |
| Paper | 0.981 | 0.972 | 0.981 | 0.985 |