Image Quality Assessment Toolbox

:rocket: Note: This repository is no longer updated. For the latest metrics, consider exploring: https://github.com/chaofengc/IQA-PyTorch

<img src="https://user-images.githubusercontent.com/34084019/124798339-cf824e80-df85-11eb-948f-c0612834c404.gif" width="70%">

:e-mail: Feel free to contact me at ryanxingql@gmail.com.

0. Archive

1. Content

| metric | class | description | better | range | ref |
| --- | --- | --- | --- | --- | --- |
| Peak signal-to-noise ratio (PSNR) | FR | The ratio of the maximum pixel intensity to the power of the distortion. | higher | [0, inf) | [WIKI] |
| Structural similarity (SSIM) index | FR | Local similarity of luminance, contrast and structure of two images. | higher | (?, 1] | [paper] [WIKI] |
| Multi-scale structural similarity (MS-SSIM) index | FR | Based on SSIM; combine luminance information at the highest resolution level with structure and contrast information at several down-sampled resolutions, or scales. | higher | (?, 1] | [paper] [code] |
| Learned perceptual image patch similarity (LPIPS) | FR | Obtain the L2 distance between AlexNet/SqueezeNet/VGG activations of the reference and distorted images; train a predictor to learn the mapping from the distance to the similarity score. Trainable. | lower | [0, ?) | [paper] [official repo] |
| Blind/referenceless image spatial quality evaluator (BRISQUE) | NR | Model Gaussian distributions of mean subtracted contrast normalized (MSCN) features; obtain 36-dim Gaussian parameters; train an SVM to learn the mapping from the feature space to the quality score. | lower | [0, ?) | [paper] |
| Natural image quality evaluator (NIQE) | NR | Mahalanobis distance between two multi-variate Gaussian models of 36-dim features from natural (training) and input sharp patches. | lower | [0, ?) | [paper] |
| Perception-based image quality evaluator (PIQE) | NR | Similar to NIQE, but block-wise. PIQE is less computationally efficient than NIQE, but it provides local measures of quality in addition to a global quality score. | lower | [0, 100] | [paper] |

Notations: FR = full-reference metric (requires a pristine reference image); NR = no-reference metric (works on the distorted image alone).
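
For the simplest FR metrics above, here is a minimal sketch assuming the scikit-image 0.18 and OpenCV packages installed in the dependency section below; the image paths are placeholders.

```python
import cv2
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder paths: replace with your own reference/distorted image pair.
ref = cv2.imread('ref.png')  # BGR, uint8, HWC
dst = cv2.imread('dst.png')

# PSNR: ratio of the peak intensity (255 for uint8) to the distortion power.
psnr = peak_signal_noise_ratio(ref, dst, data_range=255)

# SSIM: local similarity of luminance, contrast and structure.
# multichannel=True treats the last axis as color channels (scikit-image 0.18 API).
ssim = structural_similarity(ref, dst, data_range=255, multichannel=True)

print(f'PSNR: {psnr:.2f} dB, SSIM: {ssim:.4f}')
```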

Archived:

| metric | class | description | better | range | ref | where |
| --- | --- | --- | --- | --- | --- | --- |
| Ma et al. (MA) | NR | Extract features in DCT, wavelet and PCA domains; train a regression forest to learn the mapping from feature space to quality score. Very slow! | higher | [0, 10] | [paper] [official repo] | [v2] |
| Perceptual index (PI) | NR | 0.5 * ((10 - MA) + NIQE). Very slow due to MA! | lower | [0, ?) | [paper] [official repo] | [v2] |
| Fréchet inception distance (FID) | FR | Wasserstein-2 distance between two Gaussian models of InceptionV3 activations (fed with reference and distorted image data-sets, respectively). | lower | [0, ?) | [paper] [cleanfid repo] | [v2] |
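
The PI row is pure arithmetic on the MA and NIQE scores; a tiny helper (hypothetical, not part of this toolbox) spells out the formula:

```python
def perceptual_index(ma_score: float, niqe_score: float) -> float:
    """PI = 0.5 * ((10 - MA) + NIQE); lower is better."""
    return 0.5 * ((10.0 - ma_score) + niqe_score)

# e.g., MA = 8.2 and NIQE = 4.1 give PI = 0.5 * (1.8 + 4.1) = 2.95
print(f'{perceptual_index(8.2, 4.1):.2f}')  # 2.95
```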

Subjective quality metric(s):

| metric | description | better | range | ref |
| --- | --- | --- | --- | --- |
| Mean opinion score (MOS) | Image rating under certain standards. | higher | [0, 100] | [BT.500] |
| Degradation/difference/differential MOS (DMOS) | Difference between MOS values of reference and distorted images. | lower | [0, 100] | [ref1] [ref2] |

2. Dependency

conda create -n iqa python=3.7 -y && conda activate iqa
python -m pip install pyyaml opencv-python tqdm pandas

# for psnr/ssim
python -m pip install scikit-image==0.18.2

# for ms-ssim/lpips
# test under cuda 10.x
python -m pip install torch==1.6.0+cu101 torchvision==0.7.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html

# for lpips
python -m pip install lpips==0.1.3
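
After installing the packages above, an LPIPS distance can be computed roughly as follows. This is a minimal sketch built around the lpips package's LPIPS class, with placeholder image paths and a hand-rolled tensor conversion rather than this toolbox's own pipeline.

```python
import cv2
import lpips
import torch

# LPIPS model with the AlexNet backbone (net='squeeze' or 'vgg' also work).
loss_fn = lpips.LPIPS(net='alex')

def to_tensor(path):
    """Read an image and convert it to the NCHW, RGB, [-1, 1] float tensor LPIPS expects."""
    img = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
    ten = torch.from_numpy(img).permute(2, 0, 1).float() / 255.  # HWC uint8 -> CHW float in [0, 1]
    return (ten * 2 - 1).unsqueeze(0)  # [0, 1] -> [-1, 1], add batch dim

# Placeholder paths: reference vs. distorted image.
with torch.no_grad():
    dist = loss_fn(to_tensor('ref.png'), to_tensor('dst.png'))
print(float(dist))  # lower means perceptually more similar
```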

For BRISQUE and NIQE, MATLAB >= R2017b is required; for PIQE, MATLAB >= R2018b is required.

If you want to use main.py to run the MATLAB scripts, i.e., call MATLAB from Python, you should install the MATLAB Engine API for Python in the Conda environment (see the MATLAB documentation). My solution:

# given linux
cd "matlabroot/extern/engines/python"  # e.g., ~/Matlab/R2019b/extern/engines/python
conda activate iqa && python setup.py install
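
Once the engine is installed, the MATLAB metrics can be called directly from Python. A minimal sketch, assuming the Image Processing Toolbox functions niqe/brisque/piqe are available and using a placeholder image path:

```python
import matlab.engine

# Start a MATLAB session (requires the engine installed as above).
eng = matlab.engine.start_matlab()

# Read the image on the MATLAB side and feed it to the toolbox metrics.
img = eng.imread('dst.png')          # placeholder path
print('NIQE:', eng.niqe(img))        # MATLAB >= R2017b
print('BRISQUE:', eng.brisque(img))  # MATLAB >= R2017b
print('PIQE:', eng.piqe(img))        # MATLAB >= R2018b

eng.quit()
```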

3. Evaluation

  1. Edit opt.yml.
  2. Run: conda activate iqa && [CUDA_VISIBLE_DEVICES=0] python main.py -case div2k_qf10 [-opt opt.yml -clean]. Arguments in square brackets are optional.
  3. Output: CSV log files at ./logs/; see the sketch below for a quick way to inspect them.
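
The CSV logs can be inspected with pandas (already installed in the dependency section); the file name below is hypothetical and depends on the case you run.

```python
import pandas as pd

# Hypothetical file name: the actual CSV name depends on the case and metric you run.
df = pd.read_csv('./logs/div2k_qf10.csv')
print(df.head())                   # per-image records
print(df.mean(numeric_only=True))  # average over all images per numeric column
```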


4. License

We adopt the Apache License 2.0. For the licenses of the referenced works, please refer to their respective references.

If you find this repository helpful, you may cite:

@misc{2021xing3,
  author = {Qunliang Xing},
  title = {Image Quality Assessment Toolbox},
  howpublished = "\url{https://github.com/ryanxingql/image-quality-assessment-toolbox}",
  year = {2021},
  note = "[Online; accessed 11-April-2021]"
}