Beyond the Pixel


This repository contains the calibration code (Calibration) and the deep learning code (Learning Tasks) for the paper Beyond the Pixel: a Photometrically Calibrated HDR Dataset for Luminance and Color Prediction

Requirements

pip install pytorch_lightning matplotlib skylibs glob2 opencv-contrib-python omegaconf natsort configargparse

Additionally, MATLAB is required for the HDR-VDP-3 metric.

Calibration

To run the calibration process for your own camera setup as it is done in the paper, follow the README in the Calibration folder.

Learning Tasks

The pipeline used for learning depends on the desired task (see the pipeline figure in the repository):

Per-pixel luminance outputs an HDR image and the scale needed to bring it to absolute luminance (a short sketch of this recovery follows the list).

Per-pixel color predicts the temperature map directly.

Planar illuminance generates only the illuminance scalar.
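
To make the per-pixel luminance outputs concrete, here is a minimal sketch of how the two predictions combine into absolute luminance; the function and variable names, array shapes, and the cd/m² unit are assumptions for illustration, not the repository's code.

```python
import numpy as np

# Minimal sketch (not the repository's code): combine the per-pixel luminance
# task's two outputs -- a relative HDR prediction and a scalar scale -- into
# absolute luminance. Names, shapes, and units are illustrative assumptions.
def to_absolute_luminance(hdr_pred: np.ndarray, scale_pred: float) -> np.ndarray:
    """Multiply the relative HDR prediction by the predicted scale (e.g. to cd/m^2)."""
    return hdr_pred * scale_pred

hdr_pred = np.random.rand(256, 512, 3).astype(np.float32)  # dummy relative HDR image
scale_pred = 1250.0                                         # dummy predicted scale
absolute = to_absolute_luminance(hdr_pred, scale_pred)
```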

Dataset preparation

If working with the full dataset (available at http://hdrdb.com/indoor-hdr-photometric/), it needs to be split into train/test/val sets, inpainted, and rescaled to a manageable size. The following script automates this to match the setup used in the paper.

python prepare_dataset.py [path_to_full_dataset]
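
As a quick sanity check on the prepared data, the sketch below loads one of the resulting HDR files with skylibs' ezexr reader; the train/val/test folder layout and the file name are assumptions.

```python
import numpy as np
from ezexr import imread  # EXR reader shipped with the skylibs package

# Minimal sketch: inspect one prepared sample. The folder layout and file
# name below are assumptions -- adjust them to your prepared dataset.
img = imread("dataset/train/sample_0001.exr")
print(img.shape, img.dtype, float(np.nanmax(img)))
```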

Training

For convenience, 3 config files are provided in configs/, one for each task.

python train.py --config [config_file]

Testing

When testing, it is best to point --config to the config file generated by the training script (by default checkpoints/[name]/lightning_logs/version_[x]/config.txt).

The test.py script generates the inference predictions.

python test.py --config [config_file]

The metrics.py script computes the metrics and generates visualisations from the inference predictions.

python metrics.py --config [config_file]
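
For intuition only, the sketch below shows the kind of per-pixel comparison such metrics make, here a mean relative error between predicted and ground-truth luminance maps; this is not necessarily one of the metrics computed by metrics.py, and the names and formula are assumptions.

```python
import numpy as np

# Illustrative only -- NOT necessarily a metric computed by metrics.py.
# Mean relative error between predicted and ground-truth luminance maps.
def mean_relative_error(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-6) -> float:
    return float(np.mean(np.abs(pred - gt) / (gt + eps)))
```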

Fine-tuning

When fine-tuning, the config file must be modified to increase the n_epoch property. As when testing, it is best to point --config to the config file generated by the training script (by default checkpoints/[name]/lightning_logs/version_[x]/config.txt).
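
Since the only required change is a larger n_epoch, a minimal sketch for editing the generated config file is shown below; the path is hypothetical and the "key = value" / "key: value" text format is an assumption, so adapt it to your run.

```python
import re
from pathlib import Path

# Minimal sketch (not part of the repository): raise n_epoch in a generated
# config file before fine-tuning. Assumes a "key = value" or "key: value"
# plain-text config; the path and the target value 200 are examples only.
config_path = Path("checkpoints/my_experiment/lightning_logs/version_0/config.txt")
text = config_path.read_text()
text = re.sub(r"^(n_epoch\s*[:=]\s*)\d+", r"\g<1>200", text, flags=re.MULTILINE)
config_path.write_text(text)
```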

python fine-tune.py --config [config_file]

Pre-trained weights

If you wish to test or fine-tune the weights used in the paper, you can download them and place them in the checkpoints/ folder.

https://hdrdb-public.s3.valeria.science/indoor_photometric/[Experiment_name].zip

| Experiment_name | Mode | In paper | Link |
|---|---|---|---|
| Luminance_linear | Luminance | Table 1 | link |
| Luminance_gamma | Luminance | Table 1 | link |
| Luminance_noise | Luminance | Table 1 | link |
| Luminance_quantized | Luminance | Table 1 | link |
| Luminance_LDR | Luminance | Table 1 | link |
| Luminance_LDR_fine_tune | Luminance | Table 3 | link |
| Temperature_WB_rand_augment | Temperature | Figure 7 | link |
| Temperature_WB_augment_theta_fine_tune | Temperature | Table 3 | link |
| illum_hemi_HDR | Illuminance | Table 2 | link |
| illum_hemi_LDR_scale | Illuminance | Table 2 | link |
| illum_hemi_LDR | Illuminance | Table 2 | link |
| illum_120_HDR | Illuminance | Table 2 | link |
| illum_120_LDR_scale | Illuminance | Table 2 | link |
| illum_120_LDR | Illuminance | Table 2 | link |
| illum_60_HDR | Illuminance | Table 2 | link |
| illum_60_LDR_scale | Illuminance | Table 2 | link |
| illum_60_LDR | Illuminance | Table 2 | link |
| illum_rand_HDR | Illuminance | Table 2 | link |
| illum_rand_LDR_scale | Illuminance | Table 2 | link |
| illum_rand_LDR | Illuminance | Table 2 | link |
| illum_120_LDR_theta | Illuminance | Table 3 | link |
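
A minimal sketch for downloading and unpacking one of these experiments into checkpoints/ follows; the experiment name and URL pattern come from above, while the assumption that the archive extracts to the folder structure test.py expects is untested.

```python
import urllib.request
import zipfile
from pathlib import Path

# Minimal sketch: fetch one pre-trained experiment (name from the table above)
# and unzip it into checkpoints/. Whether the archive's internal layout matches
# what test.py expects is an assumption -- check after extraction.
experiment = "Luminance_linear"
url = f"https://hdrdb-public.s3.valeria.science/indoor_photometric/{experiment}.zip"
dest = Path("checkpoints")
dest.mkdir(exist_ok=True)

zip_path = dest / f"{experiment}.zip"
urllib.request.urlretrieve(url, str(zip_path))
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(dest)
```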

Paper

Beyond the Pixel: a Photometrically Calibrated HDR Dataset for Luminance and Color Prediction
Christophe Bolduc, Justine Giroux, Marc Hébert, Claude Demers, Jean-François Lalonde
International Conference on Computer Vision (ICCV), 2023
Project page / Paper / Dataset / BibTeX