<p align="center"> <img src="ICELUT/ICELUT_logo.png" height=110> </p>

# Taming Lookup Tables for Efficient Image Retouching (ECCV 2024)
:star: If ICELUT is helpful for your images or projects, please help star this repo. Thanks! :hugs:
## TODO
- Add training code and config files
- Add LUT and inference code
## Requirements and Dependencies
- python==3.7
- cuda==11.1
- numpy==1.21.5
- torch==1.8.0
- torchvision==0.9.0
- opencv-python==4.5.5.62
- matplotlib==3.5.1
- scipy==1.7.3
- GPU: NVIDIA GeForce RTX 3090
- CPU: Intel(R) Xeon(R) Platinum
## How to run the codes
### 1. Install the trilinear interpolation package
#### 1.1 For GPU users

- Check your CUDA version

```bash
$ ls /usr/local/
```

- Change the CUDA path in `setup.sh` to match your version

```bash
$ cd trilinear_cpp
$ vim setup.sh
```

so that it reads:

```bash
export CUDA_HOME=/usr/local/your_cuda_version && python setup.py install
```

- Install the package

```bash
$ sh setup.sh
```
#### 1.2 For CPU users

- Install the package

```bash
$ cd trilinear_cpp
$ python setup.py install
```
ATTENTION: If you follow the CPU install instructions on a machine that has a GPU, the default program will still install the GPU version. To install only the CPU version, follow these steps:
- Open the `setup.py` code

```bash
$ vim setup.py
```

- Substitute the code

```python
# line 5
if torch.cuda.is_available():
# substitute line 5 with:
if False:
```
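The effect of that one-line change can be sketched as follows (a paraphrase for illustration, not the actual `setup.py`):

```python
# Paraphrase of the dispatch in trilinear_cpp/setup.py: line 5 checks
# torch.cuda.is_available(), and hard-coding it to False forces the
# CPU extension to build even when a GPU is present.
def pick_extension(cuda_available: bool) -> str:
    if cuda_available:          # original line 5; replace with `if False:`
        return "CUDAExtension"  # trilinear built with nvcc
    return "CppExtension"       # CPU-only build

# After the substitution, the condition is always False:
print(pick_extension(False))
```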
### 2. Run inference on the demo images

```bash
$ python inference_demo.py
```
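For intuition, here is a minimal NumPy sketch of the per-pixel trilinear lookup that the compiled extension performs on whole images (the function name, array shapes, and the 17-point grid are assumptions for illustration; the real implementation is the C++/CUDA code installed above):

```python
import numpy as np

def trilinear_lookup(lut, rgb):
    """Apply a 3D LUT (shape [3, D, D, D], values in [0, 1]) to one RGB
    pixel via trilinear interpolation."""
    d = lut.shape[1]
    # Scale [0, 1] inputs to LUT grid coordinates.
    pos = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (d - 1)
    lo = np.minimum(pos.astype(int), d - 2)   # lower corner of the cell
    frac = pos - lo                           # fractional offset inside it
    out = np.zeros(3)
    # Blend the 8 surrounding lattice points.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                out += w * lut[:, lo[0] + dr, lo[1] + dg, lo[2] + db]
    return out

# Sanity check: an identity LUT leaves every color unchanged.
D = 17
grid = np.linspace(0.0, 1.0, D)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity = np.stack([r, g, b])            # shape (3, 17, 17, 17)
print(trilinear_lookup(identity, (0.25, 0.5, 0.75)))
```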
### 3. Training

#### Prepare the datasets

Please refer to 3D LUT to prepare the FiveK dataset.

#### Train the model

```bash
$ python train_ICELUT.py
```
### 4. Transfer to LUTs

```bash
$ python transfer2LUT.py
```
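The idea behind this step can be sketched as follows (a toy stand-in: the hypothetical `curve` replaces the trained network, while the real script tabulates ICELUT's networks over their quantized 8-bit, MSB/LSB-split inputs):

```python
import numpy as np

def bake_lut(f, n_levels=256):
    """Precompute f over every quantized input so that inference becomes
    a single table lookup instead of a network evaluation."""
    xs = np.arange(n_levels) / (n_levels - 1)
    return f(xs)  # table[i] == f(i / (n_levels - 1))

# Toy tone curve standing in for the trained network.
curve = lambda x: x ** 0.8
table = bake_lut(curve)

def infer(x8):
    """Inference = indexing into the baked table, no network involved."""
    return table[x8]

print(infer(128))
```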
### 5. Optional: Check the retouched results

The retouched images generated by running `inference_demo.py` will be saved in the default directory `./test_image_output`.
### 6. Optional: Check the total LUT size

```bash
$ cd ./ICELUT
$ du -sh *.npy
```

Output:

```
 52K    Basis_lut.npy
204K    classifier_int8.npy
164K    Model_lsb_fp32.npy
164K    Model_msb_fp32.npy
```
Note that we use 10 basis LUTs in the code rather than the 20 described in the paper, so the total storage is smaller (588 KB vs. 780 KB).
## Acknowledgement

This project is based on CLUT. Thanks for the awesome code!