
[ECCV 2024] LightenDiffusion: Unsupervised Low-Light Image Enhancement with Latent-Retinex Diffusion Models [Paper]

Hai Jiang1,5, Ao Luo2,5, Xiaohong Liu4, Songchen Han1, Shuaicheng Liu3,5

1.Sichuan University, 2.Southwest Jiaotong University,

3.University of Electronic Science and Technology of China,

4.Shanghai Jiao Tong University, 5.Megvii Technology

Pipeline
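The full pipeline is illustrated in the paper. As background, the method builds on the classical Retinex assumption that an image splits into reflectance and illumination; the sketch below only illustrates that classical assumption and is not the learned latent decomposition network used in LightenDiffusion:

```python
import torch

def retinex_decompose(image: torch.Tensor, eps: float = 1e-4):
    """Toy Retinex-style split of an image (or feature map) into an
    illumination map L and a reflectance map R such that image ~ R * L.
    This mirrors the classical assumption only, not the paper's method.
    """
    # Crude illumination estimate: per-pixel maximum over channels,
    # clamped to avoid division by zero; kept simple for illustration.
    illumination = image.max(dim=0, keepdim=True).values.clamp(min=eps)
    reflectance = image / illumination
    return reflectance, illumination

# Usage: decompose a random "low-light" image of shape (C, H, W).
low_light = torch.rand(3, 256, 256) * 0.2
R, L = retinex_decompose(low_light)
print((R * L - low_light).abs().max())  # reconstruction error, ~0
```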

Dependencies

pip install -r requirements.txt

Download the raw training and evaluation datasets

Paired datasets

LOL dataset: Chen Wei, Wenjing Wang, Wenhan Yang, and Jiaying Liu. "Deep Retinex Decomposition for Low-Light Enhancement". BMVC, 2018. [Baiduyun (extracted code: sdd0)] [Google Drive]

LSRW dataset: Jiang Hai, Zhu Xuan, Ren Yang, Yutong Hao, Fengzhu Zou, Fang Lin, and Songchen Han. "R2RNet: Low-light Image Enhancement via Real-low to Real-normal Network". Journal of Visual Communication and Image Representation, 2023. [Baiduyun (extracted code: wmrr)]

Unpaired datasets

Please refer to [Project Page of RetinexNet].

Pre-trained Models

You can download our pre-trained model from [Google Drive] or [Baidu Yun (extracted code: cjzk)].
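Where the checkpoint should be placed depends on the training/evaluation configs; the snippet below is only a hedged sketch for inspecting a downloaded PyTorch checkpoint (the path pretrained/lightendiffusion.pth is an assumption, not the actual release file name):

```python
import torch

# Hypothetical path; substitute the file you actually downloaded above.
ckpt_path = "pretrained/lightendiffusion.pth"

# map_location="cpu" lets you inspect the weights without a GPU.
state = torch.load(ckpt_path, map_location="cpu")

# Checkpoints are usually either a raw state_dict or a dict wrapping one.
state_dict = state.get("state_dict", state) if isinstance(state, dict) else state
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))
```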

How to train?

You need to modify datasets/dataset.py slightly for your environment, and then run:

python train.py  
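What exactly needs changing depends on your setup; typically it is only the dataset root paths. The snippet below is a hypothetical illustration of that kind of edit (the variable names are assumptions, not the actual contents of datasets/dataset.py):

```python
# Hypothetical excerpt showing the kind of edit usually needed in
# datasets/dataset.py: point the hard-coded dataset roots at your own data.
import os

# Assumed names; check the actual file for the real variables/arguments.
LOW_LIGHT_DIR = "/path/to/your/low_light_images"
NORMAL_LIGHT_DIR = "/path/to/your/normal_light_images"

assert os.path.isdir(LOW_LIGHT_DIR), "update LOW_LIGHT_DIR to your local dataset path"
assert os.path.isdir(NORMAL_LIGHT_DIR), "update NORMAL_LIGHT_DIR to your local dataset path"
```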

How to test?

python evaluate.py
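If you want to sanity-check the enhanced results against reference images yourself, independently of the repository's own evaluation code, a standard PSNR/SSIM computation with scikit-image looks like this (the result and ground-truth directory names are placeholders):

```python
import os
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder directories: enhanced outputs and ground-truth references.
result_dir, gt_dir = "results", "ground_truth"

psnr_vals, ssim_vals = [], []
for name in sorted(os.listdir(result_dir)):
    pred = np.array(Image.open(os.path.join(result_dir, name)).convert("RGB"))
    gt = np.array(Image.open(os.path.join(gt_dir, name)).convert("RGB"))
    psnr_vals.append(peak_signal_noise_ratio(gt, pred, data_range=255))
    # channel_axis requires scikit-image >= 0.19.
    ssim_vals.append(structural_similarity(gt, pred, channel_axis=-1, data_range=255))

print(f"PSNR: {np.mean(psnr_vals):.2f} dB, SSIM: {np.mean(ssim_vals):.4f}")
```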

Visual comparison

Citation

If you use this code or ideas from the paper for your research, please cite our paper:

@InProceedings{Jiang_2024_ECCV,
    author    = {Jiang, Hai and Luo, Ao and Liu, Xiaohong and Han, Songchen and Liu, Shuaicheng},
    title     = {LightenDiffusion: Unsupervised Low-Light Image Enhancement with Latent-Retinex Diffusion Models},
    booktitle = {European Conference on Computer Vision},
    year      = {2024},
    pages     = {}
}

Acknowledgement

Part of the code is adapted from previous works: WeatherDiff and MIMO-UNet. We thank all the authors for their contributions.