GenColor achieves expressive color enhancement with superior texture preservation.
Color enhancement is a crucial yet challenging task in digital photography. It demands methods that are (i) expressive enough for fine-grained adjustments, (ii) adaptable to diverse inputs, and (iii) able to preserve texture. Existing approaches typically fall short in at least one of these aspects, yielding unsatisfactory results.
We propose GenColor, a novel diffusion-based framework for sophisticated, texture-preserving color enhancement. GenColor reframes the task as conditional image generation. Leveraging ControlNet and a tailored training scheme, it learns advanced color transformations that adapt to diverse lighting and content. We train GenColor on ARTISAN, our newly collected large-scale dataset of 1.2M high-quality photographs specifically curated for enhancement tasks.
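Neither ARTISAN nor trained GenColor weights are assumed available here; the sketch below only illustrates the ControlNet-conditioned generation setup using Hugging Face diffusers, with a publicly released canny-edge ControlNet standing in for the paper's learned conditioning. The checkpoint names, prompt, and file paths are illustrative assumptions, not GenColor's actual configuration.

```python
# Minimal sketch of ControlNet-conditioned generation with diffusers.
# GenColor trains its own ControlNet on ARTISAN; the public canny
# checkpoint below is only a stand-in, and the prompt/paths are made up.
import cv2
import numpy as np
import torch
from PIL import Image
from diffusers import (ControlNetModel, StableDiffusionControlNetPipeline,
                       UniPCMultistepScheduler)

controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet, torch_dtype=torch.float16).to("cuda")
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)

# Condition the diffusion model on structure extracted from the input photo.
photo = Image.open("input_photo.png").convert("RGB")       # hypothetical path
edges = cv2.Canny(np.array(photo), 100, 200)
condition = Image.fromarray(np.stack([edges] * 3, axis=-1))

# The generated image is used only as a *color reference*; pixel-perfect
# texture is restored afterwards by the color transfer network described below.
reference = pipe("a professionally color-graded photograph",  # illustrative
                 image=condition, num_inference_steps=20).images[0]
reference.save("color_reference.png")
```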
To overcome the texture-preservation limitations inherent in diffusion models, we introduce a color transfer network with a novel degradation scheme that simulates texture–color relationships. This network achieves pixel-perfect texture preservation while enabling fine-grained color matching to the diffusion-generated reference images. Extensive experiments show that GenColor produces visually compelling results comparable to those of expert colorists and surpasses state-of-the-art methods in both subjective and objective evaluations.
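The learned color transfer network itself is not reproduced here; as a crude, classical stand-in for its texture-preserving color matching, the snippet below applies Reinhard-style mean/std matching in Lab space. Because the map is a global affine transform per channel, the input's spatial detail passes through essentially unchanged while its colors shift toward the diffusion-generated reference. The function name and file paths are assumptions.

```python
# Classical stand-in for GenColor's color transfer network:
# Reinhard-style mean/std matching in Lab space. A per-channel affine
# map shifts colors toward the reference without touching spatial detail.
import cv2
import numpy as np

def match_color_stats(input_bgr: np.ndarray, reference_bgr: np.ndarray) -> np.ndarray:
    """Shift the input's Lab channel statistics toward the reference's."""
    src = cv2.cvtColor(input_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    ref = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    for c in range(3):
        s_mean, s_std = src[..., c].mean(), src[..., c].std() + 1e-6
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        src[..., c] = (src[..., c] - s_mean) / s_std * r_std + r_mean
    out = np.clip(src, 0, 255).astype(np.uint8)
    return cv2.cvtColor(out, cv2.COLOR_LAB2BGR)

enhanced = match_color_stats(cv2.imread("input_photo.png"),      # hypothetical
                             cv2.imread("color_reference.png"))  # paths
cv2.imwrite("enhanced.png", enhanced)
```

Unlike this global stand-in, the paper's network is trained with the degradation scheme to perform spatially fine-grained color matching.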
GenColor addresses these three challenges (expressiveness, adaptability, and texture preservation) through a three-phase process.
Even the latest closed-source commercial models still suffer from texture changes. GenColor can refine their results, achieving pixel-perfect texture preservation.
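Under the same stand-in assumptions as the sketch above, the hypothetical match_color_stats could take a commercial model's output as the reference instead of the diffusion-generated one: the input photo remains the sole source of pixels (and hence texture), while the commercial result contributes only its color rendition.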
@inproceedings{dong2025gencolor,
  title={GenColor: Generative and Expressive Color Enhancement with Pixel-Perfect Texture Preservation},
  author={Dong, Yi and Wang, Yuxi and Lin, Xianhui and Ouyang, Wenqi and Shen, Zhiqi and Ren, Peiran and Fan, Ruoxi and Lau, Rynson W. H.},
  booktitle={NeurIPS},
  year={2025}
}