This permanently (and irreversibly) bakes an ICC profile's color corrections into an image. It's handy for brute-forcing color management onto wallpapers, into software without color management, and so forth. Just don't use the images on any other screens, ever, or view them in any properly color-managed programs. :)
Opening an image with baked-in ICC corrections in color-managed software will apply the same corrections a second time, so it will look wrong. The only real use case for this is viewing images in non-managed software (setting desktop wallpapers is a great example of where this is harmless, except that your screenshots will look wrong on other screens).
Written as a proof of concept for possibly adding color management to MComix. Doesn't seem like it'll be that hard after all!
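For the curious, the core operation boils down to something like the following. This is a minimal sketch rather than the actual script; the function name, file names, and the fall-back-to-sRGB behavior are my assumptions.

```python
from PIL import Image, ImageCms
import io

def burn_profile(src_path, display_profile_path, dst_path):
    """Bake a display profile's corrections into the pixels (irreversibly)."""
    im = Image.open(src_path).convert("RGB")

    # Use the image's embedded profile as the source space if it has one,
    # otherwise assume sRGB.
    embedded = im.info.get("icc_profile")
    if embedded:
        src_profile = ImageCms.ImageCmsProfile(io.BytesIO(embedded))
    else:
        src_profile = ImageCms.createProfile("sRGB")

    dst_profile = ImageCms.ImageCmsProfile(display_profile_path)

    # Convert the pixel data into the display's space.
    out = ImageCms.profileToProfile(
        im, src_profile, dst_profile,
        renderingIntent=ImageCms.INTENT_PERCEPTUAL,
    )

    # Deliberately don't embed the display profile: the whole point is that
    # non-managed software shows the corrected pixels as-is.
    out.save(dst_path)

# burn_profile("wallpaper.png", "my-display.icc", "wallpaper-burned.png")
```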
I have since also created a variant that applies a 3D lookup table on images. Currently, it expects (and is therefore limited to) images in the sRGB gamut. It will not behave properly on wider gamut images without some small changes. This is for simplicity on my end, since cube files do not contain metadata to specify what their source color space is. On images with wider gamuts, they will first be converted to sRGB and then have the lookup table applied.
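A similar sketch for the LUT variant, assuming a `.cube` file authored for sRGB input (again, the names here are illustrative, not the real script):

```python
from PIL import Image, ImageCms
from pillow_lut import load_cube_file
import io

def burn_cube(src_path, cube_path, dst_path):
    """Apply a .cube 3D LUT that expects sRGB input to an image."""
    im = Image.open(src_path).convert("RGB")

    # If the image carries its own (possibly wider-gamut) profile, convert it
    # to sRGB first, since .cube files carry no source-color-space metadata.
    embedded = im.info.get("icc_profile")
    if embedded:
        src = ImageCms.ImageCmsProfile(io.BytesIO(embedded))
        srgb = ImageCms.createProfile("sRGB")
        im = ImageCms.profileToProfile(im, src, srgb)

    lut = load_cube_file(cube_path)  # returns a PIL ImageFilter.Color3DLUT
    im.filter(lut).save(dst_path)

# burn_cube("wallpaper.png", "my-display.cube", "wallpaper-lut.png")
```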
- Python 3
- Pillow (Don't use PIL; it's been dead for a decade).
- (for the cube burner only) pillow_lut
The ICC files here are ones I made for my own personal use. They are included for testing purposes, but if you want to try them on your screens you can go ahead. I just won't promise any good results :)
The Dell U2412M one will have minimal visible changes on an sRGB monitor, since my U2412M is very close to covering 100% of sRGB.
The ThinkPad X201 Tablet profile, on the other hand, will drastically boost some colors; its screen covers only about 51% of sRGB, if I recall correctly. If you just want to see that the tool works, that profile will give you pretty obvious results.
Right now, the program is hardcoded to always use a 'perceptual' rendering intent. Since this is just a proof of concept, I don't mind, but adding support for the other intents should be pretty trivial if you look at the Pillow ImageCms documentation.
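For example, supporting a user-selected intent would mostly mean passing one of Pillow's intent constants through to the transform. A sketch (the option names are made up):

```python
from PIL import ImageCms

# Map a hypothetical command-line option onto Pillow's rendering intents.
INTENTS = {
    "perceptual": ImageCms.INTENT_PERCEPTUAL,
    "relative": ImageCms.INTENT_RELATIVE_COLORIMETRIC,
    "saturation": ImageCms.INTENT_SATURATION,
    "absolute": ImageCms.INTENT_ABSOLUTE_COLORIMETRIC,
}

def build_transform(src_profile, dst_profile, intent_name="perceptual"):
    """Build a reusable RGB-to-RGB transform with the chosen intent."""
    return ImageCms.buildTransform(
        src_profile, dst_profile, "RGB", "RGB",
        renderingIntent=INTENTS[intent_name],
    )

# transform = build_transform(ImageCms.createProfile("sRGB"), "my-display.icc")
# corrected = ImageCms.applyTransform(im, transform)
```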
The first test image is from Wikimedia Commons, and has an embedded Adobe RGB profile. It is used to test behavior when dealing with wide gamut images, as well as to verify that embedded profiles can be used.
The other image is a wallpaper I use, which is based on a screenshot that I essentially redrew in GIMP to clean up chroma subsampling artifacts. It represents a typical sRGB image, with easy-to-see color changes (large, flat areas of color).
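If you want to check which case a given test image falls into, a quick snippet (not part of the tool) is:

```python
from PIL import Image, ImageCms
import io

im = Image.open("test-image.png")
icc = im.info.get("icc_profile")
if icc:
    profile = ImageCms.ImageCmsProfile(io.BytesIO(icc))
    print("Embedded profile:", ImageCms.getProfileDescription(profile))
else:
    # No embedded profile; such images are presumably treated as sRGB.
    print("No embedded profile.")
```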
One other nice use of this tool is that it can help to color-correct games played in RetroArch via its Reshade LUT shader. Note that I have only tested the GLSL version of the shader, but I think it should work with any of RetroArch's supported shader types where the Reshade LUT is available.
I am a novice to shaders in general, but I was able to figure this out. There may be much better ways to accomplish what I'm talking about.
Basically, you can take the shaders from https://github.com/libretro/glsl-shaders and modify a few files.
For a size-16 LUT (maybe less accurate than a size-32 or size-64 one, but easier to describe here because there's already a preset file for it), you run icc-burn on `shaders/LUT/16.png` (perhaps outputting to a different file name). Then, copy the `lut.glslp` shader preset to some new file (maybe `my-monitor-lut.glslp`) and edit the new file so that the reference to `16.png` is replaced with your burned-in version; see the sketch below.
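The edited preset ends up looking roughly like this. The key names below are from memory of RetroArch's preset format and the burned file name is just an example, so compare against the original `lut.glslp` rather than trusting this verbatim:

```
shaders = "1"
shader0 = "shaders/LUT/LUT.glsl"
textures = "SamplerLUT"
SamplerLUT = "shaders/LUT/my-monitor-16.png"
SamplerLUT_linear = "true"
```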
To use a different size of LUT (e.g. 32 or 64), there's a shader parameter you can set in RetroArch called 'LUT Size.' Alternatively, you can edit the files.
To edit the files, first copy `reshade/lut.glslp` to some other filename (maybe `lut64.glslp`, for instance), then edit your new file: change the reference to `shaders/LUT/16.png` to point at the LUT PNG you want to use. Also copy `shaders/LUT/LUT.glsl` to some new filename (maybe `LUT64.glsl`) and edit the line reading `#define LUT_Size 16.0` in your new GLSL file to instead specify the size of your LUT (see the snippet below). Finally, make your GLSLP (preset) file point to the new GLSL file. That should be it.
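For example, for a size-64 LUT, the only line that changes inside the copied GLSL file would be (using the hypothetical `LUT64.glsl` name from above):

```glsl
// LUT64.glsl: the lookup texture's size per axis.
// Everything else is a verbatim copy of shaders/LUT/LUT.glsl.
#define LUT_Size 64.0
```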
There may be another way to change the LUT size with a parameter alone, skipping the GLSL edit, but I just don't really want to learn GLSL.
There's also one bug I've found in Reshade as implemented in RetroArch: if blue shows as black on your system, you've hit the same bug. Try the patch I'm suggesting here and see if that helps.
By using the new shader preset for your games, you should get color correction.