Eric Chan has informed me that there are two image-processing pipelines in Lightroom: output-referred and scene-referred. Raw files get the scene-referred pipeline. Integer TIFFs get the output-referred pipeline. Therefore, all my TIFF test images were receiving different processing from what LR applies to raw files.
I’ve done some testing with real raw files, and determined that Lightroom Exposure adjustments have mean errors of around 1 Delta-E when photographing the same test pattern off the monitor, and worst-case errors of about 4 Delta-E. However, there’s a lot of noise in the real camera testing, and the actual situation is probably better than that.
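The Delta-E figures above summarize per-patch color differences in CIELAB. As a hedged sketch (the real analysis was done in Matlab, and the sample readings below are made up for illustration), the mean and worst-case CIE76 Delta-E over a set of measured patches can be computed like this:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def summarize_errors(reference, measured):
    """Mean and worst-case Delta-E over paired patches."""
    errors = [delta_e76(r, m) for r, m in zip(reference, measured)]
    return sum(errors) / len(errors), max(errors)

# Hypothetical (L*, a*, b*) patch readings; the real data came from
# photographing a test pattern off the monitor.
ref = [(50.0, 0.0, 0.0), (70.0, 10.0, -10.0)]
meas = [(50.5, 0.5, 0.2), (69.0, 12.0, -12.5)]
mean_e, worst_e = summarize_errors(ref, meas)
```

With real camera data, photon noise and alignment error inflate both statistics, which is one reason the true Lightroom errors are probably smaller than the measured ones.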
All the work I’ve done with real raw images has taught me to appreciate working with synthetic images. No lens flare. No photon noise. No need to align the images (alignment has the really unpleasant side effect that pixels in the aligned image no longer line up with pixels in the camera, so dust spots on the sensor look like inter-exposure differences). Faster.
On the other hand, like looking for your keys under the lamp-post, it doesn’t do much good to have a nice, tight testing regimen if it doesn’t say anything about how real raw processing takes place. Eric says floating-point TIFFs get the same scene-referred pipeline as raw images, so I thought I’d try using 32-bit floating-point synthetic TIFFs for testing. It took me a while to learn enough about TIFF tags to have Matlab write floating-point files that Photoshop and Lightroom like, but I’m there as of this morning.
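To make the tag-wrangling concrete, here is a minimal sketch of a single-strip grayscale 32-bit floating-point TIFF written in pure Python. The real files were RGB and written from Matlab, so take this as an illustration of the essential part, under the assumption that SampleFormat = 3 (IEEE float) is the tag that distinguishes these files from ordinary integer TIFFs:

```python
import struct

def write_float_tiff(path, pixels, width, height):
    """Write a minimal little-endian, single-strip, grayscale,
    32-bit floating-point TIFF. The key tag is SampleFormat = 3
    (IEEE float); without it, readers assume unsigned integers."""
    data = struct.pack("<%df" % (width * height), *pixels)
    data_offset = 8                      # pixel data right after header
    ifd_offset = data_offset + len(data)

    def entry(tag, typ, count, value):
        # One 12-byte IFD entry; short values are left-justified,
        # which for little-endian files just means packing as LONG.
        return struct.pack("<HHII", tag, typ, count, value)

    entries = [                          # tags must be ascending
        entry(256, 3, 1, width),         # ImageWidth
        entry(257, 3, 1, height),        # ImageLength
        entry(258, 3, 1, 32),            # BitsPerSample = 32
        entry(259, 3, 1, 1),             # Compression = none
        entry(262, 3, 1, 1),             # Photometric = BlackIsZero
        entry(273, 4, 1, data_offset),   # StripOffsets
        entry(277, 3, 1, 1),             # SamplesPerPixel = 1
        entry(278, 3, 1, height),        # RowsPerStrip
        entry(279, 4, 1, len(data)),     # StripByteCounts
        entry(339, 3, 1, 3),             # SampleFormat = IEEE float
    ]
    ifd = struct.pack("<H", len(entries)) + b"".join(entries)
    ifd += struct.pack("<I", 0)          # no next IFD

    with open(path, "wb") as f:
        f.write(struct.pack("<2sHI", b"II", 42, ifd_offset))
        f.write(data)
        f.write(ifd)

# Hypothetical 2x2 ramp with a value above 1.0, HDR-style.
write_float_tiff("float_test.tif", [0.0, 0.25, 1.0, 2.0], 2, 2)
```

Note that floating-point samples can legitimately exceed 1.0, which matters for what comes next: a reader is free to decide such a file is HDR material.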
Everything looks great in Photoshop. However, when the files are brought into Lightroom, they are much more chromatic and brighter than they are in Photoshop. When exported from LR as 16-bit integer TIFFs, they are still too bright and too chromatic. In addition, when analyzed in Matlab, the exported images have greatly distorted CIELab scatter plots, possibly because of gamut mapping, possibly because of something else.
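For reference, the CIELab analysis step looks like this. The sketch below is a plain-Python stand-in for the Matlab code, and it assumes the exported values are sRGB-encoded and normalized to 0..1; the actual export color space may differ, but the sRGB and CIELAB formulas themselves are the standard ones (D65 white):

```python
def srgb_to_lab(rgb):
    """Convert a normalized sRGB triplet (0..1) to CIELAB (D65),
    the space used for the scatter plots."""
    # 1. Undo the sRGB transfer curve to get linear light.
    lin = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
           for c in rgb]
    r, g, b = lin
    # 2. Linear RGB to XYZ (sRGB primaries, D65 white).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    # 3. XYZ to CIELAB.
    def f(t):
        d = 6.0 / 29.0
        return t ** (1.0 / 3.0) if t > d ** 3 else t / (3 * d * d) + 4.0 / 29.0
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return (116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz))
```

Plotting the (a*, b*) pairs of an exported image against those of the source image is what makes the distortion visible.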
I went back to LR, created a set of virtual copies, and cranked the Exposure adjustment back one stop on each. When I exported them and read them into Photoshop, the L* values were about right, but the images were still too chromatic.
In 3d:
And in just the chrominance plane:
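The residual chroma error is telling. A pure exposure change on scene-referred data is just a uniform gain of 2^stops on every channel, and a uniform gain cannot change chromaticity (channel ratios). The sketch below demonstrates this with a hypothetical linear patch value, which is why the leftover chroma after the minus-one-stop correction points at processing beyond a simple gain:

```python
def apply_exposure(rgb, stops):
    """Pure exposure change on scene-referred linear data:
    a uniform gain of 2**stops on every channel."""
    gain = 2.0 ** stops
    return tuple(c * gain for c in rgb)

def chromaticity(rgb):
    """(r, g) chromaticity coordinates: channel ratios, which a
    uniform gain leaves untouched."""
    s = sum(rgb)
    return (rgb[0] / s, rgb[1] / s)

# Hypothetical linear patch value.
patch = (0.18, 0.12, 0.30)
darker = apply_exposure(patch, -1.0)   # one stop down
# chromaticity(patch) and chromaticity(darker) are identical, so
# any extra chroma in the export implies something other than gain.
```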
It looks like LR sees that the files I’m feeding it are floating point and invokes some default processing that it considers appropriate for HDR images. If that’s the case, I need to find out where that processing happens, and figure out how to turn it off.
I suppose I could call the above the new reference image and see what Lightroom does with exposure compensation of one stop for each stop of underexposure. I may yet do that. But I’m a little worried that I can’t import an image into LR, do nothing to it, export it, and have it come back pretty much the same.