I have long argued that the best way to judge the exposure of a raw file is by looking at the raw data, so the title of this post should come as no surprise. However, I have recently been looking at a raw file that demonstrates this to an extent that surprised me, and I thought I’d share it with you all. It’s not my image, so I won’t show it to you. It appears to be a sunrise or sunset.
Here’s what the raw histogram looks like:
The horizontal axis is the raw count, or data number. Full scale on the Sony a7R, which was the camera used, is about 16000 counts. The black point is 512.
Looking at it with a logarithmic vertical axis:
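If you want to make a histogram like this yourself, the per-channel bucketing is straightforward. Here's a minimal sketch using NumPy, assuming an RGGB Bayer layout; a real raw file would be loaded with something like rawpy's `raw_image`, but I'm fabricating a small mosaic here just to show the indexing:

```python
import numpy as np

# Sketch: per-channel histograms from a Bayer mosaic. Assumes an RGGB
# layout; a real raw file would come from a library such as rawpy, but
# a fabricated mosaic is enough to show the channel indexing.
rng = np.random.default_rng(0)
mosaic = rng.integers(512, 1700, size=(8, 8))  # 14-bit-style raw counts

channels = {
    "R":  mosaic[0::2, 0::2],   # red sites
    "G1": mosaic[0::2, 1::2],   # green sites on the red rows
    "G2": mosaic[1::2, 0::2],   # green sites on the blue rows
    "B":  mosaic[1::2, 1::2],   # blue sites
}

# Histogram each channel over the full 14-bit range; plotting these
# with a log vertical axis (e.g. matplotlib's semilogy) is what makes
# the sparse outlier buckets visible.
hists = {name: np.bincount(ch.ravel(), minlength=16384)
         for name, ch in channels.items()}
```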
You can see that, except for isolated single-pixel buckets from outliers on the sensor, there is no data above a count of about 1500 in the red and blue channels, or above about 1700 in the green channels. Subtracting the 512-count black point leaves us with about 1000 in the red and blue and about 1200 in the green. That means, in ETTR terms, that the image is almost 4 stops underexposed.
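That estimate is just a ratio of black-point-subtracted signals. A couple of lines reproduce it, using the full scale and black point from above:

```python
import math

FULL_SCALE = 16000   # approximate a7R full scale, as above
BLACK = 512          # black point

def stops_under(raw_max):
    """Stops below ETTR, from the black-point-subtracted signal."""
    return math.log2((FULL_SCALE - BLACK) / (raw_max - BLACK))

# The green channels top out near 1700 counts:
print(round(stops_under(1700), 1))  # about 3.7 stops -- "almost 4"
```

The red and blue channels, topping out near 1500 counts, come out a shade under 4 stops.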
Bringing the image into the current version of Adobe Camera Raw with the controls at default gets us this:
It looks underexposed. Is it more than three stops under? Pushing up the Exposure control until the clipping warning lights up:
It now looks like it is less than one and a half stops underexposed.
There is more than a two-stop difference between how you'd judge the exposure using ACR and what the real raw data says. Lr uses the same engine as ACR, and would thus give the same results. Looking at Lr and ACR usually makes you think a file is more heavily exposed than it actually is, but not always. There is no substitute for looking at the raw data.
If you looked at the ACR data and weren't thinking clearly, you might think the red raw channel would be the first to clip if you increased the exposure, since it is the first channel to clip after conversion to the ProPhoto RGB primaries that ACR uses. If you look at the raw data, you can see that the green channels would be the first to clip.
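The white balance step alone is enough to illustrate the reversal; the full ProPhoto conversion involves a color matrix as well. Here is a sketch with assumed daylight-ish multipliers (they are not this file's actual metadata): green leads in the raw data, but once the red channel is multiplied up by its white balance gain, red leads in the converted output and so clips first there.

```python
# Sketch of why the clipping order can differ before and after
# conversion. The white-balance gains are assumed daylight-ish values,
# not metadata from this particular file.
FULL_SCALE, BLACK = 16000, 512

raw_max = {"R": 1500, "G": 1700, "B": 1500}   # counts, from the histogram
wb_gain = {"R": 2.2, "G": 1.0, "B": 1.6}      # assumed multipliers

norm = {c: (raw_max[c] - BLACK) / (FULL_SCALE - BLACK) for c in raw_max}
balanced = {c: norm[c] * wb_gain[c] for c in norm}

print(max(norm, key=norm.get))      # G -- green clips first in the raw data
print(max(balanced, key=balanced.get))  # R -- red clips first after white balance
```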