

Do color space conversions degrade image quality?

October 1, 2014 · JimK

There is a persistent legend in the digital photography world that color space conversions cause color shifts and should be avoided unless absolutely necessary. Fifteen years ago, there were strong reasons for that way of thinking, but times have changed, and I think it’s time to take another look.

First off, there are several color space conversions that are unavoidable. Your raw converter needs to convert from your camera’s “color” space to your preferred working space. I put the word “color” in quotes because your camera doesn’t actually see colors the way your eye does. Once the image is in your chosen working space, whether it be ProPhotoRGB, Adobe RGB, or — God help you — sRGB, it needs to be converted into your monitor’s color space before you can see it. It needs to be converted into your printer’s (or printer driver’s) color space before you can print it.

So the discussion about changing color spaces is really about changing the working color space of an image.

The reason why changing the working color space used to be dangerous is that images were stored with 8 bits per color plane. That was barely enough to represent colors accurately in quality prints, and not really enough to allow aggressive editing without creating visible problems. To make matters worse, different color spaces had different problem areas, so moving your image from one color space to another and back could cause posterization and the dreaded “histogram depopulation”.

Many years ago, image editors started the gradual migration towards (15 or) 16 bit per color plane representation, allowing about (33,000 or) 65,000 values in each plane rather than the 256 of the 8-bit world. This changed the fit of images into editing representations from claustrophobic to wide-open. Unless you’re trying to break something, there is hardly a move you can make that’s going to cause posterization.
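To see how much more headroom the extra bits buy, here’s a minimal sketch (my illustration, not anything from the test below) that pushes a gray ramp through a gamma-encode/decode round trip at both precisions and counts the surviving levels; the codes that vanish at 8 bits are the depopulated histogram bins:

    % A gamma round trip at two precisions, quantizing after each step.
    ramp = (0:255) / 255;                         % 256 input gray levels

    enc8  = round(255   * ramp.^(1/2.2)) / 255;   % encode, quantize to 8 bits
    dec8  = round(255   * enc8.^2.2)     / 255;   % decode, quantize to 8 bits
    enc16 = round(65535 * ramp.^(1/2.2)) / 65535; % the same at 16 bits
    dec16 = round(65535 * enc16.^2.2)    / 65535;

    fprintf('distinct 8-bit output levels:  %d of 256\n', ...
        numel(unique(round(255 * dec8))));
    fprintf('distinct 16-bit output levels: %d of 256\n', ...
        numel(unique(round(65535 * dec16))));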

But the fear of changing working spaces didn’t abate. Instead of precision (the computer science word for bit depth) being the focus, the spotlight turned to supposed inaccuracy in the conversion process itself.

Before I get to that, there’s another thing I need to get out of the way. Not all working spaces can represent all the colors you can see, and the ones that can’t don’t all exclude the same set of colors. So if you’ve got an image in, say, Adobe RGB, and you’d like to convert it to, say, sRGB, any colors in the original image that can’t be represented in sRGB will be mapped to sRGB colors. If you decide to take your newly sRGB image and convert it back to Adobe RGB, you won’t get those remapped colors back. One name for this phenomenon is gamut clipping.
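Here’s a minimal sketch of the effect (assuming the conversions in Matlab’s Image Processing Toolbox, which is not necessarily what your editor uses): the most saturated Adobe RGB green lies outside sRGB, gets clipped on the way in, and the round trip can’t restore it:

    % Pure Adobe RGB green, sent to sRGB and back.
    g    = reshape([0 1 0], 1, 1, 3);                  % Adobe RGB triplet
    xyz  = rgb2xyz(g, 'ColorSpace', 'adobe-rgb-1998'); % to CIE XYZ
    srgb = xyz2rgb(xyz);                               % components land outside [0, 1]
    clip = min(max(srgb, 0), 1);                       % gamut clipping
    back = xyz2rgb(rgb2xyz(clip), 'ColorSpace', 'adobe-rgb-1998');
    squeeze(back)'                                     % no longer [0 1 0]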

There are two ways of specifying color spaces. The most accurate way is to specify a mathematical model for converting to and from some lingua franca color space such as CIE 1931 XYZ or CIE 1976 CIEL*a*b*. If this method is used, assuming infinite precision for the input color space and for all intermediate computations, perfect accuracy is theoretically obtainable. Stated with the epsilon-delta formulation beloved by mathematicians the world over: given a color and an allowable error epsilon, there exists a precision, delta, which allows conversion of that color triplet between any pair of model-based spaces with error smaller than epsilon, assuming that the color can be represented in both spaces. Examples of model-defined color spaces are Adobe (1998) RGB, sRGB, ProPhoto RGB, CIEL*a*b*, and CIEL*u*v*.
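As an illustration (using the published sRGB constants, not code from this test), a model-based conversion is nothing more than a transfer function and a matrix:

    % sRGB to CIE XYZ in closed form: linearize, then one 3x3 matrix.
    srgb2lin = @(v) (v <= 0.04045) .* (v / 12.92) + ...
                    (v >  0.04045) .* (((v + 0.055) / 1.055).^2.4);
    M = [0.4124 0.3576 0.1805;   % sRGB-to-XYZ matrix (D65), IEC 61966-2-1
         0.2126 0.7152 0.0722;
         0.0193 0.1192 0.9505];
    rgb = [0.8; 0.5; 0.2];       % an arbitrary in-gamut sRGB triplet
    xyz = M * srgb2lin(rgb)      % no lookup tables anywhere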

The other way to define a color space is to take a bunch of measurements and build three-dimensional lookup tables for converting to and from a lingua franca color space. These conversions are inherently inaccurate, being limited by the accuracy of the measurement devices, the number of measurements, the number of entries in the lookup table, the precision of those entries, the interpolation algorithm, the stability of the device itself, and the phase of the moon. Fortunately, but not coincidentally, all of the working color spaces available to photographers are model-based.
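For contrast, here’s a minimal sketch of a table-based conversion (a made-up single-channel example, not a real profile): sample the transform on a coarse grid, interpolate between the samples, and notice that the answer is close but not exact:

    % A 17x17x17 grid, a size found in many device profiles.
    n = 17;
    g = linspace(0, 1, n);
    [R, G, B] = ndgrid(g, g, g);
    tab = (0.2126*R + 0.7152*G + 0.0722*B).^(1/2.2); % pretend these were measured

    y     = interpn(g, g, g, tab, 0.8, 0.5, 0.2);    % trilinear lookup
    exact = (0.2126*0.8 + 0.7152*0.5 + 0.0722*0.2)^(1/2.2);
    err   = abs(y - exact)                           % small, but not zero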

I set up a test. I took an sRGB version of this image of Bruce Lindbloom’s imaginary, synthetic desk:

[Image: sRGB rendering of Bruce Lindbloom’s synthetic desk scene]

I brought it into Matlab, and converted it to 64-bit floating point representation, with each color plane mapped into the region [0, 1].

I converted it to Adobe RGB, then back to sRGB, and computed the distance between the original and the round-trip-converted image in CIELab DeltaE. I measured the average error, the standard deviation, and the worst-case error and recorded them.

Then I did the pair of conversions again.

And again, and again, for a total of 100 round trips. Here’s the code:

[Image: the Matlab source for the round-trip test]
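In outline, the test looks something like this minimal sketch (a reconstruction built on the Image Processing Toolbox conversions and a hypothetical file name, not the code in the image above):

    img = im2double(imread('lindbloom_desk_srgb.tif')); % hypothetical file name
    ref = rgb2lab(img);                     % CIELab reference for the original

    cur = img;
    for iter = 1:100
        % sRGB -> Adobe RGB -> sRGB, through CIE XYZ, in double precision
        argb = xyz2rgb(rgb2xyz(cur), 'ColorSpace', 'adobe-rgb-1998');
        cur  = xyz2rgb(rgb2xyz(argb, 'ColorSpace', 'adobe-rgb-1998'));

        dE = sqrt(sum((rgb2lab(cur) - ref).^2, 3));     % per-pixel DeltaE
        fprintf('%3d: mean %g, sigma %g, worst %g\n', ...
            iter, mean(dE(:)), std(dE(:)), max(dE(:)));
    end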

Here’s what I got:

[Charts: mean, standard deviation, and worst-case DeltaE error vs. number of round trips, 64-bit floating point]

The first thing to notice is how small the errors are. One DeltaE is roughly the amount of difference in color that you can just notice. We’re looking at worst-case errors after 100 conversions that are five trillionths of that just-noticeable difference.

Unfortunately, our image editors don’t normally carry that much precision internally; 16-bit integer representation is much more common. If we run the program above and tell it to quantize every color in the image to 16-bit integer precision after every conversion, this is what we get:

[Charts: standard deviation and worst-case DeltaE error vs. number of round trips, 16-bit integer]

It’s a lot worse, but the worst-case error is still about 5/100 of a DeltaE, and we’re not going to be able to see that.
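In the sketch above, the 16-bit variant amounts to wrapping each conversion in a quantizer, something like:

    % Quantize to 16-bit integers and back; uint16() rounds and saturates.
    q16 = @(x) double(uint16(x * 65535)) / 65535;

    % ...and inside the loop:
    %   argb = q16(xyz2rgb(rgb2xyz(cur), 'ColorSpace', 'adobe-rgb-1998'));
    %   cur  = q16(xyz2rgb(rgb2xyz(argb, 'ColorSpace', 'adobe-rgb-1998')));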

How do the color space conversion algorithms in Photoshop compare to the ones I was using in Matlab? Stay tuned.

Unless otherwise noted, all images copyright Jim Kasson.