Comparing Photoshop and algorithmic color space conversion errors

October 11, 2014 by JimK

I took my 256-megapixel image, filled with random 16-bit entries drawn from a uniform probability density function, brought it into Photoshop, assigned it the sRGB profile, converted it to Adobe RGB (1998), and wrote it out. I went back to the sRGB image, converted it to ProPhoto RGB, and wrote that out.
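For concreteness, generating a test image like that takes only a couple of lines in Matlab. This is a sketch, not the code behind the original image: the 16,384 x 16,384 pixel dimensions (about 256 megapixels), the three channels, and the file name are all assumptions.

    % Sketch: a roughly 256 MP, three-channel image of uniformly distributed
    % 16-bit values, written out for Photoshop to open and tag with the sRGB
    % profile.  Dimensions and file name are assumed, not from the post.
    img = randi([0 65535], 16384, 16384, 3, 'uint16');
    imwrite(img, 'random16.tif');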

Then I took the sRGB image through round-trip color space conversions to and from Adobe RGB, ProPhoto RGB, and CIELab, writing out the resulting sRGB images.
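Each leg of those conversions amounts to undoing the source transfer function, rotating the primaries through XYZ with a 3x3 matrix, and re-encoding with the destination transfer function. Here is a rough double-precision sketch of the sRGB to Adobe RGB (1998) leg; the matrix entries are the commonly published, rounded D65 values, and the whole thing is illustrative rather than the exact code I ran.

    % Sketch: sRGB -> Adobe RGB (1998) in double-precision floating point.
    M_srgb = [0.4124 0.3576 0.1805;    % linear sRGB -> XYZ (D65), rounded
              0.2126 0.7152 0.0722;
              0.0193 0.1192 0.9505];
    M_argb = [0.5767 0.1856 0.1882;    % linear Adobe RGB -> XYZ (D65), rounded
              0.2974 0.6274 0.0753;
              0.0270 0.0707 0.9911];
    srgb = double(imread('random16.tif')) / 65535;      % 16-bit file -> 0..1
    lin  = srgb / 12.92;                                % sRGB linear toe
    mask = srgb > 0.04045;
    lin(mask) = ((srgb(mask) + 0.055) / 1.055) .^ 2.4;  % sRGB power segment
    pix  = reshape(lin, [], 3);                         % one pixel per row
    argb_lin = (pix * M_srgb') / M_argb';               % through XYZ to linear Adobe RGB
    argb = reshape(max(argb_lin, 0), size(lin)) .^ (256/563);  % Adobe gamma = 563/256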

In Matlab, I compared the one-way and round-trip images with ones that I created by performing the same color space conversions in double-precision floating point, quantizing to 16-bit integer precision after each conversion.
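The quantization and the comparison run something like this. The Photoshop output file name is a placeholder, rgb2lab (Image Processing Toolbox) is doing the CIELab conversion here, and DeltaE is the 1976 Euclidean distance in Lab.

    % Sketch: quantize the double-precision reference to 16 bits, then
    % measure per-pixel CIELab DeltaE against Photoshop's output.
    ref16 = uint16(round(65535 * min(max(argb, 0), 1)));   % quantized reference
    ps16  = imread('ps_adobergb.tif');                      % Photoshop result (placeholder name)
    lab_ref = rgb2lab(ref16, 'ColorSpace', 'adobe-rgb-1998');
    lab_ps  = rgb2lab(ps16,  'ColorSpace', 'adobe-rgb-1998');
    dE = sqrt(sum((lab_ref - lab_ps) .^ 2, 3));             % DeltaE (1976) per pixel
    fprintf('mean dE %.4g, worst-case dE %.4g\n', mean(dE(:)), max(dE(:)));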

Here are the results, with the vertical axis units being CIELab DeltaE:

[Chart: CIELab DeltaE errors, Photoshop vs. Matlab color space conversions]

You can see that the conversions that I did myself are more accurate. For the worst-case round-trip conversions, they are about one order of magnitude more accurate. For the one-way conversions, the relative difference is even greater. The strikingly large Photoshop worst-case error in the one-way conversion to Adobe RGB makes me think that either Adobe or I have a small error in the nonlinearity of that space.
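One guess at where such a discrepancy could come from: the Adobe RGB (1998) specification defines the gamma as 563/256 (about 2.19921875) rather than a flat 2.2, and a rounded exponent on one side or the other would show up as exactly this kind of nonlinearity error. A quick way to bound that particular difference, in 16-bit counts:

    % Sketch: worst-case difference, in 16-bit counts, between encoding
    % with a flat 2.2 gamma and with Adobe RGB's exact 563/256 exponent.
    x = linspace(0, 1, 65536);
    diff16 = 65535 * abs(x .^ (1/2.2) - x .^ (256/563));
    fprintf('worst-case encoding difference: %.2f counts\n', max(diff16));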

Photoshop did its conversions faster than I did mine. I suspect that they're not using double-precision floating point. In fact, given the size of the errors, I'd be surprised to find that they're using even single-precision floating point.

Except for the one-way conversion to Adobe RGB, even the Photoshop worst-case errors are not bad enough to scare me off from doing working color space conversions whenever I think a different color space would help me out. Still, it would be nice if Adobe offered a high-accuracy mode for color space conversions, and let the user decide when she wants speed and when she wants accuracy.
