the last word

Photography meets digital computer technology. Photography wins -- most of the time.


Color from non-Luther cameras

December 8, 2015 JimK 7 Comments

This is the third in a series of posts on color reproduction. The series starts here.

I stated in the last post that, in the general case of arbitrary subject matter and arbitrary lighting, we can’t get accurate color – even using our limited definition of accurate – from cameras that don’t meet the Luther-Ives condition. I also said there aren’t any consumer cameras that meet that condition.

Because the filters in the Bayer array of your camera don’t meet the Luther-Ives condition, there are some spectra that your eyes see as matching but the camera sees as different, and some that the camera sees as matching but your eyes see as different. This is called capture metameric error, or, less precisely, camera metamerism. No amount of post-exposure processing can fix this.

What to do?

The simplest approach is to pretend that the camera does meet the Luther condition: come up with a compromise three-by-three matrix intended to minimize color errors for common objects lit by common light sources, and multiply the RGB values in the demosaiced image by this matrix – as if Luther-Ives were met – to get to some linear variant of a standard color space like sRGB or Adobe (1998) RGB.
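As a sketch of what that multiplication looks like (in Python with NumPy; the matrix values below are made up for illustration, not taken from any real camera):

```python
import numpy as np

# Hypothetical compromise matrix (illustrative values, not from any real
# camera). Each row sums to 1, so equal camera RGB (a neutral) stays neutral.
M = np.array([[ 1.6, -0.5, -0.1],
              [-0.2,  1.4, -0.2],
              [ 0.0, -0.4,  1.4]])

def camera_to_linear_srgb(rgb):
    """Multiply every pixel of an H x W x 3 demosaiced image by M,
    exactly as if the camera satisfied the Luther-Ives condition."""
    h, w, _ = rgb.shape
    out = rgb.reshape(-1, 3) @ M.T         # one 3x3 multiply per pixel
    return out.reshape(h, w, 3).clip(0.0)  # clip negative out-of-gamut values

# A 2x2 demosaiced "image" with linear values in [0, 1]:
img = np.array([[[0.2, 0.3, 0.4], [0.5, 0.5, 0.5]],
                [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]])
srgb_linear = camera_to_linear_srgb(img)
```

Note that a neutral input pixel such as [0.5, 0.5, 0.5] passes through unchanged, since each row of the matrix sums to one; real compromise matrices are normalized the same way so that the white point is preserved.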

Many cameras provide a compromise matrix in the EXIF metadata as an aid to a downstream raw converter. Here is the matrix from a Sony a7II:

[image: compromise matrix from a Sony a7II]

Here’s the matrix for a Sony a7RII:

[image: compromise matrix from a Sony a7RII]

There are ways to get more accurate color, such as three-dimensional lookup tables, but they are memory and computation intensive and are not generally used in cameras. They may be used in some raw converters. I’d appreciate input on this.
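For the curious, a three-dimensional lookup table maps each input RGB triple to an output triple by interpolating among nearby lattice entries. Here is a minimal sketch in Python; the `trilinear_lut` function and the 9×9×9 identity table are illustrative, not any converter’s actual implementation:

```python
import numpy as np

def trilinear_lut(rgb, lut):
    """Look up one RGB triple (values in [0, 1]) in an N x N x N x 3 LUT,
    interpolating trilinearly among the 8 surrounding lattice points."""
    n = lut.shape[0]
    pos = np.asarray(rgb) * (n - 1)   # position in lattice coordinates
    lo = np.floor(pos).astype(int)
    lo = np.minimum(lo, n - 2)        # keep the upper corner in range
    f = pos - lo                      # fractional distance to the lower corner
    out = np.zeros(3)
    for corner in range(8):           # blend the 8 corners of the cube
        idx = [(corner >> k) & 1 for k in range(3)]
        w = np.prod([f[k] if idx[k] else 1 - f[k] for k in range(3)])
        out += w * lut[lo[0] + idx[0], lo[1] + idx[1], lo[2] + idx[2]]
    return out

# An identity LUT: each lattice point maps to its own coordinates,
# so the transform should return its input unchanged.
n = 9
axis = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
result = trilinear_lut([0.25, 0.6, 0.9], identity)   # ~ [0.25, 0.6, 0.9]
```

The memory cost the post mentions is easy to see: a 33-point table, a common profile size, holds 33³ ≈ 36,000 entries per output channel, versus nine numbers for a 3×3 matrix.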

However, no matter how colors are reconstructed from a non-Luther camera, as I said earlier, they can’t be 100% accurate for all subjects and all illuminants.

Next: constructing a compromise matrix.


Comments

  1. Pedro Côrte-Real says

    December 8, 2015 at 3:56 pm

    Hi, I’m one of the developers of darktable[1]. I’m really enjoying this series of articles. I was particularly intrigued by this bit:

    > There are ways to get more accurate color, such as three-dimensional lookup tables, but they are memory and computation intensive and are not generally used in cameras. They are certainly used in raw converters.

    It’s common to use LUTs for display and printing profiles but I’ve never seen any used in raw processing. Everything in darktable uses a 3×3 and I haven’t seen that Adobe does anything different. Do you have any references to use of 3D LUTs for color transformation of raw files?

    [1] https://www.darktable.org/

    Reply
    • Jim says

      December 8, 2015 at 4:28 pm

      I thought Adobe used LUTs. I’m not sure why I thought that, except that compromise matrixes are such blunt tools. Until I find out more, I’m going to soften the statement.

      Thanks for the information.

      Jim

      Reply
      • Iliah Borg says

        December 8, 2015 at 4:52 pm

        > I thought Adobe used LUTs

        Quite so, all current Adobe DCP profiles are LUT-based.

        Reply
        • Jack Hogan says

          December 9, 2015 at 11:40 am

I understand that Nikon uses LUT profiles in its CaptureNXx products. Does anybody know for sure?

          Reply
          • Iliah Borg says

            December 9, 2015 at 2:54 pm

            They do.

            Reply
  2. CarVac says

    December 8, 2015 at 5:02 pm

I read somewhere (maybe it was just implied?) that the [xyz<-camera] matrices in dcraw are taken from metadata in DNGs generated by the Adobe DNG converter…

    Then it multiplies them by the [colorspace<-xyz] matrix to put the image into the final color space.

    This stuff honestly makes my head hurt after too long, though, and I can't remember where in the chain which white balance multipliers are applied…
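[Ed.: the chain described in this comment, camera RGB through XYZ to an output space with white-balance gains folded in, can be sketched as below. All values are illustrative; the "camera" matrix is a stand-in (it is actually the sRGB-primaries-to-XYZ matrix), not metadata from any real DNG.]

```python
import numpy as np

# Stand-in [xyz<-camera] matrix. For illustration this is the
# sRGB-primaries-to-XYZ matrix, NOT values read from a real DNG.
xyz_from_cam = np.array([[0.4124, 0.3576, 0.1805],
                         [0.2126, 0.7152, 0.0722],
                         [0.0193, 0.1192, 0.9505]])

# [colorspace<-xyz]: linear sRGB from XYZ (D65), from the sRGB definition.
srgb_from_xyz = np.array([[ 3.2406, -1.5372, -0.4986],
                          [-0.9689,  1.8758,  0.0415],
                          [ 0.0557, -0.2040,  1.0570]])

# White balance as per-channel gains on camera RGB (illustrative multipliers),
# applied before the matrices in this sketch.
wb_gains = np.diag([2.0, 1.0, 1.5])

# Matrices compose right to left, so the three steps collapse into a
# single 3x3 that a converter can apply once per pixel:
srgb_from_cam = srgb_from_xyz @ xyz_from_cam @ wb_gains
```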

    Reply
  3. Jack Hogan says

    December 9, 2015 at 11:37 am

Jim, feast your eyes on Anders Torger’s excellent site and camera profiles: http://www.ludd.ltu.se/~torger/dcamprof.html

    Jack

    Reply


Unless otherwise noted, all images copyright Jim Kasson.