
Automatic sensor characterization, an introduction

December 9, 2014 JimK 5 Comments

You may have noticed a decrease in the number of my postings over the last month or so. It’s not that I’ve been idle. Quite the contrary, I’ve been working on a major project. In this post, I’d like to introduce you to it.

Usually, I work alone. However, in the effort I’m going to describe to you, a colleague has provided assistance. Jack Hogan may be familiar to you if you spend much time on the technical parts of LuLa and dpr. He is knowledgeable, writes well, and — not as common as it should be — is unfailingly patient and civil while participating in contentious discussions. Although he has written little of the code, he’s provided strategic and tactical guidance that has been invaluable. The program would be nowhere near as good as it is without his aid.

Let’s get to it.

I’ve been evaluating camera sensor systems and reporting on my findings for several years now. As time has gone by, my testing methodology has grown more sophisticated. Still, when I get a new camera to test, my procedures are somewhat haphazard, with each stage being driven by what I found before.

I decided to develop a program to do automated analysis of image test files. Before I get into the details of what I’m trying to do with this program, let me be clear on what areas it does and does not address. For this tool, I am interested in what’s called the photon transfer function of the sensor. That encompasses the characteristics of the sensor in converting incident photon flux to digital codes in the raw file. The sensor parameters that this program addresses are things like:

  • Photon response nonuniformity
  • Full well capacity
  • Readout noise, a.k.a. read noise

This test regime specifically does not address the characteristics of the color filter array. In fact, I take pains to calibrate out those effects. It does not concern itself with the spatial properties of the sensor, except in the case of pattern errors, or when it becomes obvious that the camera manufacturer is sneaking some digital filtering into the raw processing.

My objectives for this program are:

No iteration. I want to get the boring part, the making of the test photographs, over at the beginning of the project. I don’t want to have to return to the test stand in the middle of the analysis and make ad hoc exposures.

Standard metrics. There is a fairly well-developed science for sensor evaluation – more on that in the next post. Part of that science is the assumption of a standard sensor model and the reporting of test results in terms of that model’s parameters.
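
To make that concrete, here is a minimal Matlab sketch of the textbook photon transfer calculation, the pair-difference gain estimate. It assumes a 14-bit raw file and a flat-field exposure bright enough that read noise is negligible; the variable names (flat1, flat2, blackLevel) are placeholders, and it illustrates the method rather than the actual program:

    % Textbook photon transfer relation for a linear sensor:
    %   variance_DN = mean_DN / g + readNoise_DN^2
    % where g is the gain in electrons per DN. Differencing two identical
    % flat-field frames cancels PRNU and illumination falloff, leaving only
    % the temporal (shot + read) noise.
    f1  = double(flat1(:));  f2 = double(flat2(:));  % cast so the difference can go negative
    mu  = (mean(f1) + mean(f2)) / 2 - blackLevel;    % mean signal above black, DN
    v   = var(f1 - f2) / 2;                          % temporal variance of a single frame, DN^2
    g   = mu / v;                                    % gain estimate, e- per DN (read noise neglected)
    fwc = g * (2^14 - 1 - blackLevel);               % rough full-well estimate for a 14-bit raw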

Accurate assessments in the presence of black point subtraction. Some cameras subtract the black point before they write the raw files, clipping off the left-hand side of the read noise histogram in a dark-field image. This causes the noise from sensors that do this to look different from the noise from similar sensors that don’t. There are ways around this. I have ignored some of them previously. I don’t want to do that anymore.
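One way around the clipping, offered here only as an illustration and not necessarily what the program does, is to use the half of the read noise distribution that survives. If the black point has been subtracted so that the dark-field noise is centered at zero and negative values are clipped, then the 84.13th percentile of the clipped data sits one standard deviation above zero. A minimal sketch, with dark standing in for a single-channel dark-field crop:

    % Read noise (in DN) from a black-point-subtracted, clipped dark field.
    % Assumes the true mean is exactly zero and everything below zero was
    % clipped, so the right half of the Gaussian is intact.
    d = sort(double(dark(:)));
    sigmaRead = d(round(0.8413 * numel(d)));   % 84.13th percentile is ~one sigma; no toolbox needed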

ISO testing in one-third stop increments. One of the interesting ways that cameras differ from the standard models is in the way that their performance changes as the in-camera ISO knob is adjusted. These changes, when they happen, usually occur in discrete steps. When the test points are separated by whole-stop increments, it’s often difficult to see what’s going on.

Exposure testing in one-third stop increments. This is essential to doing some of my anticipated analyses of the data set without iteration. Actually, I would prefer doing the exposure testing in tenth-stop increments, but I don’t have a convenient way to do that. So, for now, one-third stop will have to do.

Automated data set cleanup. In making the initial exposure set, with thousands of images involved, I make mistakes. Occasionally, I miss a data point. More often, I duplicate one. As you will see, the bulk of the testing involves making pairs of photographs. I want the analysis program to automatically deal with these situations.
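As an illustration of the kind of bookkeeping I mean (the field names below are hypothetical, not the program’s actual data structures), the cleanup can be as simple as binning the exposures by ISO and shutter speed, keeping exactly two frames per bin, and flagging bins that are missing a mate:

    % shots: struct array, one element per exposure, with metadata already
    % extracted into .iso, .shutter, and .file (hypothetical field names).
    keys = arrayfun(@(s) sprintf('%d_%g', s.iso, s.shutter), shots, 'UniformOutput', false);
    [ukeys, ~, idx] = unique(keys);
    pairs = {};
    for k = 1:numel(ukeys)
        members = find(idx == k);
        if numel(members) >= 2
            pairs{end+1} = shots(members(1:2));            % use the first two; extras are duplicates
        else
            warning('No mate for %s; skipping.', ukeys{k}); % missed data point
        end
    end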

Batch operation. Some conventional methods of sensor characterization require lines to be extended by hand, and human-mediated curve fitting. While the performance of any useful sensor analysis program will need to be tailored, I’d like the tailoring to be done parametrically, at set-up time, and the operation of the computer program for each analysis phase to be autonomous. Part of the reason for doing this is to make the results more repeatable by others. To be completely honest, part of the reason is to avoid having to spend much time constructing graphical user interfaces, a task which could easily end up taking more effort than the rest of the project.
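By parametric tailoring at set-up time I mean something on the order of the sketch below: a single structure of settings defined once before the batch run. The field names and the entry-point function are hypothetical placeholders:

    % Hypothetical set-up parameters for one batch run; illustrative only.
    cfg.rawDir    = '/data/sensor-test';  % where the exposure set lives
    cfg.bitDepth  = 14;                   % ADC bit depth assumed in the analysis
    cfg.cropSize  = 400;                  % side of the central crop, in pixels
    cfg.isoStepEV = 1/3;                  % ISO increment used in the exposure set
    cfg.expStepEV = 1/3;                  % exposure increment
    analyzeSensor(cfg);                   % hypothetical top-level entry point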

Automatic operation with virtually all raw files. I’d like the decoding of the raw files to be accomplished entirely under program control. I’d also like as few restrictions as possible on what files can be decoded. To that end, I’ll be using Dave Coffin’s DCRAW program, but calling it from within the main analysis program.
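For the curious, the mechanics are no more exotic than a system() call. Here is a minimal sketch, with a placeholder file name, assuming DCRAW is on the system path; it asks for an undemosaiced, unscaled, linear 16-bit TIFF and reads it back in:

    % -D: document mode, no demosaicing or scaling; -4: linear 16-bit; -T: TIFF output.
    rawFile = 'DSC01234.ARW';                      % placeholder file name
    status  = system(['dcraw -D -4 -T ' rawFile]);
    if status ~= 0
        error('dcraw failed on %s', rawFile);
    end
    [p, n]   = fileparts(rawFile);
    cfaImage = imread(fullfile(p, [n '.tiff']));   % full CFA mosaic, uint16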

An open-source project. I’m collaborating with Jack on the initial production of this tool, although I’m writing the bulk of the code myself. However, if there’s interest in this in the future, I would be delighted if other people used the tool. I’d be even more delighted if they made changes to it to make it more useful. I’d be ecstatic if they sent me those changes so that I could incorporate them into the master code set. There is one significant fly in the ointment for some of you: I’m writing the tool in Matlab, which, although it is a common image processing development environment, is neither cheap nor universal. There is a free clone of it called Octave, however.

I believe these objectives have been largely accomplished. Tomorrow we’ll get started on the details.


Comments

  1. n/a says

    December 9, 2014 at 10:42 am

    Can’t you get a MATLAB Compiler license, say for “academic” use?

    • Jim says

      December 9, 2014 at 12:22 pm

      I don’t know. I do know that some (most?) big universities have site licenses.

      Jim

      • n/a says

        December 9, 2014 at 3:21 pm

        Well, then you might have some friends who can compile your code for you once it’s polished – as it is a non-commercial endeavor it shouldn’t violate anything (put some EULA to that effect), and then people can run the tests themselves without getting ML… like people can do tests using W. Claff’s software themselves… $0.02

        • Jim says

          December 9, 2014 at 3:43 pm

          That’s a possibility down the road. The code packager is $5K, and I don’t want to spend that.

          Another problem is that the current code has no user interface at all. To make configuration changes involves editing Matlab source code.

          Jim

    • Jack Hogan says

      December 10, 2014 at 5:01 am

      After having played with Octave for a while I recently decided to purchase Matlab. I was expecting to have to shell out a couple of thousand dollars but I was pleasantly surprised to find that as of this year they have a Home version: it’s exactly the same as the more expensive option but without support (and I assume updates). This is what it displays every time you start it up:

      “Home License — for personal use only. Not for government, academic, research, commercial, or other organizational use.”

      Less than $200 did it including IPT, so definitely no longer prohibitive for experimenting photographers. I run 2014b happily on my main machine and the laptop that comes with me when I travel. Longish learning curve but loving it.

