You may have noticed a decrease in the number of my postings over the last month or so. It’s not that I’ve been idle. Quite the contrary, I’ve been working on a major project. In this post, I’d like to introduce you to it.
Usually, I work alone. However, in the effort I’m going to describe to you, a colleague has provided assistance. Jack Hogan may be familiar to you if you spend much time on the technical parts of LuLa and dpr. He is knowledgeable, writes well, and — not as common as it should be — is unfailingly patient and civil while participating in contentious discussions. Although he has written little of the code, he’s provided strategic and tactical guidance that has been invaluable. The program would be nowhere near as good as it is without his aid.
Let’s get to it.
I’ve been evaluating camera sensor systems and reporting on my findings for several years now. As time has gone by, my testing methodology has grown more sophisticated. Still, when I get a new camera to test, my procedures are somewhat haphazard, with each stage driven by what I found in the stage before.
I decided to develop a program to do automated analysis of image test files. Before I get into the details of what I’m trying to do with this program, let me be clear about what areas it does and does not address. For this tool, I am interested in what’s called the photon transfer function of the sensor. That encompasses the characteristics of the sensor in converting incident photon flux to digital codes in the raw file. The sensor parameters that this program addresses are things like the following (a sketch of the model they come from appears after the list):
- Photon response nonuniformity
- Full well capacity
- Readout noise, a.k.a. read noise
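To make those parameters concrete, here’s a minimal sketch of the standard sensor model they come from, written in Matlab since that’s the language of the tool. Every number and variable name here is a placeholder of mine, not a measurement and not anything from the program itself: the model says the total noise is the quadrature sum of read noise, photon (shot) noise, and PRNU-induced noise.

```matlab
% Minimal sketch of the standard sensor model; all values are placeholders.
signal_e = logspace(0, log10(90000), 200); % mean signal, electrons
read_e   = 3;            % read noise, electrons RMS (assumed)
prnu     = 0.005;        % photon response nonuniformity, fraction of signal
fwc      = 90000;        % full well capacity, electrons (assumed)
gain     = fwc / 16383;  % electrons per DN for a 14-bit ADC (assumed)

% Read noise, shot noise, and PRNU add in quadrature. Shot-noise variance
% equals the mean signal in electrons (Poisson statistics).
noise_e = sqrt(read_e^2 + signal_e + (prnu .* signal_e).^2);

% The photon transfer curve: noise versus mean signal, both in DN.
loglog(signal_e ./ gain, noise_e ./ gain);
xlabel('mean signal (DN)'); ylabel('noise (DN)');
```

On a log-log plot the three regimes show up as slopes of zero (read noise), one-half (shot noise), and one (PRNU), which is what makes the parameters separable from the measured data.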
This test regime specifically does not address the characteristics of the color filter array. In fact, I take pains to calibrate out those effects. It does not concern itself with the spatial properties of the sensor, except in the case of pattern errors, or when it becomes obvious that the camera manufacturer is sneaking some digital filtering into the raw processing.
My objectives for this program are:
No iteration. I want to get the boring part, the making of the test photographs, over with at the beginning of the project. I don’t want to have to return to the test stand in the middle of the analysis and make ad hoc exposures.
Standard metrics. There is a fairly well-developed science of sensor evaluation – more on that in the next post. Part of that science is the assumption of a standard sensor model, and the reporting of test results in terms of that model’s parameters.
Accurate assessments in the presence of black point subtraction. Some cameras subtract the black point before they write the raw files, clipping off the left-hand side of the read-noise histogram in a dark-field image. This makes the noise from those sensors look different from the noise from similar sensors that don’t subtract. There are ways around this. I have ignored some of them previously. I don’t want to do that anymore.
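To show what I mean, here is one workaround I know of, sketched in Matlab. I’m not saying it’s the one this program uses, and `darkField` is a hypothetical array of dark-frame samples, not a name from the program.

```matlab
% One way to estimate read noise when black-point subtraction has clipped
% the left side of the dark-field histogram: rebuild the Gaussian from the
% surviving right-hand side. darkField is hypothetical dark-frame data.
dark     = double(darkField(:));
m        = mode(dark);            % peak of the clipped distribution
right    = dark(dark >= m) - m;   % unclipped right-hand side, re-centered
mirrored = [right; -right];       % reflect to reconstruct the full Gaussian
readNoiseDN = std(mirrored);      % read noise estimate, in DN
```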
ISO testing in one-third stop increments. One of the interesting ways that cameras differ from the standard models is in the way that their performance changes as the in-camera ISO knob is adjusted. These changes, when they happen, usually occur in discrete steps. When the test points are separated by whole-stop increments, it’s often difficult to see what’s going on.
Exposure testing in one-third stop increments. This is essential to doing some of my anticipated analyses of the data set without iteration. Actually, I would prefer doing the exposure testing in tenth-stop increments, but I don’t have a convenient way to do that. So, for now, one-third stop will have to do.
Automated data set cleanup. In making the initial exposure set, with thousands of images involved, I make mistakes. Occasionally, I miss a data point. More often, I duplicate one. As you will see, the bulk of the testing involves making pairs of photographs. I want the analysis program to automatically deal with these situations.
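To give a flavor of the cleanup I have in mind, a simplified Matlab sketch follows. The struct fields and logic are illustrative, not the program’s actual code; `shots` is assumed to be a struct array with `.iso` and `.shutter` fields gleaned from the files’ metadata.

```matlab
% Group exposures by (ISO, shutter) and pair them up: extras beyond a pair
% are ignored, and singletons are dropped from pair-based analyses.
keys = arrayfun(@(s) sprintf('%d_%g', s.iso, s.shutter), shots, ...
                'UniformOutput', false);
[uniqueKeys, ~, idx] = unique(keys);
pairs = {};
for k = 1:numel(uniqueKeys)
    members = find(idx == k);
    if numel(members) >= 2
        pairs{end+1} = members(1:2); %#ok<SAGROW> % keep the first two
    end
end
```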
Batch operation. Some conventional methods of sensor characterization require lines to be extended by hand, and human-mediated curve fitting. While any useful sensor analysis program will need to be tailored, I’d like the tailoring to be done parametrically, at set-up time, and the operation of the program for each analysis phase to be autonomous. Part of the reason for doing this is to make the results more repeatable by others. To be completely honest, part of the reason is to avoid having to spend much time constructing graphical user interfaces, a task that could easily end up taking more effort than the rest of the project.
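Here’s the flavor of set-up-time tailoring I mean, as a hypothetical Matlab snippet; all of the field names and the `analyzeSensor` entry point are made up for illustration, not taken from the program.

```matlab
% Everything a run needs lives in a parameter structure that gets edited
% before the run, so each analysis phase can proceed unattended.
cfg.rawFolder    = 'D:\tests\cameraX'; % where the exposure set lives
cfg.isoStepStops = 1/3;                % ISO increment, in stops
cfg.expStepStops = 1/3;                % exposure increment, in stops
cfg.cropPixels   = 400;                % analysis crop, pixels on a side
analyzeSensor(cfg);                    % hypothetical top-level function
```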
Automatic operation with virtually all raw files. I’d like the decoding of the raw files to be accomplished entirely under program control, with as few restrictions as possible on what files can be decoded. To that end, I’ll be using Dave Coffin’s DCRAW program, calling it from within the main analysis program.
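As a sketch of how that hookup can work (these dcraw flags are real, but whether the program ends up using exactly this invocation is a detail that may change):

```matlab
% Call dcraw to decode a raw file without demosaicing or scaling:
%   -D  document mode, totally raw values
%   -4  linear 16-bit output
%   -T  write a TIFF that Matlab's imread can ingest directly
rawFile = 'DSC00123.ARW';                      % example input file
system(sprintf('dcraw -D -4 -T %s', rawFile)); % writes DSC00123.tiff
cfaData = imread('DSC00123.tiff');             % one CFA sample per pixel
```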
An open-source project. I’m collaborating with Jack on the initial production of this tool, although I’m writing the bulk of the code myself. However, if there’s interest in this in the future, I would be delighted if other people used the tool. I’d be even more delighted if they made changes to it to make it more useful. I’d be ecstatic if they sent me those changes so that I could incorporate them into the master code set. There is one significant fly in the ointment for some of you: I’m writing the tool in Matlab, which, although it is a common image-processing development environment, is neither cheap nor universal. There is a free clone of it called Octave, however.
I believe these objectives have been largely accomplished. Tomorrow we’ll get started on the details.
n/a says
Can’t you get a MATLAB Compiler license, say for “academic” use?
Jim says
I don’t know. I do know that some (most?) big universities have site licenses.
Jim
n/a says
Well, then you might have some friends who can compile your code for you once it’s polished. As it is a non-commercial endeavor, it shouldn’t violate anything (put some EULA to that effect), and then people could run tests themselves without getting Matlab, like people can run tests using W. Claff’s software themselves. $0.02
Jim says
That’s a possibility down the road. The code packager is $5K, and I don’t want to spend that.
Another problem is that the current code has no user interface at all. To make configuration changes involves editing Matlab source code.
Jim
Jack Hogan says
After having played with Octave for a while, I recently decided to purchase Matlab. I was expecting to have to shell out a couple of thousand dollars, but I was pleasantly surprised to find that as of this year they have a Home version: it’s exactly the same as the more expensive option but without support (and, I assume, updates). This is what it displays every time you start it up:
“Home License — for personal use only. Not for government, academic, research, commercial, or other organizational use.”
Less than $200 did it, including the Image Processing Toolbox, so Matlab is definitely no longer prohibitive for experimenting photographers. I run 2014b happily on my main machine and on the laptop that comes with me when I travel. Longish learning curve, but loving it.