This is the second in a series of posts on the Sony a7RIII (and a7RII, for comparison) spatial processing that is invoked when you use a shutter speed of longer than 3.2 seconds. The series starts here.
Several people have said that they aren’t getting much out of the graphs that I posted yesterday showing that the Sony a7RIII still exhibits uncontrollable spatial filtering (aka star-eating) at shutter speeds of 4 seconds and longer. I realize that observing phenomena in the spatial frequency domain is not something that comes easily to many, so I thought I’d show you a couple of images today.
These are dark-field images that Rishi Sanyal sent me. They are the same images that I used for the graphs I posted yesterday. They were made with an a7RIII with a body cap attached, at ISO 1000, in single-shot mode, with LENR off. The files were written in uncompressed raw.
Here is a somewhat-larger-than-300% look at the 3.2-second raw-red-channel image, equalized with the Photoshop “Equalize” tool.
See all those single-pixel light spots? Those are indicative of sensor pixels that have greater leakage current than most. The leakage current is also known as “dark current”, and its effect on the image is approximately proportional to exposure time, so these “hot pixels” get to be more and more of a problem as the shutter stays open longer and longer.
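As a rough illustration of why hot pixels get worse with time, a dark frame can be modeled as read noise plus a few pixels whose dark-current signal grows in proportion to exposure time, and a simple outlier test then flags far more hot pixels in the longer exposure. The frame size, noise levels, and threshold below are all invented for the sketch; they are not measurements from the a7RIII:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_dark_frame(exposure_s, shape=(64, 64), n_hot=10):
    """Toy dark frame: Gaussian read noise plus a few 'leaky' pixels
    whose dark-current signal scales with exposure time.
    All numbers are hypothetical, chosen just for illustration."""
    frame = rng.normal(0.0, 2.0, size=shape)        # read noise, in DN
    rows = rng.integers(0, shape[0], n_hot)
    cols = rng.integers(0, shape[1], n_hot)
    frame[rows, cols] += 5.0 * exposure_s           # dark current ~ time
    return frame

def count_hot_pixels(frame, k=6.0):
    """Flag pixels more than k standard deviations above the frame mean."""
    return int(np.sum(frame > frame.mean() + k * frame.std()))

n_short = count_hot_pixels(simulate_dark_frame(0.5))  # short exposure
n_long = count_hot_pixels(simulate_dark_frame(4.0))   # long exposure
```

With these toy numbers the short exposure’s leaky pixels stay buried in the read noise, while in the 4-second frame they stand well clear of it, which is the behavior visible in the equalized crops above.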
Here’s the equivalent image from the 4-second exposure, which enjoys, or suffers from, the “star-eater” algorithm:
Most of the hot pixels have been removed by the spatial filtering. That’s why Sony put it there. I personally would like to see this kind of thing done in postproduction rather than before the raw file is written to the SD card, but nobody at Sony has asked my opinion. If you’re a connoisseur of noise, you might call the 4-second exposure “mushy”. I sure would.
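Nobody outside Sony knows exactly what the filter does, but its effect on isolated bright pixels can be mimicked by clamping each pixel to the brightest of its neighbors. The sketch below is a guess at the behavior, not Sony’s actual firmware: it operates on a single raw channel and treats immediately adjacent pixels as neighbors, which simplifies the real Bayer-mosaic geometry:

```python
import numpy as np

def clamp_to_neighbors(img):
    """Hypothetical star-eater-style filter: no pixel may be brighter
    than the brightest of its four adjacent same-channel neighbors.
    Isolated single-pixel spikes (hot pixels -- or stars) get clamped;
    features two or more pixels wide survive."""
    up = np.roll(img, 1, axis=0)
    down = np.roll(img, -1, axis=0)
    left = np.roll(img, 1, axis=1)
    right = np.roll(img, -1, axis=1)
    neighbor_max = np.maximum.reduce([up, down, left, right])
    out = img.copy()
    mask = img > neighbor_max
    out[mask] = neighbor_max[mask]
    return out

# A lone bright pixel is eaten...
lone = np.zeros((5, 5))
lone[2, 2] = 100.0
eaten = clamp_to_neighbors(lone)

# ...but a two-pixel-wide spot keeps its peak value.
pair = np.zeros((5, 5))
pair[2, 2] = pair[2, 3] = 100.0
kept = clamp_to_neighbors(pair)
```

Note that the filter cannot distinguish a single-pixel star from a hot pixel: anything one pixel wide is treated as noise.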
Note that the hot pixels in the 3.2-second image could just as well have been stars, and the camera would have happily removed them. It can’t tell the difference between dark current and real signal.
Now we’re going to take some baby steps into the frequency domain. I will show you the magnitude portion of the Fourier transform of the two captures.
First, 3.2 seconds:
This image and the one immediately below have been equalized in Photoshop. The Fourier transform of white Gaussian noise is itself white Gaussian noise, so this doesn’t look all that different from the image before it was transformed into the frequency domain.
But things are different in the 4-second image:
See that hot spot in the middle? What that’s about will become apparent after I give you a little tour of the spatial frequency domain. The center of the image is zero frequency, also known for arcane reasons — that go back at least as far as Thomas Edison — as dc. As you move away from the center, the spatial frequencies get higher. More energy is shown as lighter pixels. So that hot spot means that there is more energy at low frequencies than at high frequencies, which is the smoking gun for detecting star munching.
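The signature described above can be reproduced numerically: the centered FFT magnitude of white Gaussian noise is, on average, flat, while low-pass-filtered noise piles its energy up near dc in the middle of the transform. The box blur below is just a stand-in for whatever spatial filtering the camera applies; the patch sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def fft_magnitude(img):
    """Centered 2-D FFT magnitude: fftshift puts dc in the middle."""
    return np.abs(np.fft.fftshift(np.fft.fft2(img)))

def center_to_edge_ratio(mag, r=8):
    """Mean magnitude near dc vs. in a corner (highest frequencies).
    White noise gives a ratio near 1; low-pass-filtered noise gives
    a ratio well above 1 -- the 'hot spot in the middle'."""
    h, w = mag.shape
    cy, cx = h // 2, w // 2
    center = mag[cy - r:cy + r, cx - r:cx + r].mean()
    edge = mag[:r, :r].mean()
    return center / edge

white = rng.normal(size=(128, 128))

# Crude 3x3 box blur as a stand-in low-pass filter (wraps at edges).
blurred = sum(np.roll(np.roll(white, dy, 0), dx, 1)
              for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0

ratio_white = center_to_edge_ratio(fft_magnitude(white))
ratio_blurred = center_to_edge_ratio(fft_magnitude(blurred))
```

The blurred frame’s ratio comes out several times larger than the white frame’s, which is the same dc hot spot visible in the 4-second capture’s transform.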
DrewG says
The data is definitely there and it’s apparent that spatial filtering is still active to a degree, but how about an actual star test? You can test the back of a body cap for days, but shooting the real thing is a whole different story.
Also, judging your graphs from the A7RII vs RIII, they seem to have quite a difference in freq/db levels when compared to each other (the RIII in favor of a cleaner image, albeit with spatial filtering turned on).
JimK says
Shooting real stars is indeed entirely different, and not in a good way. You need a tracker, clear air, and good technique for real stars. And experiments are never precisely repeatable. Artificial stars are better, but I don’t have the requisite equipment.
And there’s one thing that is very important that doesn’t seem to have sunk in. I don’t have access to an a7RIII at this point. I got the dark-field files from Rishi.
JimK says
“Also, judging your graphs from the A7RII vs RIII, they seem to have quite a difference in freq/db levels when compared to each other (the RIII in favor of a cleaner image, albeit with spatial filtering turned on).”
Here’s an apples-to-apples comparison:
http://blog.kasson.com/the-last-word/sony-a7rii-and-a7riii-star-eating-under-the-microscope/
Victor Trasvina says
Thanks for all your hard work Jim!
Lynn Allan says
>> don’t have access to a7Riii
Good point … eager? (assuming you put in pre-order or trade-in)
Would you anticipate that use of sensor-shift ultra-resolution would make a difference in star-eating?
Semi-related: Is there a reason that ISO 1000 was used rather than what would seem to be preferred ISO 640 with the a7Riii being essentially ISO-less (from ISO 100 thru 500, and then above 640), and the Aptina dual gain electronics?
JimK says
The choice of ISO 1000 was to generate white noise as a stimulus. Choosing ISO 640 would have generated less noise, and probably more fixed-pattern read noise, which isn’t what I wanted, since FPRN tends to have a strong low-frequency component.
JimK says
There are so many fantastic cameras now — a7RII, now the III, D810, now the D850, GFX, a9 — that I am becoming a bit jaded. It’s getting to the point that the cameras and lenses aren’t even close to what’s limiting my photography.
JimK says
No. I think Sony probably does the same processing on each of the 4 captures.