After several days of trying, I can’t focus the Nikon D810 accurately enough to get the focus-error standard deviations below the repeated-exposure standard deviations.
The first thing I tried was to change the target from a Siemens star to a combination of the star and a zone plate:
You can’t see the focusing target very well, so I’ll blow it up:
When the target is in focus, the zone plate is a riot of aliased signal, with magenta and green false color that varies with focus. It is certainly easier to focus on than the Siemens star, and it provides better consistency.
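For readers who want to experiment with this themselves, a zone plate is easy to synthesize. Here is a minimal sketch in Python with NumPy; the grid size and chirp rate are illustrative assumptions, not the parameters of the actual target used here. The key property is that the pattern’s radial spatial frequency rises with distance from the center, so at some radius it crosses the sensor’s Nyquist limit and aliases:

```python
import numpy as np

# Sketch of a zone-plate test target (assumed parameters, not the
# actual target from the post): intensity follows cos(k * r^2), so the
# instantaneous radial frequency grows linearly with radius and will
# eventually alias against any sampling grid.
def zone_plate(size=512):
    half = size // 2
    y, x = np.mgrid[-half:half, -half:half].astype(float)
    r2 = x * x + y * y
    # Scale so the pattern reaches high frequencies near the edges.
    return 0.5 + 0.5 * np.cos(np.pi * r2 / size)

target = zone_plate()
print(target.shape)  # 512 x 512 array with values in [0, 1]
```

When such a pattern is printed and photographed in focus, the aliased region shows the shimmering false color described above; defocus suppresses the high frequencies and the aliasing disappears, which is what makes it a sensitive focusing aid.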
Is it as good as the focus peaking on the Sony a7R?
Is it at least good enough to have repeatability much better than the differences between cameras that we’re trying to spot?
I tried hooking the D810 up to an external monitor through the HDMI port. Unfortunately, at full magnification, the resolution on the attached monitor is exactly the same as on the LCD display on the back of the camera. The image is bigger, but since I was already using a loupe on the LCD screen, focusing is no more accurate.
I give up, at least for now.
All of this brings up the issue of achieving all of the resolution that lenses like the Otus and cameras like the a7R and the D810 can deliver in real-world photography. First off, if center sharpness is your goal, diffraction means you can’t stop down any further than f/4. Second, the depth of field at that aperture with 36 MP is tiny. Third, much as we’d like to have our subjects walk around with little zone plates stuck to critical places about their persons (we could have artificial intelligence in our image editors to find the targets and eliminate them with content-aware fill), that’s probably not practical. Then there’s camera and subject motion, atmospherics for long lens work, etc.
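To get a feel for just how thin that depth of field is, here is a back-of-the-envelope sketch using the standard thin-lens approximation DOF ≈ 2·N·c·d²/f², valid when the subject distance is well short of the hyperfocal distance. The specific numbers are illustrative assumptions on my part: a 55 mm lens at f/4, a subject at 2 m, and a circle of confusion of one D810 pixel pitch (about 4.88 µm on the 36 MP full-frame sensor):

```python
# Back-of-the-envelope depth of field via the thin-lens
# approximation DOF ~= 2*N*c*d^2 / f^2.
# All specific numbers are illustrative assumptions, not measurements.
def dof_meters(f_mm, n, coc_um, d_m):
    f = f_mm / 1000.0   # focal length, meters
    c = coc_um / 1e6    # circle of confusion, meters
    return 2.0 * n * c * d_m * d_m / (f * f)

# 55 mm lens, f/4, 2 m subject distance, ~4.88 um CoC (one pixel pitch)
print(f"{dof_meters(55, 4, 4.88, 2.0) * 100:.1f} cm")
```

With a pixel-level criterion for sharpness, the total depth of field comes out on the order of a few centimeters, which is why pixel-accurate focus at these resolutions is so unforgiving.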
For most photography, you’re just not going to get all the sharpness that an Otus and a D810 can deliver.
So why bust a gut trying to measure it?
Good question. I’m torn.