
the last word

Photography meets digital computer technology. Photography wins -- most of the time.


Handholding the a7 and a7R, part 1

March 8, 2014 · JimK

You may have noticed that I haven’t posted much for the last week or so. I’ve spent the time trying to come up with a way to measure the absolute and relative image sharpness of the a7 and a7R when they’re not firmly mounted to a tripod. I had been resisting this exercise in the past because handheld images exhibit a lot of variability, and in order to make sense of things, I’d need to make a lot of exposures, do a lot of analysis, and compute statistics on the results. It sounded like a big time sink. And the results would apply only to my handholding ability on that day with that many cups of tea in me. However, several people have been urging me to do some work on handheld use of the two cameras, so I set about developing a test protocol.

My first idea was to figure out a way to remove the variability from the testing by taking the living, breathing, shaking photographer – that would be moi – out of the picture. I experimented with mounting cameras wrapped in neoprene in a drill vise. That would simulate the soft, damped hold on the camera that a real person would have, but it wouldn’t shake like a real person. The results would represent an upper bound on the sharpness that an actual user might experience. The project proceeded in a manner that gave me serious doubts that I was on the right track. The damping of the neoprene seemed to be less than one observes with real handholding. Also, I couldn’t figure out a way to simulate the way a photographer actually holds a camera, with force applied from the bottom and top as well as the sides. I abandoned that approach.

My second try was to make a series of images of the ISO 12233 test target, analyze them by eye to find the point on the upper right resolution cross where the contrast just faded to zero, and compute statistics on the result. That produced interesting results, but there was sufficient variation that I knew I needed more samples, and it took so long that I knew I’d drive myself nuts if I persisted. I also thought the zero-contrast resolution was the wrong thing to measure. I really wanted the 50% contrast resolution, but I didn’t have a way to eyeball that.

I needed to find a way to have a machine analyze the test shots.

One big problem with computer image analysis of handheld images is what the folks in the machine vision biz call “region of interest (ROI) selection.” When you make a series of photographs from a tripod, you can find what you’re looking for in the first shot, and that’s where it is in all the others. When you’re handholding, that’s not true. Finding the ROI in a hundred images is a PITA. Doing that eight or ten times is something I wouldn’t wish on my worst enemy.

I like to write software, so I thought briefly – very briefly – about writing the analysis code myself. I’ve never written any machine vision code, but I’ve been around people who have long enough to have a healthy respect for both their abilities and the difficulty of the problem. So I decided to look for an image analysis tool instead. It wasn’t much of a search. There is a well-respected, venerable tool available: Imatest.

I purchased the software, picked a likely target to use to get the 50% contrast resolution (actually a related measure called MTF50), and made a series of images. The ROI moved around too much. So I brought out the big guns. There’s a series of test targets that Imatest calls SFRplus. For those targets, Imatest will do automatic ROI identification.

I printed an appropriate target at the right size, made an initial series of test images, developed them in Lightroom, and set up the parameters for automatic analysis in Imatest. That looked good, and that’s what I’m doing in this and subsequent posts on this topic.

Before I post the results of handheld testing with the a7 and a7R, I needed to establish a baseline: a set of results that indicated what the cameras and lens – I used the same lens on both cameras – could do when camera motion was, ahem, out of the picture.

I set up an Arca Swiss C1 Cube on RRS TVC-44 legs. For illumination, I used a Paul Buff Einstein, set 6 stops down from full power for a t.1 of about 1/5000 second, triggered from the camera hot shoe through a PC adapter. I used an RRS L-bracket on the cameras, set up in landscape orientation. I used the Sony/Zeiss 55mm f/1.8 Sonnar FE, focused wide open on the Siemens star in the target and then stopped down to f/5.6. I used trailing curtain sync with a shutter speed of 1/25 second for the a7R so that any shutter vibration could damp out before the flash fired. I set the shutter speed of the a7 to 1/125, since it doesn’t need any help in the vibration department for strobe images.

Here’s a sample image:

[Image: _DSC2050]

I brought the raw files into Lightroom 5.3, and developed them with the default settings except for white balance, which I set from the white of the target, and exposure, which I boosted fractionally. The default settings for Lr include some sharpening and some noise reduction, both of which can affect the MTF50 results. Rather than neutralize these settings, I left them at default, figuring that most people would use their raw developer with some sharpening and noise reduction. I cropped the files to the approximate location of the target, leaving it to Imatest to find the ROI. I exported the files as 16-bit TIFFs.

In Imatest, I set up the program to look for horizontal edges, which are the most problematical for the a7R in landscape orientation, given the vertical motion of the camera’s shutter. Incidentally, these are also the edges that the a7 renders with greater sharpness than vertical ones, since the a7’s anti-aliasing (AA) filter is stronger horizontally than vertically.

Here is an Imatest result for an a7 image:

[Image: a7-3_YB11_01_cpp]

And here’s the Imatest analysis of an a7R image:

[Image: a7r-3_YB12_01_cpp]

A lot of information, huh? The number that I’m concentrating on is MTF50, which Imatest reports in cycles per pixel. For the a7, it’s 0.364, and for the a7R, it’s 0.355. However, since the cameras have different resolutions, that doesn’t tell the whole story. What we’d like, if we are to compare the sharpness of the two cameras, is the MTF50 measured in cycles per some length, where that length is the same proportion of the horizontal, vertical, or diagonal dimension of each camera’s sensor. Let’s say that the length we want is the pixel pitch of the a7’s sensor. Then we need to multiply the a7R number by the ratio of the two cameras’ pixel pitches, about 1.22, to get an apples-to-apples number. That gives us 0.433 for the a7R, indicating that its corrected MTF50 is 19% higher than the a7’s.
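The pitch correction amounts to a simple unit conversion. Here’s a minimal sketch in Python, assuming the commonly published approximate pixel pitches of 5.97 µm for the a7 and 4.88 µm for the a7R; the function name is my own, not anything from Imatest:

```python
# Convert Imatest MTF50 from cycles/pixel to cycles/mm on the sensor,
# so sensors with different pixel pitches can be compared directly.

def mtf50_cycles_per_mm(mtf50_cy_per_px, pitch_um):
    """MTF50 in cycles/mm, given MTF50 in cycles/pixel and pitch in microns."""
    return mtf50_cy_per_px * 1000.0 / pitch_um

a7 = mtf50_cycles_per_mm(0.364, 5.97)    # a7: ~5.97 um pitch (assumed)
a7r = mtf50_cycles_per_mm(0.355, 4.88)   # a7R: ~4.88 um pitch (assumed)

print(round(a7, 1), round(a7r, 1))       # cycles/mm for each camera
print(round(a7r / a7 - 1, 2))            # fractional a7R advantage -> 0.19
```

Working in cycles/mm gives the same 19% answer as multiplying the a7R’s cycles-per-pixel figure by the 1.22 pitch ratio; the two are equivalent up to rounding.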

Enough preliminaries. Next up, handheld results.


Unless otherwise noted, all images copyright Jim Kasson.