
On vibration control

January 21, 2014 · JimK

When I was working at Hewlett-Packard in the early 70s, I remember walking through optical labs and noticing the benches the engineers used to set up experiments. The most striking thing was the top: a huge slab of four- or six-inch-thick dark-gray granite, pockmarked with holes on a regular grid. The bench had many qualities important to the experimenters: resistance to thermal expansion, long-term dimensional stability, rigidity, flatness, hardness. But one relates to photography: vibration control.

I thought of the optical bench when I was considering the problem of stabilizing a camera during the exposure. That line of thinking was, as those who have been following this blog know, triggered by my problems getting sharp images with the Sony a7R, and, to a lesser extent, with the Nikon D800E. The vibration problems solved by the optical bench and the tripod/head/QR clamp/QR plate are related, but far from identical. The lasers, mirrors, lenses, and instrumentation on an optical bench are not usually the predominant source of vibratory motion. A lot of the vibration that needs to be dealt with comes from the environment in which the bench finds itself. The main source of vibration a photographer in a windless environment has to deal with is within the camera. In fact, a common photographic assumption, which may or may not be true, is that whatever surface the tripod rests on does not move with respect to the subject.

You could say that the distances involved are far different. You’d be right, but maybe not by as much as you think. People working with optical benches think in terms of fractions of the wavelength of the light they’re using. Consider that the pixel pitch of the a7R or the D800E is 6 or 7 times the wavelength of red light. When Lloyd Chambers talks about 1/5 of a pixel of movement, he’s talking about a little over one wavelength.

Regardless of the differences between the job of an optical bench and a camera support system, thinking about how an optical bench does its job might be useful background to any experimentation with camera vibration control.

I did a little web research, and found that things have changed a lot in forty years. Granite is no longer the preferred surface material, having been replaced by sandwiches with stainless steel “bread” and honeycomb “filling”. Think of an airplane skin, or a Hexcel ski. The tables are mounted on “legs” that absorb vibration. There is a lot of information on environmental vibration and table design on the web. I commend two papers to you: a reasonably-quick read by someone who works for a table manufacturer and a deeper dive that’s still comprehensible to people who aren’t mechanical engineers.

Here’s what’s similar between the jobs of the optical bench and the camera support.

Resonance control. Neither the table nor the camera support should introduce resonances of its own.

Damping. The table, and the camera support, should damp any vibration introduced by the equipment mounted on it, and should do that with minimal motion. If the table, or the camera support, does resonate at frequencies excited by the mounted equipment, it should damp those resonances as well.

And here’s what’s different:

Isolation. The optical table needs to be isolated from its environment. If you think about it, this isolation can’t – and needn’t – take place at all frequencies. If the surface on which the optical table rests rises over a period of several days, there is no passive mechanism that can compensate for that, nor is there any need to. Extending this thinking to higher frequencies, the isolation of the table needs to exclude frequencies for which the table surface is not rigid.

This differentiation of the effect of various frequencies extends to the tripod-mounted camera in the following way. Think of the subject of the photograph and the tripod-mounted camera as being mounted to the same base. Any vibration of the base should raise or lower the subject and the camera identically, and should not change the direction in which the camera is pointing with respect to the subject.

Some math makes for some sobering conclusions.

It’s not so bad if we only look at displacement. Assume that the speed of sound in the surface on which the camera and the subject rest is 2500 ft/sec, and that the frequency of the vibration is 10 Hz. That makes the wavelength 250 feet, so if the camera and the subject are within 5 feet of each other, the phase angle between them is (5/250)*360, or 7.2 degrees. The maximum differential displacement of the camera and the subject is therefore the peak amplitude of the vibration times the sine of 7.2 degrees, or about 1/8 of the peak amplitude. Typical building vibrational amplitudes run from 0.01 inches down to 10 micro-inches. The vibration’s effect at the sensor is scaled down by the magnification ratio. If the image on the sensor is a tenth the size of the actual subject, then the maximum displacement on the sensor runs from about 125 micro-inches down to about 125 nano-inches. Converting to metric, that’s roughly 3 micrometers down to 3 nanometers. Thus the worst-case error is roughly two-thirds of the sensor pitch on the D800E or a7R. If your shutter speed is not faster than a quarter of the vibration’s period, or 1/40 second with our 10 Hz example, you may see all of the image shifting calculated above. If it’s slower than half the vibration’s period, or 1/20 second in our example, you’re likely to see twice those numbers.
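
For anyone who would like to check the arithmetic, here is a short Python sketch of the displacement calculation. The input values are just the example assumptions above (2500 ft/sec, 10 Hz, 5 feet of separation, 1:10 magnification), not measurements of any particular floor or camera.

import math

# Example values from the post -- assumptions for illustration, not measurements
speed_of_sound_ft_s = 2500.0   # assumed speed of sound in the supporting surface
frequency_hz = 10.0            # assumed vibration frequency
separation_ft = 5.0            # camera-to-subject distance
magnification = 0.1            # image on the sensor is 1/10 the size of the subject

wavelength_ft = speed_of_sound_ft_s / frequency_hz   # 250 ft
phase_deg = 360.0 * separation_ft / wavelength_ft    # 7.2 degrees
fraction = math.sin(math.radians(phase_deg))         # about 1/8

for amplitude_in in (0.01, 10e-6):                   # typical building vibration amplitudes
    sensor_in = amplitude_in * fraction * magnification   # differential motion at the sensor
    print(f"{amplitude_in:g} in floor amplitude -> {sensor_in * 1e6:.3g} micro-inches "
          f"({sensor_in * 25400.0:.3g} micrometers) at the sensor")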

However, the tripod legs don’t rest on the same point on the floor, and moving one leg up while another moves down will cause the camera to point at a different place on the subject. Let’s assume the same 10 Hz vibration as above. Assuming the camera tilts with the floor underneath it, the worst-case upward or downward tilt, in radians, is 2 times pi times the peak amplitude of the vibration divided by its wavelength. For our 0.01 inch amplitude and 250-foot (3000 inch) wavelength, that works out to about 21 micro-radians, or 0.0012 degrees, in each direction. With the subject 5 feet (60 inches) away, this translates to plus and minus about 0.0013 inches on the subject, and a tenth of that, about 130 micro-inches or 3.2 micrometers, on the sensor. Double this to get the peak-to-peak variation. That’s more than a pixel on the two 36-megapixel cameras we’re discussing, and essentially the same size as the displacement error. As above, you can cut these numbers in half by using a shutter speed of 1/40 second. [I’d appreciate it if any interested people would check my math.]
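
Here is the same kind of sketch for the tilt arithmetic, again using only the example assumptions from this post. Treat it as a check on the numbers above, not as a model of any real tripod or floor.

import math

# Same example assumptions as above -- for illustration only
amplitude_in = 0.01             # peak floor vibration amplitude
wavelength_in = 250.0 * 12.0    # 250-foot wavelength (2500 ft/sec at 10 Hz), in inches
subject_distance_in = 60.0      # 5 feet from camera to subject
magnification = 0.1             # image on the sensor is 1/10 the size of the subject

tilt_rad = 2.0 * math.pi * amplitude_in / wavelength_in       # worst-case floor slope, in radians
subject_shift_in = tilt_rad * subject_distance_in             # pointing error at the subject
sensor_shift_um = subject_shift_in * magnification * 25400.0  # inches to micrometers

print(f"tilt: {tilt_rad * 1e6:.1f} micro-radians ({math.degrees(tilt_rad):.4f} degrees)")
print(f"shift: {subject_shift_in:.4f} in at the subject, {sensor_shift_um:.1f} micrometers at the sensor")
print(f"peak-to-peak at the sensor: {2 * sensor_shift_um:.1f} micrometers")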

Another way to come at this is to look at the vibration criteria that engineers have developed to figure out how little vibration you can tolerate when making certain kinds of measurements. Here’s a good paper on the subject. Take a look at the table at the top of page three and note that, for details of 3 micrometers, a little smaller than our sensor pitch, you need to keep vibration velocities under 1000 micro-inches per second.
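
As a rough cross-check against that criterion, here is a sketch that converts the example floor amplitudes used earlier into peak velocities (for a sinusoid, peak velocity is 2 times pi times frequency times amplitude) and compares them to a 1000 micro-inch-per-second limit. The limit is the figure from the cited table; everything else is just the example numbers from this post.

import math

criterion_uin_s = 1000.0    # velocity limit from the cited table, for roughly 3 micrometer detail
frequency_hz = 10.0         # same example frequency as above

for amplitude_in in (0.01, 10e-6):    # the building amplitudes used earlier in the post
    peak_velocity_uin_s = 2.0 * math.pi * frequency_hz * amplitude_in * 1e6   # peak velocity of a sinusoid, micro-in/sec
    verdict = "within" if peak_velocity_uin_s <= criterion_uin_s else "well above"
    print(f"{amplitude_in:g} in at {frequency_hz:g} Hz -> "
          f"{peak_velocity_uin_s:,.0f} micro-in/sec, {verdict} the criterion")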

The take-home lesson here is that, now that our cameras have such great resolving power, there are fundamental vibrational limitations that may keep us from using all the capability they offer. We may usually operate in environments with well under a hundredth of an inch of peak vibration, but distant trains or subways, motor vehicle traffic, swaying buildings, and the like can generate amplitudes higher than that.

Now that we’ve discussed the things we can’t control even with a 2-ton tripod and a milled-from-a-solid-block-of-diamond head, let’s move on to the things we can, and to how an optical bench and a camera support handle them similarly. Those things are resonances and damping.

I think I’ll let that wait until tomorrow.
