
the last word

Photography meets digital computer technology. Photography wins -- most of the time.


Camera vs sensor resolution

May 4, 2014 JimK

There’s been a lot of discussion on the web about the relationship between lens and sensor resolution. Questions often go something like, “Is putting lens A on camera B a waste of money, since lens A can resolve X while camera B can only resolve Y?” I looked around and found that there’s a body of engineering knowledge on the subject. Since it comes from the scientific imaging world, it’s aimed at equipment and usage scenarios different from those of normal photography, but it’s worth looking at.

The first assumption that’s different from most of our cameras is that the sensor is monochromatic. Thus there’s no Color Filter Array (CFA). This certainly makes the analysis easier, but we’ll need to make some adjustments to make it apply to Bayer-patterned sensors. In the applications where this approach originates, either monochromatic imaging is suitable for the task, or a filter wheel can be used to obtain color information.

The next assumption is that the lenses are diffraction-limited in their resolving power. In some scientific and industrial applications, that’s a pretty good assumption. However, many lenses commonly used in photography are not diffraction limited at wider apertures.

The last assumption is that there’s no anti-aliasing (AA) filter on the sensor. While not common in normal digital cameras, an increasing number meet this test, among them the Leica M9 and M240, the Nikon D800E, the Sony a7R, and all medium-format cameras and backs.

The tradeoff between lens and sensor resolution can be captured in a quantity represented by the letter Q. I will walk through some of the concepts involved in creating and using this metric.

The first concept we need to understand is spatial resolution: the smallest separation between two image objects that still allows them to be resolved as distinct. We need to tighten up that definition in order to use it to quantify resolution, and the first version of that was developed in 1879 by Lord Rayleigh. The Rayleigh criterion is met when one point lies on the first zero of the Airy disk (http://en.wikipedia.org/wiki/Airy_disk) of the other point.

Here’s a plot of a cross-section of the Airy disk against x, where x = k*a*sin(theta), k = 2*pi/wavelength, a is the radius of the lens aperture, and theta is the angle from the lens axis to the point in the image plane where the intensity is desired.

[Figure: cross-section of the Airy disk]

The first zero is at x = 3.83. If we plot two Airy functions that far apart – at the Rayleigh criterion – and add them together to get the red curve, here’s the intensity vs. x:

[Figure: two Airy disks separated by the Rayleigh criterion, with their sum in red]

There is also another, less rigorous (in terms of separation required) spatial resolution criterion, the Sparrow criterion: the separation of the Airy disks at which the value at the midpoint between the two disk centers just begins to dip as the disks are moved apart. That occurs at x = 2.97, and the summed Airy disks (the red curve) look like this in cross-section:

[Figure: two Airy disks separated by the Sparrow criterion, with their sum in red]

For anybody who wants to look behind the curtain, here’s the Matlab code to generate the above plots:

[Figure: MATLAB code listing]
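The original listing was posted as an image; here’s a rough Python equivalent (my sketch, not the author’s code) that computes the Airy cross-section and the summed pairs, using only the standard library so no SciPy Bessel functions are needed:

```python
import math

def bessel_j1(x):
    """First-kind Bessel function J1 via its integral representation,
    J1(x) = (1/pi) * integral_0^pi cos(t - x*sin(t)) dt,
    evaluated with the trapezoidal rule."""
    n = 4000
    h = math.pi / n
    total = 0.5 * (math.cos(0.0) + math.cos(math.pi))  # endpoint terms
    for i in range(1, n):
        t = i * h
        total += math.cos(t - x * math.sin(t))
    return total * h / math.pi

def airy_intensity(x):
    """Normalized Airy pattern cross-section: I(x) = (2*J1(x)/x)^2, I(0) = 1."""
    if abs(x) < 1e-9:
        return 1.0
    return (2.0 * bessel_j1(x) / x) ** 2

def summed_pair(x, separation):
    """Sum of two Airy patterns whose centers are `separation` apart in x."""
    return airy_intensity(x - separation / 2) + airy_intensity(x + separation / 2)
```

Evaluating `summed_pair` over a range of x for separations of 3.83 and 2.97 reproduces the red curves above: at the Rayleigh separation the midpoint dips to roughly 73% of the peaks, while below the Sparrow separation the midpoint is a single maximum.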

For photographers, x, as defined above, is not a particularly satisfying quantity. We don’t normally think in terms of angles from the lens axis unless we’re worried about coverage. We think about f-stops, not the radius of the aperture. We normally don’t think about the wavelength of light at all. Is there a way to cast the Airy disk dimensions and the two separation criteria into something that makes sense to us?

There is indeed.

After some mathematical manipulation, the Rayleigh criterion is:

R = 1.22 * lambda * N, where lambda is the wavelength of the light, and N is the f-stop.

So, for green light at 550 nanometers and the lens set at f/8 – a point where a really good photographic lens is likely to be diffraction-limited – R is 5.4 micrometers, or about the pixel pitch of modern sensors. What if we’re more stringent? For blue light at 380 nm, and the lens set to f/5.6, R is 2.6 micrometers, which is finer than the pixel pitch of current full-frame and larger sensors, though not tighter than the pitch of some smaller sensors.
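The arithmetic in that paragraph can be checked with a one-liner; here’s a quick Python sketch (wavelength in nanometers in, micrometers out):

```python
def rayleigh_separation_um(wavelength_nm, f_number):
    # Rayleigh criterion: R = 1.22 * lambda * N,
    # with the nanometer wavelength converted to micrometers.
    return 1.22 * wavelength_nm * f_number / 1000.0

print(round(rayleigh_separation_um(550, 8), 1))    # green light, f/8  -> 5.4
print(round(rayleigh_separation_um(380, 5.6), 1))  # blue light, f/5.6 -> 2.6
```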

The Sparrow criterion boils down to:

S = 0.947 * lambda * N, where lambda is the wavelength of the light, and N is the f-stop.

Optical engineers usually cavalierly round the 0.947 to one, giving

S = lambda * N

The Sparrow criterion is tougher than the Rayleigh criterion. For our f/8 lens in green light, S is 4.4 micrometers, and for the f/5.6 lens in blue light, S is 2.1 micrometers.
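The same check for the rounded Sparrow formula:

```python
def sparrow_separation_um(wavelength_nm, f_number):
    # Sparrow criterion with the usual engineering rounding: S = lambda * N,
    # nanometers in, micrometers out.
    return wavelength_nm * f_number / 1000.0

print(round(sparrow_separation_um(550, 8), 1))    # green light, f/8  -> 4.4
print(round(sparrow_separation_um(380, 5.6), 1))  # blue light, f/5.6 -> 2.1
```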

So we’re done, right? Lenses with Sparrow criteria tighter than our sensors will be limited by the sensor, and lenses with Sparrow criteria looser than the sensor will be limited by the lens? That’s sort of right, if you take account of Dr. Nyquist’s teaching that you need two samples of a cycle of some spatial frequency in order to reconstruct the original.
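That bookkeeping can be sketched in a few lines, under the illustrative assumption (mine, not a claim from the engineering literature quoted here) that the Sparrow separation S is the finest spatial period the lens delivers, so Nyquist wants a sample every S/2:

```python
def limiting_factor(sparrow_um, pixel_pitch_um):
    # Nyquist: reconstructing a spatial period of S requires two samples per
    # cycle, i.e. a pixel pitch of S/2 or finer. Illustrative model only; it
    # ignores the CFA and treats 1/S as a hard lens cutoff frequency.
    if pixel_pitch_um <= sparrow_um / 2.0:
        return "lens-limited"
    return "sensor-limited"

# f/8 in green light (S = 4.4 um) on a hypothetical 4.5 um pitch sensor:
print(limiting_factor(4.4, 4.5))  # -> sensor-limited
```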

More soon.



Unless otherwise noted, all images copyright Jim Kasson.