LensRentals and my CO 60/4 head to head

This is a continuation of testing of the following macro lenses:

  • Sony 90mm f/2.8 FE Macro
  • Leica 100mm f/2.8 Apo Macro-Elmarit-R
  • Zeiss 100mm f/2 Makro-Planar ZF
  • Nikon 105mm f/2.8 Micro-Nikkor G VR
  • Coastal Optical 60mm f/4 UV-VIS-IR

Focus shift and LoCA in the Leica-R 100/2.8 Apo Macro

 

I figured that the best way to get an apples-to-apples comparison of the LensRentals Coastal 60 and my copy was to ignore the numbers on the aperture setting ring, and use the camera’s light meter to estimate the f-stops. It’s a bit crude, what with the meter reading in 1/3 stop increments, but it’s what I’ve got.
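For what it’s worth, here’s the arithmetic that kind of matching relies on, as a minimal sketch (my own illustration with made-up numbers, not part of the test itself): at constant ISO and lighting, the shutter speed the meter picks tells you the effective f-number relative to a known reference.

```python
import math

def effective_f_number(ref_f_number, ref_shutter_s, metered_shutter_s):
    """Infer the effective f-number from the metered shutter speed.

    At constant ISO and illumination, exposure is proportional to t / N^2,
    so matching exposures implies N = N_ref * sqrt(t / t_ref).
    """
    return ref_f_number * math.sqrt(metered_shutter_s / ref_shutter_s)

# Hypothetical example: the meter asks for 1/3 stop more exposure time
# than it did at a known f/4 setting, so the lens is about 1/3 stop slower.
print(round(effective_f_number(4.0, 1 / 250, (1 / 250) * 2 ** (1 / 3)), 2))  # ~4.49
```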

Longitudinal chromatic aberration (LoCA) at f/4 through f/11, in pairs, with the LensRentals lens being the first of each pair:

Loca LRCO 4

loca my co 4

loca lr co 56

loca my co 56

loca lr co 8

loca my co 8

loca lr co 11

loca my co 11

 

Now the lenses look a lot more alike.

White-balanced focus shift for the two lenses:

fa lr co wb 2

fs my co wb

My lens peaks at f/5.6, which is what I’d expect. The LensRentals lens peaks at f/8, which is not what I’d expect. Of course, doing this in whole-stop intervals is kind of crude, but the test is time-consuming enough as it is.

I think I’m going to tell myself to be happy with the Coastal 60/4 that I have.


Bad copy of the CO 60/4?

This is a continuation of testing of the following macro lenses:

  • Sony 90mm f/2.8 FE Macro
  • Leica 100mm f/2.8 Apo Macro-Elmarit-R
  • Zeiss 100mm f/2 Makro-Planar ZF
  • Nikon 105mm f/2.8 Micro-Nikkor G VR
  • Coastal Optical 60mm f/4 UV-VIS-IR

Focus shift and LoCA in the Leica-R 100/2.8 Apo Macro

Earlier I reported on more focus shift than seemed right in the Coastal 60mm f/4 Macro lens:

Focus shift and LoCA in the Coastal 60/4 at 1:10

I got a copy of the CO 60/4 from LensRentals yesterday to see if it measured differently than my copy.

The answer?

Not by much.

The LoCA curves are excellent, and I won’t bore you with them.

Here’s the white balanced focus shift curve:

LR CO 60 focus shift wb

You’ll note that the f/4 and the f/5.6 curves are right on top of each other. The exposure was the same for those two runs, too. It looks like the diaphragm blades aren’t moving much between those settings. My very own CO 60 only shows a third-stop drop in light when you move the aperture ring from f/4 to f/5.6, and the focus shift curves for those two stops were very close.

It’s now looking like either the focus shift is designed into the lens (based on a sample size of two, which isn’t really enough for any kind of certainty), or there is a systematic manufacturing problem.

By the way, I find it odd that the f/11 numbers are better than the f/8 numbers, and I put this down to the lens really being at f/8 when it’s set at f/11 and f/5.6 when it’s set at f/8. I also think my labeled aperture values in the curves for the CO lens that I previously presented were off for the same reason, just not as far.

Measuring the entrance pupils with dial calipers — thanks Frans! — yields this:

co fstops vs indicated

That should be pretty self-explanatory, but just in case it isn’t, the first column is the indicated f-stop. The next two are my (necessarily imprecise, considering my method) entrance pupil measurements in inches, the next two convert those to mm, and the last two are the calculated f-stops assuming the focal length is 60mm.
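To spell out the arithmetic behind the last two columns, here’s a minimal sketch (the measurement below is made up; the real ones are in the table above): the f-number is just the focal length divided by the entrance pupil diameter.

```python
FOCAL_LENGTH_MM = 60.0
MM_PER_INCH = 25.4

def f_number_from_pupil(pupil_diameter_in):
    """Convert a caliper measurement of the entrance pupil (inches) to an f-number."""
    pupil_mm = pupil_diameter_in * MM_PER_INCH
    return FOCAL_LENGTH_MM / pupil_mm

# Hypothetical reading: a 0.30-inch pupil works out to about f/7.9.
print(round(f_number_from_pupil(0.30), 1))
```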

Those numbers jibe with the exposure meter readings.

Kinda weird. At least it doesn’t appear that I have a bad copy of the lens.

Off-axis MTF results with the Sony 90/2.8

This is a continuation in a discussion of spatial frequency response (SFR) and modulation transfer function (MTF) testing reproducibility. The series starts here:

Towards a reproducible MTF testing protocol

This post is a continuation of the last one in this series:

Off-axis MTF testing

The thrust of that post was that moving the target rather than the camera had the potential to make practical off-axis LoCA and focus shift testing via changes in object distance.

A problem that I ran into in trying to turn theory into practice was that small alignment errors meant the razor blade’s position in the frame was not constant throughout the series. One way to deal with that is to black out the perforations in the razor blade with tape so that the region of interest (ROI) can be larger and MTF Mapper won’t get confused by edges other than the one I want it to look at. I will do that in future tests, but I didn’t for the work reported here, figuring that if I could programmatically deal with a moving razor blade with holes in it, blacking out the holes would only make things simpler.

The way I dealt with the target’s motion in the frame has a long tradition in engineering: even if you know something is nonlinear, assume it is linear and see how far you can get.

I set up the program so that it knows how many images there are in one series, a series being defined as a sequence of images in which only the target position changes. It presents the first image and the last image in the series to the user (moi), and asks him to pick the location of the ROI. Then it runs the whole series, linearly interpolating the location of the ROI between the two end points.
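The actual front end is in Matlab; here’s a minimal sketch of the interpolation idea in Python, with function and variable names of my own invention:

```python
def interpolate_rois(first_roi, last_roi, n_images):
    """Linearly interpolate an ROI (x, y, width, height) across a series.

    first_roi and last_roi are the user-picked ROIs for the first and last
    images; the ROI for image i is a straight-line blend of the two.
    """
    rois = []
    for i in range(n_images):
        t = i / (n_images - 1) if n_images > 1 else 0.0
        rois.append(tuple(round(a + t * (b - a)) for a, b in zip(first_roi, last_roi)))
    return rois

# Hypothetical 101-image series in which the ROI drifts 40 pixels to the right
# and 25 pixels down between the first and last exposures.
series_rois = interpolate_rois((3200, 900, 150, 150), (3240, 925, 150, 150), 101)
```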

Why did it take me several days to make a change like that? While I was at it, I took the opportunity to reorganize the program, to have it report its progress so that I don’t waste a lot of time waiting for an incorrect result, and to make it more tolerant of cases where MTF Mapper can’t make sense of a particular image. I consider that time well spent, since it will make future modifications to the program easier. In this context, let me express my thanks one more time to Jack Hogan, who wrote the Matlab code that I started with.

Let’s look at the longitudinal chromatic aberration (LoCA) in the upper right corner of the frame, with the Sony 90mm f/2.8 FE macro lens at f/2.8 and the nominal target position (the center of the horizontal axis of the graph) at approximately 2 meters:

 

loca sony 90 ur 28

Distance is the horizontal axis, with the left-hand side having the subject farther from the camera than the right-hand side (the subject-to-camera distance decreases by 1.9 mm after each exposure). There were 101 exposures in the series, and thus the total travel was 190mm. The vertical axis is MTF50, measured in cycles per picture height, assuming the entire sensor is used.
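If it helps in reading these charts, here’s how the axes translate into physical units, as a small sketch (my own illustration; the 24 mm picture height is the usual full-frame assumption):

```python
STEP_MM = 1.9            # change in subject distance between exposures in this series
SENSOR_HEIGHT_MM = 24.0  # full-frame picture height, assuming the whole sensor is used

def relative_position_mm(exposure_index):
    """Position along the 190 mm of travel, counted from the far end."""
    return exposure_index * STEP_MM

def line_pairs_per_mm(mtf50_cycles_per_picture_height):
    """Convert MTF50 from cycles per picture height to cycles (line pairs) per mm."""
    return mtf50_cycles_per_picture_height / SENSOR_HEIGHT_MM

# A hypothetical reading of 1600 cy/ph works out to about 67 lp/mm.
print(relative_position_mm(100), line_pairs_per_mm(1600))
```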

Now the other whole stops through f/11:

loca sony 90 ur 4

loca sony 90 ur 56

loca sony 90 ur 8

loca sony 90 ur 11

This is pretty spectacular performance, and compares favorably to the results on-axis.

Let’s look at focus shift in each raw plane:

focus shift sony 90 ur red

focus shift sony 90 ur green

focus shift sony 90 ur Blue

Especially in the green plane, there is enough focus shift that you’ll want to focus at taking aperture. There is slightly more focus shift than in the on-axis graphs, indicating that field curvature changes very slightly with aperture setting. I don’t measure field curvature directly in these tests; in fact, I do all I can to calibrate it out.

For completeness, here are the focus shift graphs for a white balanced image:

focus shift sony 90 ur WB

Compare those to the corresponding on-axis graph:

sony 90 2 meters WB focus shift

The bottom curve has 51 samples per aperture, and the top one 101 samples. The bottom one also has f/16 and f/22 data.

There’s not much loss in sharpness at the corner.  Good job, Sony!

IR hills with a 400

I’ve struggled all day trying to get the code that handles the moving-target SFR calculations to work, with confusingly erratic results. So, instead of posting off-axis MTF50 results, I’ll show you a few images that I made last week with a Nikon 400/2.8 in front of my IR-modded a7RII.

 

DSC07524-Edit

DSC07067-Edit

DSC06402-Edit

DSC07037-Edit

DSC07031-Edit

DSC07483-Edit

DSC07082-Edit-2

DSC06404-Edit

DSC07086-Edit

DSC07518-Edit

Off-axis MTF testing

This is a continuation in a discussion of spatial frequency response (SFR) and modulation transfer function (MTF) testing reproducibility. The series starts here:

Towards a reproducible MTF testing protocol

A reader suggested that I reconfigure my testing protocol to move the razor blade target rather than the camera. That approach has several advantages:

  • The target is in general lighter than the camera, so there’s less wear on the rail.
  • I could accommodate heavy camera/lens combinations which exceed the capacity of the rail.
  • For magnifications of less than 1:1, the effect of rail vibrations on the test capture is less if the target moves than if the camera moves. This means I can program faster settling times.

There is one disadvantage. I now need cables to connect the rail controller to gear mounted on two tripods (one cable to trip the shutter, and one to drive the rail). If the target is a couple of meters from the camera, then the controller should go on the tripod with the camera. If the camera and the target are much farther apart, then the controller should go on the tripod with the target to keep the length of the cable that drives the rail down.

However, all the little stuff above is trumped by one huge advantage: moving the target gives me the geometry I need to do off-axis testing.

On-axis geometry is undemanding. Aim the camera at the target, center it, and it’s on the lens axis. If you move the camera towards the target, the target stays on-axis, and everything works fine. If you move the target towards the camera along the lens axis line, that works too.

But what if you want to measure corner SFR? You line up the camera so that the razor blade is in the corner, and focus to get rid of field curvature effects. But if the camera is on the rail, how do you line the rail up to move the camera along the line traced by the path from the razor blade through the lens to the place where the blade is imaged on the sensor? It makes my head hurt just to think about it.

If you move the target in a direction orthogonal to the plane of the razor blade, you’re golden, assuming the blade was square to the camera when everything was on-axis.

So now it should be easy to check the MTF anywhere in the frame.

At least, it seems like it should be easy. With these things, as with most things in experimental science or engineering, you never know for sure until you try. So I set up a test. I put the Sony 90mm f/2.8 FE macro on a Sony a7RII 2 meters from the target, set the focus magnification point as far as it would go into the upper right of the image, adjusted the aiming of the camera so that the razor blade was centered there, set the rail for 190mm of travel and 101 exposures, and made a series at whole stops from f/2.8 through f/11.
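For readers trying to picture the mechanics, the capture sequence amounts to something like the sketch below. The rail and camera objects are stand-ins of my own invention, not a real interface to the Cognisys controller; only the numbers come from this test.

```python
TRAVEL_MM = 190.0
N_EXPOSURES = 101
STEP_MM = TRAVEL_MM / (N_EXPOSURES - 1)   # 1.9 mm of target travel per exposure
APERTURES = [2.8, 4, 5.6, 8, 11]

class StubRail:
    """Stand-in for the rail controller; the real rail is driven by Cognisys hardware."""
    def move_to_start(self): print("rail: move target to the far end of travel")
    def move_by_mm(self, mm): print(f"rail: advance target {mm:.1f} mm toward the camera")
    def settle(self, seconds=2.0): print(f"rail: wait {seconds} s for vibration to die down")

class StubCamera:
    """Stand-in for the tethered camera."""
    def set_aperture(self, f): print(f"camera: set f/{f}")
    def trip_shutter(self): print("camera: expose")

def run_series(rail, camera):
    for f_stop in APERTURES:
        camera.set_aperture(f_stop)
        rail.move_to_start()
        for i in range(N_EXPOSURES):
            rail.settle()
            camera.trip_shutter()
            if i < N_EXPOSURES - 1:
                rail.move_by_mm(STEP_MM)

run_series(StubRail(), StubCamera())
```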

I ran the files through my MTF cruncher, and that’s when I ran into trouble. The razor blade wasn’t in the same location at both the near and far ends of the rail. I suspect two things.

First, after I line up the camera on-axis, I should really rotate the camera/lens assembly around the lens node just as if I were doing stitching. I didn’t think of that.

Second, it’s a lot harder to line the rail up precisely facing the camera than it is to line the rail up facing the target. The camera has a viewfinder, which is a precise and accurate aiming tool; the rail itself does not. It would probably be a good idea to attach a laser pointer to the rail for aiming.

Because of both these things, I think I’m going to have to revise my procedures so that I’ll get results even with small movements of the target in the frame of the captured images.

There are two pieces to this. First, block off the internal perforations in the razor blade with gaffer tape so I’ll have more freedom to place the region of interest without creating what MTF Mapper will identify as edges. Second, rewrite the MTF Mapper front end program to track changes in the target position. That’s going to take a while, but will yield other benefits, since I’ll make the code more modular while I make the changes.


MTF testing & lighting

This is a continuation in a discussion of spatial frequency response (SFR) and modulation transfer function (MTF) testing reproducibility. The series starts here:

Towards a reproducible MTF testing protocol

The basic lighting technique that I’ve been using for my razor blade MTF testing has been to backlight a piece of white paper with two LED panels at a 45-degree angle to the paper. I put the razor blade a couple of feet in front of the white paper. There are two reasons for that. I want the paper well out of focus, and I don’t want any stray light falling on the front of the razor blade, which is supposed to be silhouetted against the paper.

From the Sony a7RII’s perspective, the setup looks like this:

_DSC9136full

 

I use APS-C crop mode to save disk space and processing time. If you don’t, the razor blade will appear smaller in your pictures.

How sensitive are the MTF50 values derived from the above setup to lighting variations? That’s mostly what this post is about.

The Westcott LED panels that I use have two sets of LEDs: one with a color temperature of 2800K, and one with a color temperature of 6000K. On the controls there’s a dial that lets you set the temperature of the combined light. It operates by mixing varying amounts of the two kinds of LEDs. Intensity is not corrected for.

I set the panels to 2800K, 5000K, and 6000K, with the lighting level at 100%, and made a series of 51 razor blade exposures at each color temperature with 2mm of travel between each exposure, using the Sony 90mm f/2.8 FE macro lens set to f/4. Then I turned off one of the LED panels and made another set of exposures at 5000K.

Here are the MTF50s of the three raw color planes:

MTF lighting test red ch

MTF lighting test green ch

MTF lighting test blue ch

Distance is the horizontal axis, with the left-hand side having the subject farther from the camera than the right-hand side (the camera moves closer to the subject by 2 mm after each exposure). The vertical axis is MTF50, measured in cycles per picture height, assuming the entire sensor is used.

Here’s the white balanced MTF50 data:

MTF lighting test WB ch

And the difference between the four test runs and the average of the four test runs at each distance:

MTF lighting test WB error

The above data expressed as a percentage error:

MTF lighting test WB pct error
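The difference and percentage-error curves are just each run compared with the per-distance average of the four runs; here’s a minimal sketch of that arithmetic (my own illustration, not the actual analysis code):

```python
import numpy as np

def error_vs_average(runs):
    """runs: 2-D array, one row per test run, one column per rail position (MTF50 values).

    Returns (difference from the per-position average, percentage error).
    """
    runs = np.asarray(runs, dtype=float)
    average = runs.mean(axis=0)            # average of the runs at each distance
    difference = runs - average
    percent_error = 100.0 * difference / average
    return difference, percent_error

# Hypothetical toy data: two runs, three rail positions each.
diff, pct = error_vs_average([[1500, 1600, 1550],
                              [1460, 1640, 1530]])
```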

The errors appear to consist of a noise component that is unrelated to position, and which probably can be smoothed out, although at this point in the development of a protocol, I resist smoothing, as I think it can obscure underlying effects. For the two 5000K curves, there appears to be a systematic error that is position dependent (or maybe time-dependent), although it goes in opposite directions in the two curves.

At this point, I consider plus or minus 4% to be not bad accuracy, and I don’t think that lighting color temperature or geometry is particularly critical for this protocol.

I’ve been asked if flash illumination would be useful for this kind of testing. I see no reason why it wouldn’t work, but I have a few caveats.

If you use continuous illumination, you can, with proper exposure compensation, let the camera’s automatic exposure system control the shutter while you do the usual constant-aperture tests. This is a convenience, but I think the main advantage is the elimination of a class of experimental error. I suppose that, with TTL flash, that advantage doesn’t have to go away, but my studio strobes don’t support TTL, and I’ve never tried that.

You may find the situation different with small flashes, but I’ve found that I can’t turn my strobes down far enough to get wide open exposures with fast lenses, and I end up having to put diffusers in front of them. You might think that you can use bounce flash to throw away some photons, but I think it would be tricky to do that and not end up with stray light on the front of the razor blade.

Using flash also means that you can’t do runs that test the camera’s resistance to vibration-induced blurring.

If you do decide to use flash, I would recommend a flash with a short duration to mitigate any vibration that might occur. If your camera doesn’t offer EFCS, then flash in a dark room with a shutter speed of 1/2 second or slower and trailing curtain synch may be the only way you can deal with shutter vibrations.

 

MTF testing & vibration

This is a continuation in a discussion of spatial frequency response (SFR) and modulation transfer function (MTF) testing reproducibility. The series starts here:

Towards a reproducible MTF testing protocol

 

A reader brought up the subject of vibration and its effect on SFR testing.

I suppose we should deal with that sooner rather than later. However, I warn you that I have very little to say about vibration and sharpness testing that constitutes good news.

Let me tell you about my testing setup from the perspective of vibration.

I’ll start with external (out of the camera) vibration. I do my work in my basement, which has a six-inch concrete slab covered with unpadded vinyl tiles. My house is a mile from the nearest public road. I use heavy RRS carbon fiber legs on the tripod to which the camera is mounted, and I use the rubber indoor hemispherical pads on the feet. In theory, spikes would be better, but to preserve domestic tranquility, I wish to avoid scratching the floor.

I also mount the razor blade carrier to a set of (smaller) RRS legs, which rest on the same vinyl tiles with similar pads.

Because of where I live, in the absence of earthquakes, there are no material vibrations from things not on my property. I have found this setup to be resistant to local vibrations such as footfalls. I have not tested what would happen if someone were using a flail mower right outside, but I expect that that would cause problems.

When I use charts mounted to a wall in the basement, footfalls can be a problem with sharp lenses measured on-axis. I expect that they vibrate the door jamb and shake the door to which the targets are mounted. Thus, when I work with charts, I need to be concerned with who else is in the house and what they’re doing.

In an environment where there are low-frequency external vibrations, mounting both the target and the camera to tripods supported by the same floor structure can help get consistent results, since low frequencies will tend to lift and lower both the camera and the target synchronously. That only works for translation, though. If the floor twists as it rises and falls, all bets are off.

What about camera-induced vibrations?

Before electronic first-curtain shutters (EFCSs), I never could achieve the results that I currently get with focal plane shutter cameras. If your camera has neither a leaf shutter nor EFCS, I think it would be a huge uphill battle to get reproducible SFR results, and I’d advise you against trying.

With EFCS, camera-induced vibration is rarely a problem. The trailing curtain launch acceleration does cause some measurable SFR degradation at shutter speeds around 1/125 to 1/30 second, so in a perfect world I’d adjust the lighting levels or the camera’s sensitivity to avoid those speeds when testing the best apertures of the best lenses.

Unless specifically testing for shutter-induced vibration, I’d avoid portrait orientation, since tripods are less stiff horizontally than vertically.

If you live in an urban environment, or one near train tracks or highways, you need to be concerned with vibrations external to the building in which you do the testing. If you plan to test in a large building with heavy machinery (including HVAC equipment) and people whose movements you cannot at least briefly control, you should think about the effects of that vibration on your testing.

I wish I could provide some quantitative guidance. I had a couple of sites that had some good material on the subject of environmental vibration, but my old links are broken now. I’ll do some more looking.


Towards a reproducible MTF testing protocol

There has been a little interest in developing a protocol for slanted edge modulation transfer function (MTF) testing that is sufficiently standardized and reproducible that the results from various amateur practitioners can be compared fairly.

There are several things that conspire to make slanted edge MTF testing irreproducible:

  • Alignment errors
  • Field curvature
  • Target differences
  • Lighting differences
  • Focus errors

In this post, I’ll propose ways to deal with each of these things, based upon the backlit razor blade methods that I’ve developed for testing macro lenses on-axis. In all cases, I’ll trade off time and the number of exposures for repeatability and reproducibility. The idea is that this protocol might be useful for a single person testing her own lenses, not an operation like Lens Rentals, where time is money.

I’ll assume you have an MTF testing program like Imatest or MTF Mapper, and you know how to use it. If not, feel free to read on, but when you’re finished with this post, you won’t have everything you need to go and do the measuring.

Alignment errors can be dealt with by having the slanted edge subtend only as much of the field of view as is needed for accurate MTF calculations. This means only a few edges per exposure. For now, let’s think in terms of one or two.

Field curvature can be handled, if the edges are compact, by focusing on the area in the field containing the edges, no matter where in the frame that happens to be.

Target differences can — I hope — be handled by using a readily available double-edged razor blade as the target. I have no way to print a target that is anywhere near as sharp as a backlit double-edged stainless steel razor blade. When I print targets, the numbers that I get are affected by the file contrast, the print size, the paper finish, the printer, and the camera distance. The razor blade is so sharp that camera distance is a non-issue, and none of the other variables apply.

I have not used the Imatest photographic transparencies with their light box, one because they are so small and two because they are expensive and, I believe, beyond what most of my readership is willing to spend to test lenses.

Lighting differences will be taken care of with readily-reproducible illumination that is non-critical.

Focus errors will be managed by using multiple exposures and a focusing rail. Ugh, right? Yeah, I know, that’s a PITA, but I know of no other way to find the point of peak focus reliably. I’ve given up manually focusing when accuracy is primary. I use a motorized rail for focus bracketing. I can’t get repeatable results any other way. Making lemonade out of lemons, we’ll use the rail to get longitudinal chromatic aberration (LoCA) and focus shift data at the same time.

The test revolves around a razor blade.

The blade is stuck into a block of wood, which is mounted to a tripod with a gear head that I can use for alignment. Any way to get the blade perpendicular to the lens axis will work, and the alignment is not critical. Behind the razor blade is a white piece of paper illuminated by two light sources. I use Westcott 5000K LED panels. When you turn the room lights off and the panels on, here’s what you see with a Sony a7RII, a 90mm lens and the target at 2 meters:

_DSC9136full

The above image was made with the camera set to APS-C mode, which crops the full frame image. I use this mode when doing on-axis testing to save disk space and processing time. For on-axis testing, I use the framing lines in the Sony camera to put the center of the edge right smack dab on-axis, like you see here.

With the Sony a7RII, the combination of 90mm and 2 meters gives a usable target extent of about 250 pixels. You want the extent small, so that alignment is non-critical. You want it big, so that the slanted edge testing is more accurate. This seems like a good compromise. For other focal lengths, you can use the formula:

distance = 20 * focal length

where distance and focal length have the same units, say, millimeters. Yes, yes, I know; the constant should be 22.2, but I rounded. If you’re testing an APS-C 24 MP camera, you can get a little further away. If you’re testing a 24 MP full frame camera, you need to get a little closer.

You’ll note that, with a 200mm lens, you’ll want to be about 4 meters away. If that strains the size of the room you’re in, it will be fine to move closer, since the razor blade is so sharp.

With a short lens like a 20mm, you’ll want to be about 40 cm away. You won’t be able to back up from that a lot and still have enough pixels covering the blade.
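Written out as a sketch, the scaling I have in mind looks like this (the reference numbers are the 90mm, 2-meter, a7RII setup above, and the pixel pitch values are approximate):

```python
REF_FOCAL_MM = 90.0       # reference setup: 90mm lens...
REF_DISTANCE_MM = 2000.0  # ...at 2 meters...
REF_PITCH_UM = 4.5        # ...on the a7RII (pixel pitch is approximate)

def suggested_distance_mm(focal_length_mm, pixel_pitch_um=REF_PITCH_UM):
    """Distance that keeps the blade covering about as many pixels as in the reference setup.

    Finer pixels let you back up a little; coarser pixels force you a little closer.
    """
    return REF_DISTANCE_MM * (focal_length_mm / REF_FOCAL_MM) * (REF_PITCH_UM / pixel_pitch_um)

for f in (20, 90, 200):
    print(f, "mm lens:", round(suggested_distance_mm(f)), "mm")
# 20mm -> about 440 mm, 90mm -> 2000 mm, 200mm -> about 4400 mm
```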

The blade angle should be somewhere between 3 and 14 degrees. Actually, if you avoid certain critical angles, it can be almost anything.

Here’s the razor blade front lit:

_DSC6707

I don’t think it makes much difference what brand you buy, but, in case it does, you now know the brand I use.

Here’s a close up of the blade when it’s back lit:

_DSC9136

 

I use a Cognisys computer-controlled focusing rail with 200 mm of travel. In order to make sure that I don’t run into the stops, I only use 190mm of that. I focus in the middle, back the rail to the far-from-target end, and make a series of images spanning 190mm.

How many images?

For the macro work I made 200 exposures. That took about 10 minutes. For the work I’ll show you today, I made 50 images at each aperture. That turned out to be about as few as I think I could get away with, although, if you aren’t aiming for the whole LoCA/focus shift/peak MTF50 package, you might be able to get away with fewer.

Could you use a manual focusing rail? I think you could, but it would get to be a pain after a while, unless you just wanted peak MTF for each f-stop, in which case you could focus at that f-stop and focus bracket four or five shots on either side of that.

I used the Sony 90mm f/2.8 macro as my test lens.

I developed the files in DCRAW, and analyzed them for MTF50 in each raw channel using MTF Mapper. In MTF Mapper, I used a 150×150 pixel region of interest (ROI), even though I could have used a bigger one, to make sure the razor blade was large enough within the ROI for useful data. Here are the LoCA results for each whole f-stop:

sony 90 2 meters 28 loca

Distance is the horizontal axis, with the left-hand side having the subject farther from the camera than the right-hand side (the camera moves closer to the subject by 3.8 mm after each exposure). The vertical axis is MTF50, measured in cycles per picture height, assuming the entire sensor is used.

It is pretty obvious that the green and blue raw channels focus in the same place, and the red channel focuses farther away.

 

sony 90 2 meters 4 loca

sony 90 2 meters 56 loca

sony 90 2 meters 8 loca

sony 90 2 meters 11 loca

sony 90 2 meters 16 loca

sony 90 2 meters 22 loca

As the aperture is closed down, the depth of field swamps out the LoCA.
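Everything in these charts is computed per raw channel. In case it helps to see what that means mechanically, here’s a sketch of splitting an undemosaiced raw file into its Bayer planes; this is my own illustration, assuming an RGGB layout and a linear 16-bit TIFF such as dcraw’s -D -4 -T output, not what dcraw or MTF Mapper actually does internally.

```python
import numpy as np
import tifffile  # any TIFF reader that returns the raw CFA as an array will do

def raw_planes(path):
    """Split an undemosaiced raw image into its four Bayer planes.

    Returns quarter-resolution planes keyed 'R', 'G1', 'G2', 'B'. The RGGB
    layout is an assumption; check your camera's CFA pattern before trusting it.
    """
    cfa = tifffile.imread(path).astype(np.float64)
    offsets = {"R": (0, 0), "G1": (0, 1), "G2": (1, 0), "B": (1, 1)}  # RGGB
    return {name: cfa[dy::2, dx::2] for name, (dy, dx) in offsets.items()}

# planes = raw_planes("DSC00001.tiff")
# Each plane would then be cropped to the ROI and fed to the slanted-edge
# analysis one channel at a time; the two green planes can be averaged.
```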

Plotting all the apertures, with each raw channel on its own chart, gives us focus shift information:

sony 90 2 meters green red shift

You can see that I probably missed the peak of the f/5.6 curve, indicating that 50 exposures is marginal.

sony 90 2 meters green focus shift

sony 90 2 meters green blue shift

In this one, I probably missed the peak of the f/4 curve.

A white balanced set of curves uses all the raw channels:

sony 90 2 meters WB focus shift

Because there’s more data, the curves look smoother, but the f/4 one is still suspiciously pointy on top.
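If the pointy tops bother you, one way to estimate whether the true peak fell between rail positions, without adding exposures, is a quadratic fit through the few samples around the sampled maximum. That’s just a suggestion on my part, not something in the current protocol:

```python
import numpy as np

def refine_peak(positions_mm, mtf50, half_window=3):
    """Fit a parabola around the sampled maximum; return (peak position, peak MTF50)."""
    positions_mm = np.asarray(positions_mm, dtype=float)
    mtf50 = np.asarray(mtf50, dtype=float)
    i = int(np.argmax(mtf50))
    lo, hi = max(0, i - half_window), min(len(mtf50), i + half_window + 1)
    a, b, c = np.polyfit(positions_mm[lo:hi], mtf50[lo:hi], 2)
    x_peak = -b / (2.0 * a)                    # vertex of the fitted parabola
    return x_peak, a * x_peak**2 + b * x_peak + c

# Toy example: 51 positions at 3.8 mm spacing, with a peak falling between samples.
x = np.arange(51) * 3.8
y = 1500.0 - 0.05 * (x - 96.1) ** 2
print(refine_peak(x, y))   # ~ (96.1, 1500.0)
```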

Where shall I go with this? I’m open for suggestions.


Aliasing at 42 MP with a sharp lens

After seeing the aliasing and false color with the Otus 85 wide open on the Sony a7RII in yesterday’s post, it’s reasonable to ask how much the lens aberrations at f/1.4 are reducing aliasing and false color, even if it’s obvious that they are not eliminating them.

To answer that, we can look at a series of similar images made at f/2.8, which is the Otus 85/1.4’s sharpest on-axis aperture.

The images are slightly darker. The reason for that is the way the a7RII meters, combined with the strong light falloff that the Otus has wide open. The camera sees the darker edges and corners at f/1.4 and opens up to compensate.

At one extreme of defocusing:

_DSC9043

At the other end of the rail:

_DSC8993

At a few places in the middle:

_DSC9012

_DSC9019

_DSC9023

_DSC9015

_DSC9014

_DSC9016

It is interesting that the places on the rail where there’s the most false color are not necessarily the places where there’s the most monochromatic aliasing.

My conclusion is that the aberrations of the lens wide open are making little difference in the visible aliasing.

 

Comment subscriptions added

I’ve had a request for an additional function on this blog: the ability to be notified via email when someone replies to your comment.

I think I’ve added that function.

When you comment now, you should see something like this at the bottom of the commenting page:

comment subscriptions

You’ll have several choices to pick from in the drop-down list before hitting the big green Post Comment button:

  • Replies to my comments
  • Don’t subscribe
  • All

The first is the default, and you will receive email only when someone replies to a comment that you made on this blog.

The second allows you to not be notified about replies to the comment that you are making. You will continue to be notified about replies to other comments to which you have subscribed.

The third allows notifications upon the occurrence of any comment to the post to which you are commenting.

As you can see, there’s also a link you can follow to subscribe to comments to the thread without adding one of your own.

Please let me know how this is working.

 
