
the last word

Photography meets digital computer technology. Photography wins -- most of the time.


One-dimensional sharpening

August 29, 2014 · JimK

In the last couple of posts, I talked about how to smooth slit-scan photographs in the time direction. For the time being, I consider that a solved problem, at least for the succulents images.

These images require a lot of sharpening, because

  • the subject has a lot of low-contrast areas
  • there’s a lot of diffraction, because I’m using an aperture of f/45 on my 120mm f/5.6 Micro-Nikkor ED
  • even with that narrow f-stop, there are still parts of the image that are out of focus

I’ve been using Topaz Detail 3 for sharpening. It’s a very good program, allowing simultaneous sharpening at three different levels of detail, and having some “secret sauce” that all but eliminates halos and blown highlights. Like all sharpening programs that I’d used before this week, it sharpens in two dimensions.

However, I don’t want to sharpen in the time dimension, just the space one. Sharpening in the time dimension would provide no visual benefit, since I’ve already smoothed the heck out of the image in that dimension, and could possibly add noise and undo some of my smoothing.

I decided to write a Matlab program to perform a variant of unsharp masking in just the space direction.

To review what one of the succulent images looks like after stitching and time-direction smoothing, cast your eyes upon this small version of a 56000×6000 pixel image:

[Image: small version of the 56000×6000 stitched, time-smoothed image]

The horizontal direction is time; the image will be rotated 90 degrees later in the editing process. The vertical direction is space; it corresponds to a horizontal line in the scene at the moment of exposure.

Here’s the program I’m using to do sharpening in just the vertical direction, using a modification of the technique described in this patent.

First, I set up the file names, specify the coefficients to get luminance from Adobe RGB, and specify the standard deviations (aka sigmas) and weights of as many unsharp masking kernels as I’d like applied to the input images. There are four sets of sigmas and weights in this snippet:

[Code screenshot: file names, Adobe RGB luminance coefficients, sigmas, and weights]
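In Python/NumPy terms, that setup looks something like this (a sketch, not my actual Matlab; the file names are placeholders, and the luminance coefficients are the Y row of the Adobe RGB (1998) RGB-to-XYZ matrix):

```python
import numpy as np

# Placeholder file names -- illustrative only.
input_file = 'stitched.tif'
output_file = 'sharpened.tif'

# Luminance coefficients for linear Adobe RGB (1998):
# the Y row of its RGB-to-XYZ matrix, rounded to four places.
lum_coeffs = np.array([0.2973, 0.6274, 0.0753])

# One sigma/weight pair per unsharp-masking kernel; four sets, as in the post.
sigmas = [3.0, 5.0, 15.0, 35.0]   # standard deviations, in pixels
weights = [5.0, 5.0, 5.0, 2.0]    # weight applied to each high-pass result
```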

Then I read in a file and rotate the image if necessary so that the space direction is up and down:

[Code screenshot: reading the file and rotating it if necessary]
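A minimal Python/NumPy equivalent of that step (the orientation test — rotate when the long, time axis is vertical — is an assumption about how the Matlab code decides, and the image read is left as a comment since the I/O library is an assumption):

```python
import numpy as np

def orient_space_vertical(img):
    """Rotate 90 degrees if needed so the space axis runs up and down.

    For these slit-scan captures, time is the long axis; if the long
    axis is vertical, rotate so that it runs horizontally instead
    (assumption about the test used in the original Matlab code).
    """
    if img.shape[0] > img.shape[1]:   # long (time) axis is vertical
        img = np.rot90(img)           # make time run horizontally
    return img

# img = tifffile.imread(input_file)          # in practice, read the TIFF
img = np.zeros((500, 100, 3), np.uint16)     # stand-in, time axis vertical
img = orient_space_vertical(img)
```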

I convert the image from 16-bit unsigned integer representation to 64-bit floating point and remove the gamma correction, then compute a luminance image from that:

[Code screenshot: linearizing the image and computing luminance]
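In Python/NumPy, the linearization and luminance steps look something like this (Adobe RGB (1998) uses a pure power-function encoding with gamma 563/256, approximately 2.2):

```python
import numpy as np

GAMMA = 563.0 / 256.0   # Adobe RGB (1998) encoding gamma, ~2.1992

def to_linear(img_u16):
    """uint16 -> float64 in [0, 1], with the Adobe RGB gamma removed."""
    return (img_u16.astype(np.float64) / 65535.0) ** GAMMA

def luminance(linear_rgb, coeffs):
    """Weighted sum of the linear R, G, and B planes."""
    return linear_rgb @ coeffs

lum_coeffs = np.array([0.2973, 0.6274, 0.0753])
linear = to_linear(np.full((4, 4, 3), 65535, np.uint16))  # all-white test image
lum = luminance(linear, lum_coeffs)                       # white -> luminance 1.0
```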

I create a variable, accumulatedHp, to store the sum of all the (in this case, four) high-pass filter operations. For each sigma/weight pair, I create a two-dimensional Gaussian convolution kernel using a built-in Matlab function called fspecial, take a one-dimensional vertical slice out of it, normalize that slice so it sums to one, perform the high-pass filtering on the luminance image, storing the result in a variable called hpLum, apply the specified weight, and add the result to accumulatedHp:

[Code screenshot: building the 1-D kernels and accumulating the high-pass results]
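Here’s a sketch of that loop in Python/NumPy. It builds the 1-D Gaussian directly rather than slicing fspecial’s 2-D kernel (equivalent after normalization), and the three-sigma truncation radius is an assumption:

```python
import numpy as np

def gaussian_1d(sigma):
    """1-D Gaussian kernel, normalized to sum to one (stand-in for taking
    a vertical slice of fspecial('gaussian') and renormalizing)."""
    radius = int(np.ceil(3 * sigma))
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def vertical_highpass(lum, sigma, weight):
    """weight * (luminance minus its vertical-only Gaussian blur)."""
    k = gaussian_1d(sigma)
    # Convolve each column (the space direction) with the 1-D kernel.
    blurred = np.apply_along_axis(
        lambda col: np.convolve(col, k, mode='same'), 0, lum)
    return weight * (lum - blurred)

sigmas, weights = [3.0, 5.0, 15.0, 35.0], [5.0, 5.0, 5.0, 2.0]
lum = np.random.default_rng(0).random((256, 8))   # stand-in luminance plane
accumulated_hp = np.zeros_like(lum)
for s, w in zip(sigmas, weights):
    accumulated_hp += vertical_highpass(lum, s, w)
```

Because the kernel is one-dimensional and applied only down the columns, the time (horizontal) direction is left untouched.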

Then I add one to all elements of the high-pass image to get a USM-sharpened luminance plane, and multiply that, pixel by pixel, by each plane of the input image to get a sharpened version:

[Code screenshot: applying the sharpening to each plane]
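The multiply step, sketched in Python/NumPy (an all-zero high-pass plane stands in for the real accumulated result, so here the output equals the input):

```python
import numpy as np

rng = np.random.default_rng(1)
linear = rng.random((32, 8, 3))        # stand-in for the linear RGB image
accumulated_hp = np.zeros((32, 8))     # stand-in for the accumulated high-pass

# (1 + high-pass) scales all three channels of each pixel in proportion,
# so the sharpening brightens or darkens without shifting hue.
sharpened = linear * (1.0 + accumulated_hp)[..., np.newaxis]
```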

Finally, I convert the sharpened image to gamma-corrected 16-bit unsigned integer representation and write it out to disk:

[Code screenshot: gamma-encoding and writing the result]
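And a Python/NumPy sketch of that last step (the TIFF write is left as a comment, since the I/O library is an assumption; clipping before encoding keeps the sharpened values legal):

```python
import numpy as np

GAMMA = 563.0 / 256.0   # Adobe RGB (1998) encoding gamma

def to_uint16(linear_rgb):
    """Clip to [0, 1], re-apply the Adobe RGB gamma, quantize to 16 bits."""
    encoded = np.clip(linear_rgb, 0.0, 1.0) ** (1.0 / GAMMA)
    return np.round(encoded * 65535.0).astype(np.uint16)

out = to_uint16(np.array([[[1.0, 0.0, 0.5]]]))
# tifffile.imwrite(output_file, out)   # writing the TIFF is an assumption
```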

How does it work?

Pretty well. Here’s a section of the original image at 100%:

[Image: 100% crop, unsharpened original]

And here it is one-dimensionally sharpened with sigmas of 3, 5, 15, and 35 pixels, and weights of 5, 5, 5, and 2:

[Image: 100% crop, 1-D sharpened; sigmas 3/5/15/35, weights 5/5/5/2]

If we up the weight of the 15-pixel high-pass operation to 9, we get this:

[Image: 100% crop, 1-D sharpened; sigmas 3/5/15/35, weights 5/5/9/2]

For comparison, here’s the result of a normal two-dimensional unsharp masking operation in Photoshop, with an amount of 300% and a radius of 15 pixels:

[Image: 100% crop, Photoshop unsharp mask; amount 300%, radius 15 pixels]

Finally, here’s what Topaz Detail 3 does, with small strength, small boost, medium strength, and medium boost all set to 0.55, large strength set to 0.3, and large boost to 0:

[Image: 100% crop, Topaz Detail 3 result]

One thing that Topaz Detail does really well is keep the highlights from blowing out and the blacks from clipping. I’m going to have to look at that next unless I decide to bail and just do light one-dimensional sharpening in Matlab and the rest in Topaz Detail.


Unless otherwise noted, all images copyright Jim Kasson.