
Comparing downsampling algorithms, Fuji still life

September 28, 2014 JimK 4 Comments

This is the fourth in a series of posts showing images that have been downsampled using several different algorithms:

  • Photoshop’s bilinear interpolation
  • Ps bicubic sharper
  • Lightroom export with no sharpening
  • Lr export with Low, Standard, and High sharpening for glossy paper
  • A complicated filter based on Elliptical Weighted Averaging (EWA), performed at two gammas and blended at two sharpening levels

The last algorithm is what I consider to be the state of the art in downsampling, although it is a work in progress. It’s implemented using a script that Bart van der Wolf wrote for ImageMagick, an image-manipulation program whose resampling code was written by Nicholas Robidoux and his associates.
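Stock ImageMagick can approximate the EWA core on its own: the -distort Resize operator takes the cylindrical (elliptical) resampling path, as opposed to the orthogonal filtering of the ordinary -resize. Here is a minimal sketch of an EWA downsize performed in linear light; the file names and the 15% factor are illustrative, and Bart’s actual script does considerably more (resampling at two gammas and blending sharpening levels):

    # EWA (cylindrical) resampling in linear light.
    # -colorspace RGB converts the sRGB input to linear light;
    # -distort Resize invokes EWA filtering, unlike the orthogonal -resize.
    convert target.tif -colorspace RGB -filter LanczosSharp \
        -distort Resize 15% -colorspace sRGB ewa_linear.tif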

This post uses a Fuji demonstration image. It is the first target image in the series that is actually photographic, in that it was captured by a real camera rather than a simulated one.

Here’s the whole target:

[Image: FR4blog, the full still-life target]

Now I’ll show you a series of images downsampled to 15% of the original linear dimensions with each of the algorithms under test, blown up again by a factor of 4 using nearest neighbor, with my comments under each image.
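If you want to reproduce that display pipeline, it looks something like this in ImageMagick. This is a sketch: the file name is illustrative, and plain -resize stands in for whichever resampler is under test.

    # Downsample to 15% of the linear dimensions, then enlarge 4x with
    # nearest neighbor (-filter point) so the pixel structure stays visible.
    convert target.tif -resize 15% -filter point -resize 400% display.tif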

Bilinear interpolation

Bilinear interpolation, as implemented in Photoshop, is a first-do-no-harm downsizing method. It’s not the sharpest algorithm around, but it hardly ever bites you with in-your-face artifacts. That’s what we see here.

Bicubic Sharper

Photoshop’s implementation of bicubic sharper, on the other hand, is a risky proposition. Look at the halos around the flower stems, the flowers themselves, the clock, and just about everywhere.

Lightroom Export, No Sharpening

With the sharpening turned off, Lightroom’s export downsizing is, as usual, a credible performer. It’s a hair sharper than bilinear — though in this image the two are very close — and shows no halos, or any other artifacts that I can see.

I’ll skip over the various Lightroom sharpening options, and just include the images at the end. We’ve seen before that these don’t provide better performance than no sharpening when examined at the pixel-peeping level, although they might when printed.

EWA, deblur = 50

For this crop, EWA looks a lot like Lightroom’s export processing, but with some lightening of the first third of the tone curve in high-spatial-frequency areas. Look at the clock near the white flower, and the green stems near the yellow flower at the upper left corner.

EWA, deblur = 100

With the deblur dialed up to 100, the image crisps up nicely. The downside is mild haloing around the clock and the stems.
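As noted above, Bart’s script blends renderings made at different sharpening levels. Purely as an illustration, and not with his script’s actual weights, two deblur renderings could be averaged in ImageMagick like so:

    # Illustrative 50/50 blend of the deblur=50 and deblur=100 renderings;
    # -evaluate-sequence mean averages the images in the sequence.
    convert ewa_deblur50.tif ewa_deblur100.tif \
        -evaluate-sequence mean ewa_blend.tif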

Lightroom Export, Low Sharpening
Lightroom Export, Standard Sharpening
Lightroom Export, High Sharpening

In general, the differences with this scene are less striking than with the artificial targets used in previous posts.


Comments

  1. Edward says

    September 28, 2014 at 5:26 pm

    Approaches that involve scaling then sharpening, based on a reduced set of information, seem fundamentally flawed to me (unless you are deliberately aiming to introduce halos to increase apparent sharpness). The sharpening is an attempt to recreate some of the detail you’ve just thrown away – why not just do a better job of selecting what information you keep and what you discard during the scaling process? ImageMagick gives you crude control over that selectivity with the filter:blur parameter.

    I’ve found even a simple Gaussian downsample in ImageMagick comes pretty close to the performance of your EWA examples, with the right selection of blur factor. As far as I can tell ImageMagick’s Gaussian downsample is equivalent to Gaussian blur followed by “true” bilinear (i.e. nearest four neighbours) downsampling. The degree of blurring is controlled by the “-define filter:blur=” parameter, with 1.0 as the default. It operates as a convenient slider to trade off between softness and noise/moire/blockiness. I’ve had fairly good results with a value of 0.7, but the optimal value is going to depend on the source image and your intent/preference for the output. Here’s what I used on Bruce Lindbloom’s test image:

    convert DeltaE_16bit_gamma2.2.tif -colorspace RGB -define filter:blur=0.7 -filter gaussian -resize 15% -filter point -resize 400% -colorspace sRGB foo.tiff

    • Jim says

      September 28, 2014 at 8:32 pm

      “As far as I can tell ImageMagick’s Gaussian downsample is equivalent to Gaussian blur followed by ‘true’ bilinear (i.e. nearest four neighbours) downsampling.”

      That’s what I was doing with Matlab’s imresize when I started this series. I did plot some noise reduction graphs here:

      http://blog.kasson.com/?p=7101

      but I never got around to looking at the spectra and comparing them to the other methods. I started out looking at downsizing algorithms with the idea of gaining some insight into the noise part of photographic equivalence theory. I’ve gone far enough down that road to be convinced that it’s valid. If you want to participate in the work of perfecting downsampling, I suggest you head on over to this immense thread:

      http://www.luminous-landscape.com/forum/index.php?topic=91754.0

      and dive in.

      I’ll be finishing this series up in a couple of days and moving on to some work on color space transforms. I may return to it if there’s some progress. You can be part of that if you like.

      Thanks,

      Jim

      • Edward says

        September 28, 2014 at 11:33 pm

        Thanks – it’s a fascinating thread, but for downsampling it seems like a case of severely diminishing returns. If you rate that complex EWA method a 10, then I reckon a simple Gaussian downsample with the right blur value comes in at around a 9 (with the advantage that it’s simple enough for me to completely understand, so it can never surprise me with unexpected artefacts).

      • Nicolas Robidoux says

        September 29, 2014 at 5:45 am

        Edward: TTBOMK, Gaussian with .7 deblur is quite moiré-prone given how blurry it is. Try downsampling the fly http://upload.wikimedia.org/wikipedia/commons/8/85/Calliphora_sp_Portrait.jpg.

