For the past couple of weeks, I’ve been working on a simulator to assist in visualizing the effects of lens aberrations. To keep computation time and programming effort manageable, I’ve had to make a lot of compromises, but I hope the tool will still be useful. I’ll be presenting results soon, but first I’d like to lay out the quite lengthy list of caveats.
Optical Model Limitations
- No phase modeling
Aberrations are applied to intensity PSFs only. No phase information is preserved, so interference effects (e.g., from overlapping Airy rings) are lost.
- Scalar PSF model
No polarization, vector diffraction, or wavelength-dependent transmission effects are modeled.
- Monochromatic PSFs per color plane
Only three broad channels (RGB) are used, with no integration over continuous spectra or fine spectral sampling. This reduces fidelity in chromatic aberration and diffraction modeling (see the sketch of per-channel intensity blurring after this list).
- PSF is precomputed per normalized radius and angle
This assumes symmetry about the optical axis, which may not hold for decentered or tilted elements.
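To make the first and third caveats concrete, here is a minimal sketch of intensity-only, per-channel blurring. Everything in it is illustrative: the Gaussian placeholder stands in for the simulator’s actual aberrated PSFs, and the function names are mine, not from the simulator.

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(size=21, sigma=2.0):
    """Placeholder intensity PSF. The real simulator derives its PSFs
    from aberration models; a Gaussian just keeps the sketch
    self-contained."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return psf / psf.sum()  # normalize so total intensity is preserved

def blur_rgb(image, psfs):
    """Convolve each color plane with its own intensity PSF. Working
    on intensities discards phase, so interference between overlapping
    diffraction patterns cannot be represented."""
    out = np.empty_like(image)
    for c in range(3):
        out[..., c] = fftconvolve(image[..., c], psfs[c], mode="same")
    return out

# Three broad RGB bands stand in for a continuous spectrum; slightly
# different sigmas mimic wavelength-dependent blur.
img = np.random.rand(256, 256, 3)
blurred = blur_rgb(img, [gaussian_psf(sigma=s) for s in (1.8, 2.0, 2.2)])
```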
Sampling & Spatial Resolution Limitations
- Pixel-based PSF convolution
The image is blurred at the resolution of the pixel grid, which limits simulation fidelity for high-frequency detail and introduces spatial quantization.
- No sub-pixel positioning or resampling
All computations assume tile-aligned, pixel-centered PSFs; motion blur, field tilt, and image-shift artifacts at sub-pixel scale are not captured.
- No sensor sampling model (e.g., Bayer CFA)
I’m applying PSFs to RGB images, which assumes a perfect full-color sensor. This omits color aliasing, demosaicing artifacts, and sensor-level spatial interactions (the sketch after this list shows the kind of Bayer sampling step being skipped).
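For contrast, this is roughly the sensor-sampling step the simulator skips. The RGGB layout is just one common arrangement; nothing here is from the actual code.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an RGB image through an RGGB Bayer pattern, keeping one
    color per photosite. Skipping this step (and the demosaic that
    would follow) is why color aliasing and demosaicing artifacts
    never appear in the simulation."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic
```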
Computation & Performance Tradeoffs
- Tiled approximation of radial PSF
For speed, I’m discretizing space into blocks. Artifacts can arise at tile boundaries, and smooth spatial transitions are approximated (see the sketch of the tiling scheme after this list).
- Truncated PSFs
The PSF kernels are finite (e.g., 21×21) and implicitly windowed, which ignores long-range blur tails.
- No frequency-domain simulation
All convolution is done spatially. More accurate diffraction simulation might use Fourier optics, especially for large or complex apertures.
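Here is one way the tiled approximation could look; a sketch under assumptions, not the simulator’s actual code. In particular, psf_for_radius is a hypothetical callback that returns the precomputed PSF for a normalized field radius.

```python
import numpy as np
from scipy.signal import fftconvolve

def blur_tiled(channel, psf_for_radius, tile=64):
    """Approximate a spatially varying, radially symmetric blur by
    convolving each tile with the single PSF chosen for the normalized
    field radius at the tile's center. Cheap, but the blur changes
    stepwise between tiles; a real implementation would overlap tiles
    or blend neighboring PSFs to hide seams."""
    h, w = channel.shape
    cy, cx = h / 2.0, w / 2.0
    r_max = np.hypot(cy, cx)  # distance from center to a corner
    out = np.zeros_like(channel)
    for y0 in range(0, h, tile):
        for x0 in range(0, w, tile):
            y1, x1 = min(y0 + tile, h), min(x0 + tile, w)
            # Normalized radius of the tile center from the optical axis
            r = np.hypot((y0 + y1) / 2.0 - cy, (x0 + x1) / 2.0 - cx) / r_max
            psf = psf_for_radius(r)
            out[y0:y1, x0:x1] = fftconvolve(channel[y0:y1, x0:x1],
                                            psf, mode="same")
    return out
```

Because each tile is convolved in isolation, blur from one tile never spills into its neighbor, which is exactly where the boundary artifacts mentioned above come from.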
Sensor & System-Level Omissions
- No vignetting or transmission rolloff
All tiles are equally weighted, ignoring geometric or optical falloff toward the corners of the field (the sketch after this list shows the kind of cos⁴ falloff map being omitted).
- No sensor noise, quantization, or nonlinearity
Simulation is clean and idealized: no photon shot noise, ADC quantization, gain non-uniformity, or black-level variation is applied.
- No motion blur, rolling shutter, or temporal effects
The model assumes a single, globally exposed frame with no temporal variation.
- No lens distortion (barrel/pincushion/mustache)
Field mapping is based on angular PSFs but does not include geometric distortion from projection geometry.
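As an example of what the vignetting omission means, here is an illustrative cos⁴ falloff map. The linear-in-radius field-angle mapping and the half-field angle are my simplifying assumptions, not values from the simulator.

```python
import numpy as np

def cos4_falloff(h, w, half_fov_deg=20.0):
    """Illustrative cos^4 illumination falloff. Multiplying an image
    by this map (which the simulator deliberately does not do) would
    darken the corners. half_fov_deg is a made-up corner field angle,
    and field angle is assumed linear in image radius."""
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0) / np.hypot(h / 2.0, w / 2.0)
    theta = np.deg2rad(half_fov_deg) * r
    return np.cos(theta) ** 4
```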
Stepan Kana says
If you can simulate aberrations, can you also unsimulate them – i.e. remove them? Canon does that with their DLO – digital lens optimisation. But sometimes they overshoot. For example, with the 70-300 mm DO lens (the only DO zoom in existence, I think), the CAs (both LaCA and LoCA) are minimal, but there’s something that looks like spherical aberration esp. at close distances (the lens only focuses down to 1.4m). When you use the digital correction in Canon’s software, it seems too eager and amplifies the noise esp. at high ISO.
JimK says
It depends. LaCA is not too hard to correct. LoCA is much more difficult. Diffraction is partially correctable by deconvolution, at the cost of introducing some artifacts and increasing noise. Distortion is correctable with a small resolution penalty. Vignetting is correctable at a cost in the signal-to-noise ratio.
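For readers who want to experiment, a minimal sketch of the kind of deconvolution mentioned above, using scikit-image’s Richardson-Lucy routine. The blur kernel and noise level are arbitrary stand-ins, not a real diffraction PSF; pushing the iteration count higher recovers more detail but amplifies noise, which is the overshoot behavior Stepan describes.

```python
import numpy as np
from scipy.signal import fftconvolve
from skimage import restoration

rng = np.random.default_rng(0)
scene = rng.random((128, 128))

# Stand-in blur kernel; a real diffraction PSF would be an Airy pattern.
psf = np.ones((5, 5)) / 25.0
blurred = fftconvolve(scene, psf, mode="same")
noisy = np.clip(blurred + rng.normal(0, 0.01, blurred.shape), 0, 1)

# More iterations sharpen more, at the cost of amplified noise.
restored = restoration.richardson_lucy(noisy, psf, num_iter=30)
```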