Tuesday, October 20, 2020

Lens softness: Why do lenses become softer at different apertures?

It's an inescapable fact that no matter how well engineered the lens, certain apertures will always be sharper than others. Why is this?

There is an unabated craze right now for shallow depth of field. But lenses do not, generally, perform at their best when wide open. Why is this?

Right back to the dawn of photography, a camera has essentially been a circular lens in front of a flat image detector. It’s been that way for well over a century, but if we keep demanding more resolution, more dynamic range, lower noise floors and better colour performance from cameras, things may have to change. A 2050 camera may look vastly different to a 1950 camera, or even a 2020 camera.

That’s true even if we consider just one part of the equation. As sensors have become bigger and sharper, the demands they place on lenses in terms of sharpness and coverage have grown out of all recognition. Lenses, in turn, have become sufficiently big, heavy and expensive, in some cases, that sheer practicality starts to become an issue.

Zeiss Supreme Radiance lens on Arri ALEXA. Image: Zeiss

Lens limits

OK, there are limits to the usefulness of ever bigger and higher resolution sensors. The fundamental limits of physics are in sight in the world of sensor design, even if most designs haven’t quite reached them yet. In the world of lenses, though, there can be no such thing as perfection, and that’s been known since the days of James Clerk Maxwell and Ernst Abbe (Abbe, incidentally, may have coined the term “aperture stop”).

In short, we know that lenses don’t perform well wide open – and they also don’t perform well when stopped right down, but that’s another issue. In many practical lenses this is down to limits on the precision with which the glass elements can be made: open the aperture and the edges of the glass, where things are often least perfect, make a larger contribution. But that’s not the whole story: even if a manufacturer could make a lens with theoretically perfect elements, it would still suffer from monochromatic aberrations – that is, distortions in the image not related to colour fringing.

Distorted light

One way to think about an optical system such as a camera lens is based on work done by the famous Carl Friedrich Gauss, and called Gaussian optics, which – if you’ll excuse a brief foray into jargon – is a paraxial approximation of how an optical system works. The theory is only completely accurate for rays of light that stay close to the optical axis, at small angles to it (hence “paraxial”). It also only works correctly where all the lens elements are spherical, though that’s usually the case in the real world. As we choose wider and wider apertures, rays of light can pass through the lens at a larger variety of angles, straying ever further from the conditions under which the approximation holds.
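To put a rough number on “small angles”: Gaussian optics effectively replaces sin θ with θ in Snell’s law. Here’s a minimal, purely illustrative Python sketch – not taken from any lens-design tool – of how quickly that substitution degrades as rays steepen:

```python
import math

# Gaussian (paraxial) optics treats sin(theta) as equal to theta.
# Watch the relative error grow as ray angles increase:
for deg in (1, 5, 10, 20, 30):
    theta = math.radians(deg)
    error = (theta - math.sin(theta)) / math.sin(theta)
    print(f"{deg:2d} deg: sin = {math.sin(theta):.5f}, "
          f"paraxial value = {theta:.5f}, error = {error:.3%}")
```

At 1° the error is around 0.005%; by 30° – the sort of angle a very fast lens admits through its edges – it’s nearly 5%, and that growing discrepancy is precisely what surfaces as aberration.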

All this explains the familiar reality of depth of field. Pinhole cameras don’t have depth of field; they have resolution limited by the size of the pinhole, but everything is at least that well focussed. With glass lenses and an aperture far larger than a pinhole, light can go places other than straight down the middle. Even within Gaussian optics, an optical system images only one plane – at a given distance in front of the lens – sharply onto another plane, the second plane being the image sensor in a practical camera. Everything else is blurred.
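The size of that blur is easy to estimate with the standard thin-lens blur-spot formula: a lens of focal length f at f-number N, focused at distance s, renders a point at distance d as a disc of diameter roughly (f/N)·f·|d − s| / (d·(s − f)). A quick sketch – the 50mm lens and the distances are just illustrative numbers, not measurements from any real lens – shows how stopping down shrinks the blur:

```python
def blur_spot_mm(f_mm, n_stop, focus_mm, point_mm):
    """Thin-lens blur-disc diameter for a point off the focused plane."""
    aperture = f_mm / n_stop  # entrance pupil diameter in mm
    return aperture * f_mm * abs(point_mm - focus_mm) / (point_mm * (focus_mm - f_mm))

# A 50mm lens focused at 2m; how blurred is a point at 3m?
for n_stop in (1.4, 2.8, 5.6, 11):
    print(f"f/{n_stop}: {blur_spot_mm(50, n_stop, 2000, 3000):.3f} mm blur disc")
```

At f/1.4 the disc is around 0.3mm on the sensor – hugely visible – while at f/11 it has shrunk to about 0.04mm, close to the circle of confusion conventionally quoted for full frame. That’s exactly what the diagram below illustrates.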

Effect of aperture on blur and DOF. The points in focus (2) project points onto the image plane (5), but points at different distances (1 and 3) project blurred images, or circles of confusion. Decreasing the aperture size (4) reduces the size of the blur spots for points not in the focused plane, so that the blurring is imperceptible, and all points are within the DOF. Image used under Creative Commons from Wikipedia.

But as we know from experience, even things which are properly focussed can be fuzzy to at least some extent, especially at wide apertures. There are a lot of reasons this happens, some of which would require a Ph.D in quantum electrodynamics to fully understand. The late, great Richard Feynman wrote a fantastic book on the subject called QED, in which he explains, about as accessibly as possible, why refraction actually occurs (and it is mildly mind-melting).

Spherical aberration

Let’s stay at a more familiar level, though, and look at just one example to get a flavour of what we’re up against.

A big contributor to fuzziness in real-world lenses is spherical aberration. In short, the surface of most lens elements in most lenses is a section of a sphere, and it shouldn’t really be. Quite simply, a spherical lens focuses rays of light that pass through the edges of the glass to a slightly different distance than those that pass through the middle. What we’re told about an infinitely small point of light in the scene forming an infinitely small point of light (or at least one smaller than the circle of confusion) on the sensor just isn’t true for spherical lens elements. We minimise the visibility of that by excluding rays which pass through the outer zones of the glass (that is, stopping down) or by using other elements to bend things back nearer to where they should be, but it’s not perfect.
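You can watch spherical aberration fall straight out of the geometry. This sketch – with a made-up surface radius and refractive index, purely for illustration – traces rays arriving parallel to the axis through a single spherical air-to-glass surface using exact trigonometry, and reports where each one crosses the axis:

```python
import math

def axis_crossing_mm(h, radius=50.0, n=1.5):
    """Exact trace of a ray parallel to the axis at height h (mm) through a
    single spherical refracting surface (air into glass of index n). Returns
    where the refracted ray crosses the axis, measured from the vertex."""
    phi = math.asin(h / radius)            # incidence angle: the normal passes through the centre
    theta = math.asin(math.sin(phi) / n)   # Snell's law, exact
    sag = radius - math.sqrt(radius**2 - h**2)  # where the ray meets the surface
    return sag + h / math.tan(phi - theta)

paraxial = 1.5 * 50.0 / (1.5 - 1.0)        # Gaussian prediction: n*R/(n-1) = 150mm
for h in (1, 5, 10, 15, 20):
    print(f"ray at {h:2d} mm: crosses axis at {axis_crossing_mm(h):6.2f} mm "
          f"(paraxial says {paraxial:.0f} mm)")
```

The ray hugging the axis lands almost exactly on the 150mm Gaussian focus, but the marginal ray at 20mm crosses around 5.5mm short of it. There is no single focal point for the whole aperture, so the image of a point smears into a disc – and stopping down simply discards the worst-behaved rays.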

This explains the existence of aspheric lens elements. Aspheric elements (strictly speaking, lenses with the profile of a Cartesian oval) are massively more difficult to manufacture, though, so in practice it’s unlikely that we’ll see them used exclusively, even in lenses that movie companies can afford. Some Mexican physicists found a theoretical way to eliminate spherical aberration in 2018, but even if we found a way to economically manufacture the resulting very complex lens surface, there’s still astigmatism, coma and field curvature to deal with, and we haven’t even talked about chromatic aberration yet.

Summary

Are there solutions, beyond finding ways to make ever more complicated and precise pieces of glass?

Maybe the most promising option is lightfield arrays, which offer a way around a number of issues of both camera and lens design, potentially minimising aberrations more effectively than was ever previously possible, among many other benefits. It’s too much to go into here, but research institute Fraunhofer is not the only organisation to have shown lightfield technology at the film and TV trade shows, and it is very promising. Strangely, the world seems uninterested. Given the enthusiasm for mid-century lenses in which aberration is practically a feature, it’s not hard to see why, but don’t be surprised if cameras using an array of dozens of small, simple lenses start to become a thing.
