
Friday, 07 November 2008


Apart from the equations, that last bit describing diffraction is the neatest I've ever seen. Beats the heck out of my high-school physics teacher.

Crap! Is this going to be on the test???? ;-)


It turns out that diffraction has nothing to do with "spooky action at a distance" (AKA Bell's Theorem or quantum entanglement).

Diffraction has everything to do with constructive and destructive wave interference. The interference arises when light waves interact with a barrier (the aperture blades or opening in a lens). Wave mechanics is indeed weird because our Newtonian common sense does not apply.

However, light is not the only physical phenomenon that exhibits diffraction. Any wave from any source will diffract when the waves pass through a barrier. Water waves, for instance, also experience diffraction (Basic Wave Mechanics: For Coastal and Ocean Engineers by Robert M. Sorensen, Wiley-Interscience, 1993, p. 95). So, while diffraction is common and important in quantum systems, diffraction is also important elsewhere.

Diffraction is described exactly by wave mechanics. It may be frustrating that we have no idea why waves behave the way they do, but how waves behave is perfectly predictable to the point of being boring. By boring I mean that when you use wave mechanics to predict how waves will behave in a given scenario (experiment) and then do the experiment, they behave exactly as predicted... every time, all the time.

Hi Ctein -

Now I use a 38mm Biogon, and have always understood that the best performance of this lens is at f/8 or f/11; indeed, that any lens should perform best around there.

Maybe I haven't read your post carefully enough, but it seems to me that you are saying that best performance is at the widest aperture? Am I right in thinking that? Do I need to start using f/4.5 to get the best out of the lens?


Ctein wrote -

"It's the mere existence of the edge that causes diffraction. It's a quantum mechanical "spooky action at a distance" thing."

It's not necessary to invoke quantum mechanics to accurately explain or describe diffraction. Classical wave theory works perfectly well (as I know that you know). In essence, the part of the wavefront that is blocked by the surroundings of the aperture acts like another wave, out of phase with the primary light (so that the two cancel outside the aperture). The out-of-phase part interferes with the primary light, producing the diffraction.

Of course, quantum mechanics leads to the same result, since it is describing wave interference as well, but the quantum aspect isn't needed to understand diffraction.

Whichever way you look at it, the results are the same for the photographer.

But don't waves in the water exhibit diffraction when they encounter an obstacle? Is it diffraction when they "bend" around the end of a jetty? That seems to have very little to do with quantum mechanics.

hypnotic discs, aren't they?


curse you Sir Airy!

Well, I'm glad we finally cleared that one up!

Dear Kevin,

Well, no they don't. That's not actually equivalent to optical diffraction. Also, light isn't waves.

You're correct, though, to note that that has nothing to do with quantum mechanics.

I think you may have misunderstood the logic of my piece. I'm not saying optical diffraction proves quantum mechanics (it doesn't); I am saying it IS quantum mechanics.

pax / Ctein

Once again, I find myself within the circle of confusion...

Cheers! Jay

Is diffraction limitation caused by stopping down the lens? Would a 50mm f/2 lens stopped down to f/8 behave the same way as a wide open 50mm f/8 lens of similar quality (assuming there's such a thing)?

I remember seeing a lens test of either one of the Rodenstock Digital or Schneider Digitars, where it was demonstrated that the lens was in fact diffraction limited, in that it produced the sharpest image wide open. Of course the problem with LF lenses is that you generally need to stop down for depth of field, so you have to knowingly take a hit to image sharpness to get the shot at all.

Then there are astronomical telescopes, in which any model worth its salt will be diffraction limited. But there you are dealing with a much simpler optical design, with a fixed aperture and optimized to focus at infinity.

Dear Robert,

The least diffraction occurs with the lens wide open. That is not necessarily the best-performing aperture; with most lenses it's not. Wide open, most lenses have lots of residual aberrations and other optical defects that degrade image quality. As you stop the lens down, that degradation decreases. At the same time, diffraction increases.

Somewhere, you hit the aperture where the combined effects of diffraction and other image-degraders is at a minimum. That's the optimum aperture.

What aperture that is depends upon the lens. It is most definitely not always f/8-f/11, although it often is. I have a 50mm lens whose optimum aperture is f/4.7-f/5.6.
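This tradeoff is easy to sketch numerically. Here's a toy model in Python (the aberration term and its numbers are entirely made up for illustration, not measured from any real lens): diffraction blur grows linearly with the f-number, the hypothetical aberration blur shrinks as you stop down, and the optimum aperture is wherever the combined blur bottoms out.

```python
import math

WAVELENGTH_MM = 0.00055  # green light, ~550 nm, in millimeters

def diffraction_blur(n):
    # Airy disk diameter in mm: 2.44 * wavelength * f-number
    return 2.44 * WAVELENGTH_MM * n

def aberration_blur(n, wide_open_blur=0.05):
    # Hypothetical lens: aberration blur shrinks with the square
    # of the f-number as you stop down (invented for illustration)
    return wide_open_blur / n**2

def combined_blur(n):
    # Combine the two blur contributions in quadrature
    return math.hypot(diffraction_blur(n), aberration_blur(n))

stops = [2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0, 22.0]
best = min(stops, key=combined_blur)
for n in stops:
    print(f"f/{n:<4} combined blur {combined_blur(n)*1000:6.1f} microns")
print(f"optimum aperture for this toy lens: f/{best:g}")
```

For this invented lens the minimum lands at f/4; change the wide-open aberration figure and the optimum slides up or down the aperture scale, which is exactly why different lenses have different sweet spots.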

pax / Ctein

Dear Tom,

My column is not an argument attempting to prove quantum mechanics. QM is a given; I'm describing how diffraction is, in fact, an interesting QM phenomenon.

Yes, people *did* explain diffraction with classical wave theory. That explanation was wrong.

If there are readers of this column who don't believe QM is a valid description of the universe, please be aware that I am not going to argue with you. This column lacks the scope to entertain such an argument, even if it interested me (it doesn't).

pax / Ctein

Something just occurred to me. Since there's an "edge" involved in any optical lens, even if you shoot wide open (i.e. the very opening itself), isn't there diffraction happening all the time?

Thanks for the excellent basic intro to diffraction in classical optics!

Because I think a lot of us were nonplussed by some of the arguments in such articles as http://luminous-landscape.com/essays/Equivalent-Lenses.shtml , which brings the term into the digital age and applies it to sensor sizes, I hope in a future article you'll bring a bit of clarity to how pixel size and separation are also affected by diffraction.


Dear Lambert,

Diffraction is a purely physical effect that doesn't depend upon the lens design. That doesn't mean your two hypothetical lenses will perform the same in practice, because who knows how well they are corrected for all the other optical flaws they may be heir to. But when it comes to diffraction, and only diffraction, you can assume they will behave the same.

~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 

Dear Jerry,

I've also seen enlarging lenses that were diffraction-limited wide open. In fact, some even gave true diffraction-limited performance wide open! Usually, though, other optical flaws made them better if they were stopped down a bit. For instance flare and micro-contrast would usually improve and so would light falloff. But if you're looking at mere resolution, the best medium and large format enlarging lenses are very close to their best wide open.

I am limited in my ability to test lenses at large apertures. I can't check resolution above 400 lp per millimeter, which means a lens that is delivering diffraction-limited performance at f/2.8 is going to slip right past me. I suspect there are some out there.

At some point, though, this becomes meaningless in practice. For example, let's imagine a lens that is delivering true diffraction-limited performance at f/4. That's 400 line pairs per millimeter. Which means your total depth of focus is only +/-10 microns, and if you don't want to sacrifice a lot of that diffraction-limited sharpness, you'd better limit that to +/-5.

Those are insanely tight, usually unachievable tolerances for photographic work (and if you're talking about a camera lens, your depth of field is nonexistent).
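Those figures follow from two standard rules of thumb (assuming ~550 nm green light; these are approximations, not exact optics): diffraction-limited resolution is roughly 1600/N line pairs per millimeter, and the depth of focus matching a circle of confusion c is about +/-N*c. A quick sketch:

```python
def diffraction_limited_resolution(n):
    # Rule-of-thumb diffraction-limited resolution in lp/mm
    # for ~550 nm green light
    return 1600.0 / n

def depth_of_focus_microns(n, resolution_lpmm):
    coc_mm = 1.0 / resolution_lpmm  # circle of confusion matching that resolution
    return n * coc_mm * 1000.0      # half-width of depth of focus, in microns

n = 4.0
res = diffraction_limited_resolution(n)
half = depth_of_focus_microns(n, res)
print(f"f/{n:g}: {res:.0f} lp/mm, depth of focus +/-{half:.0f} microns")
```

That reproduces the 400 lp/mm and +/-10 micron figures for f/4.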

~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 

Dear Howard,

That link no longer works, but I think I know the article you're talking about. Basically, I am entirely avoiding the various online discussions of acceptable diffraction limits as they apply to Bayer array sensors. I am not convinced any of these people actually know what they're talking about and I don't have the resources to fully research it myself to come up with an answer that I KNOW is correct.

I am also inclined to think all such discussions are missing the forest for the trees.

So I am running and ducking. Sorry!

~ pax \ Ctein
[ Please excuse any word-salad. MacSpeech in training! ]
-- Ctein's Online Gallery http://ctein.com 
-- Digital Restorations http://photo-repair.com 

Dear Fred,

Don't shoot the messenger!


Dear John,

Not entirely-- since my columns are of restricted length, the information's always a bit fuzzy [nonlocalized grin].


Dear David,


pax / Ctein

Thanks for the honest response. I feel less alone! ;-)


Thanks for a really insightful piece. A nice example of physics where the size of the aperture is the deciding factor, not the implementation details such as what shape and how many aperture blades.

Arguing about classical vs. QM physics is quite unproductive. You may get the same results, but in the end there is no avoiding QM.

I'm going to enjoy this series. I keep meaning to write to say that TOP is on a run of excellent articles and then the next week there's another run and so on.

Q for the venerable Ctein: when writers suggest that the effects of diffraction are now visible at wider apertures for newer, higher-resolution cameras of the same sensor size as the models they replace, should they be implying, as they often do, that absolute recorded resolution is lower in the newer model compared with the old?

For example, you might see someone claim that an advantage of an old, six-megapixel D50 over a new, 12-megapixel D90 (roughly similar sensor size) is that on the D50, diffraction doesn't become an issue for a certain type of picture until the lens is stopped down to f/13 or f/16. With the D90 taking the same picture with the same lens, the effects might be claimed to be visible at f/11 for many lenses.

Regardless of whatever the limits are determined to be, is the higher resolution camera (the D90, in this example) able to provide more resolution at any aperture, even after the effects of diffraction have become visible? (Given accurate focus and the same decent lens.) And would this continue to be the case until the detail provided by the lens fell below the level recordable by a six-megapixel camera?

IMHO, ALL compact digital cameras with small sensors (~1/1.8") are influenced by diffraction. Such a sensor is approx. 5x smaller than a full-frame digital sensor. You need 5x higher LINEAR resolution for the same print resolution. Therefore, if diffraction is serious for a DSLR lens at f/8, the same effect in the case of a compact camera is visible at approx. f/2. (And it has nothing to do with megapixels.)

And yes, the "spooky action at a distance" has nothing to do with diffraction, but it seems it describes the basic principle of our world. Wave theory in optics is an approximation, which is useful, but not correct. (Try making a long-exposure night photo. How can one photon interfere with the whole aperture and then, a bit later, excite exactly one electron in one pixel only?)


"Yes, people *did* explain diffraction with classical wave theory. That explanation was wrong."

This is a philosophically inflammatory pronouncement, indeed! Diffraction is very accurately described by wave mechanics derived from Maxwell's equations. These equations are taught to all physical science and engineering students everywhere. They are "wrong" in the same sense that Newtonian mechanics is "wrong" - that is, a newer theory has been developed that proposes a more fundamental description of the observed phenomenon.

Meanwhile, those of us who actually design optical and electronic systems for a living (as well as those who send spacecraft across the solar system) are perfectly comfortable using the "wrong" theories of Maxwell and Newton. Because in practice, they describe our physical observations to a very high degree of accuracy. Certainly to a higher degree of accuracy than any photographer will ever need!

I liked your two-line derivation of how optimal lens resolution depends only on a tradeoff between f/stop and aberrations, but Eolake's question sounded to me like "What is it about diffraction in very small imaging systems that makes the manufacturers not even bother to offer smaller apertures than f/5.6 to 8?" And that probably has to do with the fact that the acceptable circle of confusion in an image that is going to be blown up from a few mm wide to spread across an Apple Cinema screen is much smaller than we were used to thinking about in film days.

Light (photons) isn't waves, huh? Well, they are particles, too, and that results in shot noise at low light levels. But does the rest of the neat stuff that photons and QM offer, like quantum-encrypted communications and Bose-Einstein condensation, offer creative opportunities to a photographer?



These quantum vs. wave interpretation differences are easily resolved. Diffraction is caused by quantum effects unless someone hailing from Copenhagen is looking through the finder when the picture is taken, thereby collapsing the wave function.

What I'd like more clarity on is apertures and SF vs. MF vs. LF. One of the most common reasons for using a larger format is to do landscapes a la Adams, meaning large, highly detailed prints with seemingly endless DOF. But since DOF varies inversely with focal length, and since larger formats mean longer focal lengths, one pretty much has to stop down more for each format increase. Where I might use f/8 with my APS-C, I would have used f/11 with my 35mm, f/22 with MF, and f/32 with LF. I know there is the Scheimpflug effect to complicate matters, so let's restrict any discussion to cameras without movements.

Thinking purely in terms of getting the most detail per square inch/centimeter of a typical landscape print, where would the sweet spot lie - with SF, MF, or LF? Seems to me with each increase in format, we're throwing away more of the advantage that comes from the increase in pixels (or film area) down the sinkhole of diffraction loss. It might help to think in terms of a thought experiment in which a hypothetical matched set of cameras across the various formats were used, such that the sensor in each had been cut from the same silicon wafer, and similarly with the AA filter, etc. Plus, some hypothetical matched set of super-high-quality lenses was used on all cameras.

I'm particularly hoping for Ctein's answer to this or at least one as crystal clear as his diffraction essay above.

Dear Folks,

A lot of you seem confused about the import of what I wrote. I did two things:

First, I answered Eolake's question about diffraction using the CLASSICAL equations for describing such.

Then I explained how diffraction is really a QUANTUM-MECHANICAL phenomenon (and, yes, it really does involve entanglement and 'spooky-action-at-a-distance' and there's no way I can explain the details in this column, so please don't ask me to) (and, in a 500-word column written in English for a lay audience, fergodsakes please don't expect me to be too scientifically precise).

Physicists use classical calculations every day. We hardly calculate *anything* using quantum mechanics; the math is intractable. But that doesn't mean the phenomena aren't quantum-mechanical in origin.

Don't confuse a useful engineering model with what's really going on.

The idea behind my column was to give you the useful engineering tool and then let you in on the cool, weird stuff that's actually responsible for the phenomenon. They are two associated but independent nuggets of knowledge. Do not treat them as two components of an argument; they're not.

It's apples and oranges, people. Enjoy the apples. Enjoy the oranges. Don't try shoving them into a blender...or assuming I did.

pax / Ctein

Dear Bahi,

In a word, yes.

But I'll not go into details, because that's much of my next column.

I will say this: the argument that a lower-res camera has an advantage because diffraction doesn't show as readily has about as much merit as arguing that you should load up your film camera with really grainy, low-resolving film, because then you can't see the lens' optical aberrations and diffraction effects in the photograph.

If that makes sense, then so do the arguments you asked about.

pax / Ctein
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com

Dear Martin,

No, on all counts, I'm afraid.

You have the equations in this column, so you can calculate the effects of diffraction yourself. Here's a sample: at f/2, your individual sensor pixels have to be smaller than 2.5 microns for diffraction to have any effect. It won't dominate until they get down to about 1.5 microns.

Nobody's camera has pixels anywhere that small.

There may be many reasons why you'd find a small-sensor camera unsuitable for your needs. Diffraction shouldn't be on that list.
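The thresholds quoted here fall out of the Airy disk formulas. A quick sketch, assuming green light at 550 nm (the exact cutoff depends on which criterion you adopt, so treat these as ballpark numbers):

```python
def airy_diameter_um(n, wavelength_um=0.55):
    # Diameter of the Airy disk out to the first dark ring:
    # 2.44 * wavelength * f-number
    return 2.44 * wavelength_um * n

def rayleigh_radius_um(n, wavelength_um=0.55):
    # Rayleigh criterion radius: 1.22 * wavelength * f-number
    return 1.22 * wavelength_um * n

for n in (2.0, 2.8, 4.0):
    print(f"f/{n:g}: Airy diameter {airy_diameter_um(n):.1f} um, "
          f"Rayleigh radius {rayleigh_radius_um(n):.1f} um")
```

At f/2 the Airy diameter comes out near 2.7 µm and the Rayleigh radius near 1.3 µm, in line with the ~2.5 and ~1.5 micron thresholds quoted.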

pax / Ctein
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com

Dear Scott,

Eolake's question was distilled from a longer conversation; my "quote" isn't his exact words, but it catches the intent.

Smaller apertures don't serve much purpose on small-format cameras. Few lenses perform better when you stop them down beyond f/8. And, very crudely, the aperture you need to achieve a certain depth of field is proportional to the format. So, with a 1/5th scale sensor, a minimum aperture of f/5.6-f/8 gets you as much depth of field as f/30-f/40 would in 35-mm format.

The only reason for wanting an aperture smaller than that is to be able to intentionally get slower shutter speeds, which is of very little importance in the total scheme of things photographic.
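That scaling rule is simple enough to compute directly. A sketch (the crop factor of 5 is the rough 1/5th-scale figure from above, not an exact spec):

```python
def equivalent_aperture(f_number, crop_factor):
    # Very crudely, the f-number needed for a given depth of field
    # scales linearly with the format (sensor) size
    return f_number * crop_factor

crop = 5.0  # a roughly 1/5th-scale compact sensor vs. 35mm
for n in (5.6, 8.0):
    print(f"f/{n:g} on the compact gives the depth of field of "
          f"f/{equivalent_aperture(n, crop):g} on 35mm")
```

Which lands on the f/28-f/40 range, close to the f/30-f/40 ballpark quoted.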

pax / Ctein
-- Ctein's Online Gallery http://ctein.com
-- Digital Restorations http://photo-repair.com

Dear Dale,

Your first paragraph was Bohring. [chuckle]

Is there a "typical landscape?" Most of mine (and Adams') don't require extreme amounts of depth of field, because the subject distances in the scene are all near infinity.

The rest of your question is the topic of my next column -- when and how does diffraction actually matter.

pax / Ctein

I would like to recommend this article:



Thank you, Ctein. That answer doesn't get as much airtime as it should. I wish I'd thought of the analogy with grainy film. Looking forward to the details.

When I posted the Johnson/Myhrvold link above, it was only because at the time I couldn't find Efraín's and your LuLa article on resolution, which seems to me to contain more useful information than that in the previous one.

Thanks very much for posting the link here!!


Dear Ctein,

The modern compacts with 10-15 Mpixels have pixel sizes well below 2.5 µm, and the lenses' widest apertures are not f/2.0 but f/2.8 at best, and above f/4.5 at the long end.

My old cheap Canon A80 (4 Mpix only) has a very nice lens, which is diffraction limited when stopped down to f/5, of course in the center only (my experience, not web test results).

And, you are right, I don't care about diffraction when using a compact.


"Here's a sample: at f/2, your individual sensor pixels have to be smaller than 2.5 microns for diffraction to have any effect. It won't dominate until they get down to about 1.5 microns.

Nobody's camera has pixels anywhere that small."

The G10 has 1.7 micron pixels (14.7 MP on a 1/1.7" sensor), and a smaller maximum aperture, f/2.8. Is it possible that those extra 2.5 million pixels in the G10 from the G9 aren't actually contributing to any extra resolution? Could all of the improvements in image quality just be due to better image processing?
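The 1.7-micron figure can be sanity-checked from the sensor geometry. A rough sketch (the 7.6 mm width for a 1/1.7" sensor and the 4:3 aspect ratio are approximations):

```python
import math

sensor_width_mm = 7.6   # approximate width of a 1/1.7" sensor
megapixels = 14.7       # Canon G10 pixel count
aspect = 4.0 / 3.0      # typical compact-camera aspect ratio

# Pixels along the width, from total pixel count and aspect ratio:
# width_px * height_px = total, and width_px / height_px = aspect
pixels_wide = math.sqrt(megapixels * 1e6 * aspect)
pitch_um = sensor_width_mm / pixels_wide * 1000.0
print(f"pixel pitch ~ {pitch_um:.2f} microns")
```

Which comes out around 1.7 microns, as stated.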

Dear Martin and Sam,

I stand corrected! I hadn't been aware that the pixel sizes had gotten that small. Thank you for the education.

Diffraction still won't be the issue, because you're very unlikely to find a lens which is providing diffraction-limited performance at f/2 or 2.8.

As for what more pixels may contribute to image quality or resolution... wait for the next column [impatient smile].

pax / Ctein

You're welcome Howard!

Oh dear, I was going to be the first to jump in and say that diffraction is a wave effect, not a quantum effect... but the server was down so I couldn't.

Easy steps:

1. Diffraction in lenses, with light waves, works exactly like diffraction of water waves. It was understood well before 1900. (Maybe before 1800, anyone know?) So we call it a wave effect.

2. Our most current theory of light is indeed quantum, QED is part of the standard model which is certainly quantum. But insisting that therefore light is always a quantum effect is a bit strong. And here we veer into philosophy... all physical theory is effective theory, this is the central lesson of Wilsonian renormalisation etc. Certainly nothing we have yet is final. "what's really going on" probably isn't a sensible concept. So picking QED as the truth on which to base the statement that light in lenses is quantum is a bit odd, given that you can explain it just fine with 19th century waves. Whew.

3. Diffraction certainly has nothing to do with entanglement (spooky action at a distance), EPR, Bell, or any of that. That's another aspect of QM (which is not just waves, it is inherently quantum). Even using quantum mechanics, you don't need to invoke observers to derive these diffraction formulae.

(With QM you can also make sense of electron diffraction, which is a quantum wave effect, but not an entanglement effect. And there's also the path-integral way, which looks very un-wave-like, but gives the same answers, and still has nothing to do with entanglement. That's what Feynman's book is about, and is highly recommended.)

Apart from that, nice article Ctein!

And in case anyone's wondering: Yes I do this for a living, No I don't normally get involved in QM arguments on the internet, and No I am not entirely sober.

Dear Improbable,

Lovely post!

I take to heart your comment about renormalization. Arguing too vigorously about the philosophical underpinnings of QM is a dubious business when what we have to work with are the 'engineering rules.'

So, I'll just leave this by saying that I think entanglement (which doesn't have to involve EPR experiments or Bell) provides a more coherent and unified description of all of the phenomena described.

But no matter how you parse it, the math comes out the same. In the absence of a solid underpinning, you correctly called out that we're debating about virtual angels and potential pins.

BTW, if this is how you write when you're not entirely sober, I don't want to find myself on the opposing side of an argument about QM when you are!

pax / Ctein
