Why is it necessary for an object to have a bigger size than the wavelength of light in order for us to see it?

At David Zaslavsky's suggestion I'll transfer this from the comments to the answers (I was a bit hesitant because I don't know whether YouTube videos will still be around in, say, six months' time!):

This little YouTube video might help. You can only resolve the objects by looking at the reflected waves. The amount of detail you can get in the reflected waves can't be smaller than the wavelength (roughly speaking).

Edit: The video shows incident waves being reflected off small irregularities in the surface at the bottom of the picture. The first case (wavelength smaller than the irregularities) shows information about the irregularities being "fed back" in the reflected waves:

[snapshot: wavelength smaller than the irregularities]

The last case (wavelength larger than the irregularities) shows much coarser information being fed back, making it impossible to get any information about, for example, the size of these irregularities:

[snapshot: wavelength larger than the irregularities]

Of course, snapshots are a bit hard to read; you'd really have to look at the statistics of the received reflected waves as a function of position to see what was going on, but the video gives a general impression of the problem.


Classically it's hard to resolve detail in an object at scales smaller than half the wavelength of light (the Abbe limit).
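To put a rough number on this, here is a minimal Python sketch of the Abbe formula, d = λ/(2·NA); the wavelength and numerical-aperture values below are illustrative assumptions, not anything from the answer itself:

```python
# A minimal sketch of the Abbe diffraction limit, d = lambda / (2 * NA).
def abbe_limit(wavelength_nm: float, numerical_aperture: float) -> float:
    """Smallest resolvable feature size, in nanometres."""
    return wavelength_nm / (2.0 * numerical_aperture)

# In air the numerical aperture cannot exceed 1, which gives the
# "half the wavelength" figure for green light:
print(abbe_limit(550, 1.0))   # 275.0 nm

# An oil-immersion objective (NA ~ 1.4, an assumed typical value) does better:
print(abbe_limit(550, 1.4))   # ~196 nm
```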

It is possible to make an 'image' of the structure of an object if you can get closer than a wavelength to the object (near-field microscopy), essentially by measuring the electric field of the light directly rather than focussing it.

And if you can make a material with a negative refractive index, then you can image structures much smaller than a wavelength, see: The Perfect Lens and superlens imaging


As was pointed out elsewhere, the premise of the question is not quite accurate.

Yes, a shorter wavelength makes it easier to see smaller objects, but the wavelength does not necessarily have to be smaller than the object.

The short answer is that, most commonly, the limit of resolving small objects is due to the diffraction of light and this limit is determined (among other things) by the wavelength of light and the size of the aperture.

As a result of diffraction, the image of a small object will be blurred and increased in size relative to the image predicted by ray optics.

[image: diffraction blurs and enlarges the image of a small object]

So, diffraction does not prevent us from seeing or detecting an object, but rather prevents us from seeing fine details of an object. It comes down to the ability to see two closely located points of an object as distinct, i.e., to the minimum angle between two such points at which they will still be perceived as distinct.

The minimum angle between the two points is a better characteristic than the minimum distance, because the angle applies across a vast range of scales, whether the separation is between two atoms, two car headlights or two stars.

If the aperture through which the points are observed is circular, this minimum angle of resolution, θ, can be approximated by this formula:

θ ≈ 1.22 λ/D,

where λ is the wavelength of light and D is the diameter of the aperture.
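As a quick worked example (in Python, using an assumed 550 nm wavelength and a 3 mm pupil), the formula gives an angle of roughly 2×10⁻⁴ rad for the eye, and because it is an angle, it converts into a minimum resolvable separation at any viewing distance:

```python
# A worked example of theta = 1.22 * lambda / D for the human eye.
# The pupil diameter and viewing distances are illustrative assumptions.
wavelength = 550e-9       # green light, metres
pupil_diameter = 3e-3     # typical daylight pupil, metres

theta = 1.22 * wavelength / pupil_diameter
print(f"theta ~ {theta:.2e} rad")   # ~2.2e-4 rad

# The minimum resolvable separation is s = theta * L at distance L
# (small-angle approximation), which is why the angle is the more
# useful characteristic than any single distance:
for label, distance_m in [("print at arm's length", 0.4),
                          ("car headlights", 5000.0)]:
    print(f"{label}: {theta * distance_m:.3g} m at {distance_m:g} m")
```

With these assumed values, two car headlights about a metre apart would merge into a single point of light when viewed from around 5 km away.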

[image: two point sources viewed through a circular aperture; their images partially overlap on the screen]

In the picture above, the light originating from the two points on the left passes through a circular aperture (the pupil of an eye or the aperture of a camera) and leaves two images on the screen (the retina in an eye or the film in a camera).

As can be seen from the picture, the images on the screen are spread out a bit and partially overlap. It is easy to see that if the angle θ was smaller, the two images would merge and would be perceived as one.

The spreading of the images is caused by the diffraction (spreading) of light as it passes through the aperture, and the degree of the spreading depends on the wavelength of the light and the diameter of the aperture. This is why these two parameters (not just the wavelength) define the minimum angle between the two points at which they are still perceived as distinct.
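To make this quantitative, here is a short Python sketch (using numpy and scipy, with an assumed 550 nm wavelength and 3 mm pupil) that adds up two Airy patterns separated by exactly the Rayleigh angle; the dip between the two peaks is what keeps the images distinguishable:

```python
import numpy as np
from scipy.special import j1

def airy(theta, wavelength, aperture):
    """Normalised Airy intensity for a circular aperture, as a function of angle."""
    x = np.pi * aperture * np.sin(theta) / wavelength
    x = np.where(np.abs(x) < 1e-9, 1e-9, x)  # guard the 0/0 at the central peak
    return (2 * j1(x) / x) ** 2

wavelength, aperture = 550e-9, 3e-3        # green light, ~3 mm pupil (assumed)
rayleigh = 1.22 * wavelength / aperture    # the minimum angle from the formula above

theta = np.linspace(-3 * rayleigh, 3 * rayleigh, 2001)
# Two point sources separated by exactly the Rayleigh angle: the peak of each
# image falls on the first dark ring of the other.
combined = (airy(theta - rayleigh / 2, wavelength, aperture)
            + airy(theta + rayleigh / 2, wavelength, aperture))

# The dip between the two peaks is what makes them distinguishable; at smaller
# separations the dip fills in and the two images merge into one blob.
dip = combined[len(theta) // 2]
print(f"dip/peak ratio at the Rayleigh separation: {dip / combined.max():.2f}")  # ~0.73
```

At the Rayleigh separation the intensity midway between the peaks drops to roughly 73% of the peak value, which is the conventional threshold for calling the two images "resolved".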

Here is a laundry list of some of the basic concepts behind this formula, with some simplified explanations of how they are related to the problem at hand:

  • The Huygens-Fresnel principle, which says that every point of a wavefront acts as a source of a new wave (a wavelet).

  • Wave interference, which describes how two waves add and subtract when they overlap.

  • Diffraction, which is the spreading of a beam of light passing through an aperture, or the bending of light passing the edge of an object. Diffraction results from the interference of the many wavelets making up the wavefront of the beam.

  • Single-slit diffraction, which specifically describes the pattern formed by light passing through a single aperture; for a circular aperture the analogous result is the Airy diffraction pattern, which is directly related to the case in point (a small numerical sketch of these ideas follows this list).

  • Rayleigh criterion, which defines the minimum separation between two Airy patterns at which they can still be detected as two distinct images. This criterion is what gives the angle in our formula.
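To tie the first few items together, here is a small numerical sketch (with an assumed slit width and wavelength) that implements the Huygens-Fresnel idea directly: it treats a single slit as a row of wavelet sources, sums them coherently on a distant screen, and recovers the single-slit diffraction pattern with its first minimum near sin θ = λ/a:

```python
import numpy as np

wavelength = 550e-9        # metres (assumed)
slit_width = 5e-6          # metres, so lambda / a ~ 0.11 (assumed)
n_wavelets = 500           # point sources spanning the slit (Huygens-Fresnel)

k = 2 * np.pi / wavelength
sources = np.linspace(-slit_width / 2, slit_width / 2, n_wavelets)
sin_theta = np.linspace(-0.4, 0.4, 2001)   # directions towards a distant screen

# Far-field (Fraunhofer) approximation: a wavelet at position y reaches the
# screen with an extra phase k * y * sin(theta); the observed intensity is the
# squared magnitude of the coherent sum of all wavelets (interference).
phases = np.exp(1j * k * np.outer(sin_theta, sources))
intensity = np.abs(phases.sum(axis=1)) ** 2
intensity /= intensity.max()

# The first minimum should appear near sin(theta) = lambda / slit_width:
centre = len(sin_theta) // 2
first_min = centre + np.argmax(np.diff(intensity[centre:]) > 0)
print(f"first minimum at sin(theta) ~ {sin_theta[first_min]:.3f}, "
      f"expected {wavelength / slit_width:.3f}")
```

Nothing here is specific to light: the same wavelet sum reproduces the water-wave behaviour in the video from the first answer, which is why the wavelength sets the scale of resolvable detail in both cases.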