Why is it so expensive to produce cameras for non-visible light?

It comes down to market size. Where is the demand for such cameras, and does the sales volume justify the production set-up costs? You can get an infrared conversion for standard DSLR cameras (e.g. Do It Yourself Digital Infrared Camera Modification Tutorials), and you can convert the camera to a 'full spectrum' type which takes in some ultraviolet (see Full-spectrum photography). For shorter wavelengths you'll need different sensors. These, by their specialist nature and low-volume production, tend to be very expensive.


First of all: standard CCD sensors are sensitive to wavelengths well beyond 700 nm. As far as I know, silicon sensors are even more sensitive to near-IR light than to visible light.

Of course this changes for much longer wavelengths: one condition for light to be detectable is that each photon has enough energy to create an electron-hole pair. This energy threshold is the band gap of the particular semiconductor material (e.g. for Si: ~1.1 eV). Since photon energy is inversely proportional to wavelength (E = h * c / lambda), there is a maximum wavelength that can be detected with a given semiconductor material (e.g. for Si: ~1100 nm).
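The cutoff wavelength follows directly from the formula above: lambda_max = h * c / E_gap, and h * c ≈ 1239.84 eV·nm. A quick sketch (the band-gap values are approximate room-temperature figures):

```python
# Cutoff wavelength from the band gap: lambda_max = h*c / E_gap.
# h*c ~ 1239.84 eV.nm, so lambda_max[nm] ~ 1239.84 / E_gap[eV].

HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV.nm

def cutoff_wavelength_nm(band_gap_ev: float) -> float:
    """Longest wavelength (nm) a semiconductor with this band gap can detect."""
    return HC_EV_NM / band_gap_ev

# Approximate room-temperature band gaps:
for material, gap_ev in [("Si", 1.12), ("Ge", 0.66), ("InGaAs", 0.75)]:
    print(f"{material}: cutoff ~ {cutoff_wavelength_nm(gap_ev):.0f} nm")
```

This reproduces the ~1100 nm figure for silicon and shows why narrower-gap materials like InGaAs reach further into the IR.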

For cameras the lens is also relevant: most types of glass are less transparent to UV light. Lenses optimized for UV transparency are very expensive (although plastic lenses could be a cheap alternative).


Both your existing answers are valid and can be taken in combination: simple Si sensors are good for visible and NIR light, and are mass-produced and therefore cheap. Modifications to the imaging system are required in many cases, as the IR is normally blocked by a filter because it's undesirable in ordinary photography. See for example Canon's EOS 20Da.

Silicon sensors are fairly easily adapted to UV use by means of a phosphor coating (I wanted to try a homebrew version of this on a webcam I'd modded with a B&W CCD but never got the chance). Even X-ray use is possible with a scintillator (which is normally fibre-optic-coupled).

To go beyond ~1 µm further into the IR requires other semiconductors, which are expensive. InGaAs is a popular choice, but is ridiculously expensive as you say; that's not surprising, as it needs dedicated production facilities. InGaAs and other NIR cameras are also regarded as military technology for the purposes of US export regulations (which in effect are also imposed on many NATO countries); compliance adds cost for the camera manufacturer.

Cameras which have any sensitivity at all to thermal radiation, or which are made from narrow-bandgap semiconductors, need significant cooling to suppress thermal noise that could otherwise swamp the image you're trying to measure. That often means a Dewar of liquid nitrogen (material cost + operating cost). Newer technologies (even uncooled ones) are coming on the market, in particular for thermal imaging, but their resolution is much lower than that of Si CCD or CMOS sensors.