Why can I see remote control radiation on my phone but not with my eyes?

Building on the good answer by nanofarad, "But why white?" is a good follow up.

We don't know what the precise sensor in your phone is, or how good the IR filter in front of the lens is, but we can study the sensitivity plots for an example CMOS sensor.

(Figure: quantum-efficiency curves for the red, green and blue pixels of an example colour CMOS sensor; image credit below.)

Notice the three distinct humps of increased sensitivity, each occurring at a different wavelength within the visible range. Notice also the rise in sensitivity towards the infrared, which is roughly the same for all three sensors.

By tuning the efficiency to different colours you get colour-selective light sensitivity. (More on how this colour selectivity is achieved below.)

In the curves you'll see the blue, green and red sensor sensitivities plotted, with peaks at roughly 450, 550 and 600 nm respectively.

White is, for instance, an equal mix of red, green and blue.

The region of longer wavelengths on the right is supposed to be blocked by an IR filter. The remote's IR light sits at the far right, around 940 nm. According to the graph, all three sensors of this type have roughly the same efficiency at these infrared wavelengths.

This means that IR light will translate to the same amount of free electrons (current, voltage) for each sensor (R, G and B) and will be stored as equal RGB pixel values (8-bit, 16-bit or 24-bit numbers) in a typical photo file.

When shown on screen, these equal numbers will appear "white" to the eye.

Thus, the 940 nm wavelength typically used by remote controls falls into the range where all three sensors have roughly the same response, giving a "white" result.
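To make the "equal values look white" step concrete, here is a minimal Python sketch with made-up numbers (the function name and signal levels are purely illustrative, not taken from any real sensor): because the IR source drives all three channels equally, the stored pixel ends up with R = G = B, which a display renders as neutral grey/white.

```python
# Minimal sketch with hypothetical numbers: an IR source that excites the
# R, G and B sensor cells equally ends up as a neutral (white/grey) pixel.

def pixel_from_sensor(r_signal, g_signal, b_signal, full_scale=1.0):
    """Map raw per-channel sensor signals to 8-bit RGB pixel values."""
    def to_8bit(signal):
        return min(255, round(255 * signal / full_scale))
    return (to_8bit(r_signal), to_8bit(g_signal), to_8bit(b_signal))

# At ~940 nm the example curves show roughly equal efficiency for all three
# channels, so the same IR irradiance produces the same signal on each one:
ir_signal = 0.8  # arbitrary illustrative level, below saturation
print(pixel_from_sensor(ir_signal, ir_signal, ir_signal))
# -> (204, 204, 204): equal channels, shown on screen as neutral "white"
```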

It should be noted that not all devices will have colour curves that rise again towards longer wavelengths, as shown here. Some will remain low, but usually all three still converge to about the same "efficiency". Such devices simply need a higher IR light intensity to produce a "white" depiction in an image; it does not mean the sensor has to be saturated with IR to appear white.

Now a bit more on colour selective sensitivity:

Technically, quantum efficiency refers to how well a CMOS semiconductor can transfer light energy into free charge. A sensor's colour selectivity is accomplished by placing a mosaic of tiny pixel-sized colour filters in front of the sensor array. This pixel filter mosaic is called the "Bayer filter" after its inventor, Bryce Bayer, who developed it at Kodak back in the 1970s.

The filter elements are made of dyed polymers, which absorb (or, depending on the type, reflect) light of specific wavelengths and thus filter it.

So, the tuned efficiency is, more precisely, the quantum efficiency of the CMOS sensor cell's semiconductor times the transmittance of the optical filter in front of it. Taking the two together, we can speak of the colour-dependent quantum efficiency of a colour CMOS sensor.
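As a rough illustration of that "QE times transmittance" product, here is a small Python sketch; the numbers and dictionary names are invented for illustration and are not taken from any datasheet:

```python
# Illustrative (made-up) numbers only: the effective channel response is the
# silicon quantum efficiency multiplied by the Bayer-filter transmittance.

silicon_qe = {550: 0.60, 940: 0.15}      # hypothetical silicon QE per wavelength (nm)

bayer_transmittance = {                   # hypothetical filter transmittance per channel
    "R": {550: 0.05, 940: 0.90},
    "G": {550: 0.85, 940: 0.90},
    "B": {550: 0.10, 940: 0.90},          # the dyes become largely transparent in the near IR
}

def effective_qe(channel, wavelength_nm):
    return silicon_qe[wavelength_nm] * bayer_transmittance[channel][wavelength_nm]

for channel in "RGB":
    print(channel, effective_qe(channel, 550), effective_qe(channel, 940))
# At 550 nm the channels differ strongly (green dominates); at 940 nm all
# three come out at ~0.135, which is why IR registers as equal R, G and B.
```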

And by combining the different base colours (RGB) in the camera's photo sensor array, we can represent practically all colours across the full visible range of our eyes.

By the way, a CMOS sensor is not to be confused with a CCD sensor, an older technology that has now largely been phased out in consumer cameras.

There is of course much more to read about this, but I think a bit of an introduction is warranted here, as invited by the comments.

Image credit: https://www.thorlabs.com/NewGroupPage9_PF.cfm?Guide=10&Category_ID=220&ObjectGroup_ID=4024


The remote control emits infrared light, typically around 940 nm, while the human eye is sensitive to roughly 400-700 nm. However, the CMOS sensor elements in a phone camera have a response up to 1000 nm (source), and for cost/space reasons, the phone may omit an IR filter.
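To spell those numbers out, here is a trivial Python sketch; the 1000 nm sensor limit comes from the sentence above, while the sensor's lower bound used here is just an assumption:

```python
# Rough sanity check of the quoted ranges; the sensor's lower bound is an
# assumption, the other figures come from the text above.
REMOTE_IR_NM = 940
EYE_RANGE_NM = (400, 700)       # approximate range of human vision
SENSOR_RANGE_NM = (350, 1000)   # assumed response range of a phone CMOS sensor

def in_band(wavelength_nm, band):
    low, high = band
    return low <= wavelength_nm <= high

print("Eye sees the remote:   ", in_band(REMOTE_IR_NM, EYE_RANGE_NM))     # False
print("Sensor sees the remote:", in_band(REMOTE_IR_NM, SENSOR_RANGE_NM))  # True
```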

I recommend reading (and accepting) P2000's answer as it includes far more details about the actual sensitivities of CMOS sensor elements in the IR range.