Can you be blinded by a 'dim' light?

Yes indeed, infrared light (the wavelengths beyond those of red light) can be very harmful to your eyes even though you can't see it. The same applies to ultraviolet light (the wavelengths beyond those of violet light).

You can read more under the topic of laser eye safety. People who work with lasers must use safety glasses if those lasers fall within certain hazard classes, which include infrared and ultraviolet lasers.


To add to the other answers (particularly Flippiefanus's): both invisible IR and UV light can permanently damage the eyes without any sensation of damage or injury being felt.

Invisible IR causes damage through thermal loading: the retina absorbs heat from incident light faster than its vascular network can draw the heat away. High power UV can do the same, but at shorter visible and UV wavelengths a second mechanism, photochemical toxicity, is a much more dangerous factor because it is damaging at far lower levels than are needed for thermal damage to happen. Here, photons of energy comparable to organic molecule bond energies beget chemical changes in the retina, or even changes to nuclear (in the biological sense) DNA with the attendant neoplasia risk. Basically, the retina can safely absorb of the order of five to ten milliwatts of light focussed to a spot of less than about $50\,{\rm \mu m}$ diameter, and laser and light safety standards aim to limit light entry to the eye to less than $1\,{\rm mW}$ at IR wavelengths, where only thermal loading is a problem. At visible and UV wavelengths, laser safety standards, in particular IEC 60825, aim to limit power input to the eye to only a few microwatts owing to the danger of photochemical damage.
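As a rough illustration of those orders of magnitude, here is a sketch in Python. The limit values are the indicative figures quoted above, not the actual maximum permissible exposures tabulated in IEC 60825, and the 700 nm crossover is a simplification I have chosen for illustration:

```python
# Illustrative only: indicative continuous-wave power limits from the
# discussion above, NOT the real MPE tables of IEC 60825.
THERMAL_LIMIT_W = 1e-3        # ~1 mW at IR wavelengths (thermal loading only)
PHOTOCHEMICAL_LIMIT_W = 5e-6  # a few microwatts at visible/UV wavelengths

def indicative_eye_limit(wavelength_nm: float) -> float:
    """Rough eye-entry power limit in watts for a CW beam.

    Below ~700 nm (visible and UV), photochemical toxicity dominates,
    so the limit is far lower than the purely thermal IR limit.
    """
    if wavelength_nm < 700:
        return PHOTOCHEMICAL_LIMIT_W
    return THERMAL_LIMIT_W

def is_roughly_safe(power_w: float, wavelength_nm: float) -> bool:
    return power_w < indicative_eye_limit(wavelength_nm)

# The same 0.5 mW beam: tolerable as an IR thermal load,
# dangerous as UV where photochemistry is the limiting mechanism.
print(is_roughly_safe(5e-4, 1064))  # 1064 nm IR -> True
print(is_roughly_safe(5e-4, 355))   # 355 nm UV  -> False
```

The point of the sketch is only that the same beam power is judged by two very different yardsticks depending on which damage mechanism applies at its wavelength.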

Long term, chronic exposure to UV is even more of a problem. Cataracts arising from photochemical damage to proteins are the foremost cause of human blindness on the planet. Very long term, chronic exposure to low levels of UVB, such as one encounters on normal, sunlit days, especially at lower, tropical latitudes or in snowy environments, is an overlooked hazard. One should generally encourage children to wear sunglasses conforming to a sound eye protection standard, as the eye's lens is particularly transparent to UV under the age of 20.

Lastly, there is even some evidence that high peak power IR pulses can give rise to severe photochemical damage, and that the laser safety standards are inadequate in the way they deal with it. See, for example:

Glickman R. D., "Phototoxicity to the retina: mechanisms of damage", *Int. J. Toxicol.* **21**(6), 2002, pp. 473-490

Not being a biologist or ophthalmologist, I am not fully qualified to assess this paper, but it sounds thoroughly reasonable from a physicist's standpoint. On interaction with the complicated organic molecules in the eye, high peak power pulses yield much shorter wavelength light through nonlinear processes. Significant production of even UV can result, hence the risk of photochemical damage. The problem here is that the safety standards (including IEC 60825) blithely assume that thermal loading on the retina is the only problem. Therefore, they are too forgiving of pulsed lasers with small duty cycles: the standards will accept a laser as intrinsically safe if its average power is small. As I have discussed, very low levels of UV can be a problem, and this is even more so when the light enters the eye as IR, since IR is deeply penetrating. Conversion to shorter wavelengths can then happen beneath the retina's layer of shielding melanin, so the retina is particularly vulnerable to this kind of exposure.

Therefore, until the standards are updated to account for this factor, I have been assuming that IR light is a mixture of the IR itself together with one tenth of its power at each of one half and one third of its nominal wavelength, and applying the safety standards both to the IR and to the assumed second and third harmonic components whenever I am called on to make an assessment. This is obviously highly conservative, especially if the IR light isn't pulsed, but until someone convinces me that the standards take these effects into account, that's what I am going to do.
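That conservative rule is easy to state in code. This is a sketch of the bookkeeping only; the actual safety assessment at each component wavelength must still follow the standard:

```python
def conservative_spectral_model(power_w: float, wavelength_nm: float):
    """Model a nominally-IR beam as IR plus assumed harmonics.

    Per the conservative rule described above: treat the beam as its
    nominal IR component plus one tenth of its power at each of the
    second and third harmonics (half and one third of the nominal
    wavelength). Returns a list of (wavelength_nm, power_w) pairs.
    """
    return [
        (wavelength_nm, power_w),           # nominal IR component
        (wavelength_nm / 2, power_w / 10),  # assumed second harmonic
        (wavelength_nm / 3, power_w / 10),  # assumed third harmonic
    ]

# A 1 mW beam at 1064 nm is treated as 1 mW at 1064 nm plus
# 0.1 mW at 532 nm (green) and 0.1 mW at ~355 nm (UV).
for wl, p in conservative_spectral_model(1e-3, 1064):
    print(f"{wl:7.1f} nm : {p * 1e3:.2f} mW")
```

Each returned component would then be checked against the standard's limit for its own wavelength, so the small assumed UV fraction ends up dominating the assessment.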


In response to the edited question, which asks about being "dazzled" by the light: it would not work the way you would like.

Depictions of the spectrum "darken" as we leave the visible region because they show a roughly constant intensity of light from one end of the spectrum to the other. When you start talking about making the light brighter, we have to be a bit more formal:

[Figure: spectral sensitivity curves of the three cone types]

Graphs like the one above show the frequency response of our cones. There are a few variants in the y-axis normalization (some use a unit system in which the blue cone appears more sensitive), but that doesn't matter for this answer.

We can see that as we move past 700 nm into the IR, the red-cone response goes down, but it doesn't drop to zero right away. So if you had a very bright light just beyond this region, it would still stimulate the red cones in your eye.
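To put a rough number on that tail, here is a crude Gaussian stand-in for the long-wavelength (L, "red") cone response, peaked near 565 nm. The peak and width are my own illustrative choices; the real cone fundamentals are tabulated (e.g. the CIE / Stockman-Sharpe data) and are not exactly Gaussian:

```python
import math

def approx_L_cone_response(wavelength_nm: float) -> float:
    """Crude Gaussian approximation to the long-wavelength (L) cone
    sensitivity, peaked near 565 nm. Illustrative only; the real
    cone fundamentals are tabulated, not Gaussian."""
    peak_nm, width_nm = 565.0, 55.0
    return math.exp(-(((wavelength_nm - peak_nm) / width_nm) ** 2))

# The response falls steeply past 700 nm but is not zero just beyond
# it, so a sufficiently intense near-IR source still looks (very) red.
for wl in (565, 700, 750, 800):
    print(f"{wl} nm: relative response {approx_L_cone_response(wl):.2e}")
```

The qualitative point survives any reasonable choice of curve: the response at, say, 750 nm is orders of magnitude below the peak but still finite, so "invisible" near-IR just means you need orders of magnitude more power to see it.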

But here's the trick: all pain and dazzling responses rely on the signals we receive from the rods and cones. There are no pain neurons in the retina, so those are the only signals our brain gets (and a stimulus obviously has to become a signal to the brain before it can dazzle us).

So what we would see with your very bright IR light is an ultra-pure red -- bothersomely pure because it's so far from the peak of the green cones. But you would not see it as "dim": by definition, you are stimulating the cones to cause the dazzling effect, so it would have to appear very bright, just like any other dazzling light.

Of course, the limit to this is when you get to the IR region where your sensitivity is low enough that second-order effects (like thermal heating) become important. That's where the other answers pick up. But below that point, the light is either dazzling because it's bright, or not dazzling because it's dim. You can't have both dazzling and dim, from a signal-processing perspective.