Welcome to Laser Pointer Forums - discuss green laser pointers, blue laser pointers, and all types of lasers

The dangers of infrared - exaggerated?


Jun 28, 2019
Lasers in the near infrared range are often claimed to be particularly dangerous, and on paper there's good reason for that: near-IR passes through most of the eye largely unhindered to the retina, but unlike visible light, it would (presumably) do so without triggering any of the usual defensive reflexes.

Thus, I was quite surprised by the info in this study: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4722897/
where the ED50 for a 1.2 mm wide beam is 175 mW, at an exposure time of 100 seconds.

That seems very high, is the retinal absorption for IR just that much lower than with visible light, or am I missing some factor here?

For comparison, here's another study with shorter exposure durations: https://www.spiedigitallibrary.org/...-surgery/10.1117/1.JBO.17.9.095001.full?SSO=1

Which puts the damage threshold over 20 W/cm².
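To compare the two studies on the same footing, here's a hypothetical back-of-envelope sketch (assuming the 1.2 mm figure is the beam diameter and the power is spread uniformly over that spot; real beams are usually Gaussian, so treat this as rough):

```python
import math

# Assumption: 1.2 mm is the beam diameter and the 175 mW is spread
# uniformly over the spot.
power_w = 0.175                      # 175 mW ED50 from the first study
diameter_cm = 0.12                   # 1.2 mm beam
area_cm2 = math.pi * (diameter_cm / 2) ** 2
irradiance_w_per_cm2 = power_w / area_cm2
print(f"~{irradiance_w_per_cm2:.0f} W/cm^2")  # ~15 W/cm^2
```

So the 175 mW / 1.2 mm figure works out to roughly 15 W/cm², which is at least in the same ballpark as the >20 W/cm² threshold from the second study.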

Going by those numbers, it seems to me there's nothing particularly dangerous about near IR, aside from the fact that it's invisible. Probably less dangerous than visible light, even.

On that note: Would there actually be no warning signs upon exposure?
After all, the human eye can detect UV as well, to some extent.

Sep 20, 2013
I don't know the specific differences between a rabbit's eye and a human's, but the amount of tissue damage they insisted on, enough to show photographically that a white scar was present, could be one reason the exposure levels were so high in these tests. I know that damage can occur with less severe scarring than was used here. Also, the second study was done in part to show how femtosecond IR lasers are used in cataract surgery, so it is also different from what we are used to for damage to human retinas. I would not take the first example of 175 mW for 100 seconds as the necessary level of exposure before an IR laser damages a human retina.
Sep 12, 2007
Yeah, unless we get a biologist in here, it's all but meaningless. If we relied only on testing our food on dogs, then chewing gum, grapes, macadamia nuts, chocolate, and countless other things that we currently eat regularly would be considered highly toxic.

One feature that may make this irrelevant is that rabbits are dichromats and we are trichromats. We have a third type of cone, whose sensitivity happens to be closer to IR than either of a rabbit's two cone types.

IR is still less harmful (all else equal) than visible light for humans, as outlined by the MPE guidelines, but 175 mW in a 1.2 mm beam? Try that at your own peril.


May 8, 2009
ED50, as I understand it, means that 50% of exposures result in detectable damage. Now do the same study to achieve less than a 1 in 100,000 or 1 in 1,000,000 statistical chance of detectable damage, which are (I think) the requirements for Class IIIA and Class II in humans, respectively.

Now, do something a little barbaric by today's standards: ask some brave people who are about to have an eye removed surgically, or who already face unavoidable permanent blindness, to submit to a study where laser damage is induced while the patient is partially sedated but still able to report when a lesion forms. Slowly increase the power. Also ask the same study questions of people undergoing laser treatment of various retinal diseases, detached retinas, etc. Carefully measure the energies used, and analyze the measurement errors in the equipment.

Take that validated number and divide it by N for a safety margin, say 10 or 100, so you can be a good scientist / responsible government employee / sleep at night.

Use that data for a simple regulatory statement everybody can understand without doing math, while doing a more sophisticated analysis for things such as aerospace and battlefield safety, direct projection retinal displays, and medical procedures.

I.e., in US law we have 0.95 mW for Class II and a maximum of 4.95 mW for Class IIIA, plus the exposure tables for other uses...
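Taking the numbers quoted in this thread at face value (the poster's remembered class limits and the rabbit-study ED50, so purely illustrative arithmetic, not a regulatory calculation), the implied margin between the damage study and the class limits looks like this:

```python
ed50_mw = 175.0          # ED50 from the rabbit study quoted above
class_ii_mw = 0.95       # Class II limit as remembered in this post
class_iiia_mw = 4.95     # Class IIIA limit as remembered in this post

# Ratio of the damage-study power to each class limit.
print(f"Class IIIA margin: ~{ed50_mw / class_iiia_mw:.0f}x")  # ~35x
print(f"Class II margin:   ~{ed50_mw / class_ii_mw:.0f}x")    # ~184x
```

Which is roughly the kind of "divide by N so you can sleep at night" margin described above.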

Does that make sense, that exposures for the general public need to be so low as to be statistically unlikely to create any damage in order to count as safe?

The numbers above are from a really old memory of reading Dr. David Sliney's book 15 years ago.
Take them with a grain of salt.

EDIT: Also add a multiplication factor for accidental "aided viewing" through a set of eyeglasses / binoculars with 50 mm diameter lenses, which is a larger collection aperture than the 7 to 9 mm traditionally used for the human eye.
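That aided-viewing factor can be sketched the same way: light collection scales with aperture area, so assuming a 50 mm objective and the 7 to 9 mm pupil range mentioned above:

```python
objective_mm = 50.0                          # binocular objective diameter
for pupil_mm in (7.0, 9.0):
    # Area ratio = (diameter ratio) squared.
    gain = (objective_mm / pupil_mm) ** 2
    print(f"{pupil_mm:.0f} mm pupil -> ~{gain:.0f}x more light collected")
```

So a 50 mm objective collects on the order of 30x to 50x more light than the unaided eye, which is why the aided-viewing case gets its own correction factor.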
