
#### BrittanyGulden

##### Guest

Assume I have a laser diode with an optical output of 1 W, and a pupil size of 0.25" in my eye (at night). If I shined that 1 W laser directly into my eye, I would be blinded instantly. However, what if I put that 1 W through a lens and dispersed it over a circular area? I should only need a large enough circle so that the fraction entering my pupil is a safe amount, correct? Area of a circle = pi * r^2, so I can just take the ratio of the pupil's area to the larger circle's area. I only need an "eye-safe level," which I'll call 1 mW. So I have (diode power) * (pi * (0.25")^2) / (pi * r^2) = 1 mW. Hence 1000 * (1/16) / r^2 = 1, giving r ≈ 8", or about 16" in diameter. This means that if I have 1 watt of light going through a lens and spreading out into a 16" circle of light, and that circle shines directly into my eye, my eye will only receive about 1 milliwatt, correct?
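For what it's worth, the arithmetic above can be sketched in a few lines. This is only a restatement of the ratio in the post, taking 0.25" as the pupil *radius* (as the formula does) and 1 mW as the assumed "eye-safe" level; it says nothing about whether those assumptions are physically sound:

```python
import math

diode_power_mw = 1000.0   # 1 W laser, expressed in milliwatts
pupil_radius_in = 0.25    # 0.25" treated as the pupil radius, as in the formula above
safe_level_mw = 1.0       # assumed "eye-safe" level from the post

# Power entering the pupil = total power * (pupil area / spot area)
#   diode_power * (pi * pupil_r^2) / (pi * spot_r^2) = safe_level
# The pi cancels, so solving for the spot radius:
spot_radius_in = pupil_radius_in * math.sqrt(diode_power_mw / safe_level_mw)

print(round(spot_radius_in, 1))  # ~7.9", i.e. roughly a 16" diameter circle
```

Note the pi cancels in the ratio, so the answer only depends on the square root of the power ratio.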

Is this correct? And if so, how does exposure time play a part, if at all?

-Thank you