Jiggel (new member, joined Dec 14, 2018):
Hi guys, my first multi-watt laser arrived today, a 3W 445nm build by Lifetime17 (which I am very much enjoying, thank you Lifetime17!). I wear my Eagle Pair glasses whenever I operate the laser, but I have read in several threads here that it is safe to view even high-powered lasers when they are aimed far away at a (preferably dark) diffuse surface. I have never tried taking off my glasses, and I won't unless you guys confirm that it is indeed going to be 100% safe to do so. So with the safety stuff out of the way, this is my question: at what approximate distance would looking at a 3W laser dot with the naked eye be safe (assuming a near perfect Lambertian surface)?
I am no physicist, but I do have some experience programming light simulation algorithms, so I did some rough calculations. I encourage anyone to correct or expand upon this.
First, I will make a few assumptions for simplicity's sake:
Let's assume the angle between the direction of incoming radiance and the surface normal is exactly 0 degrees (this matters because of Lambert's cosine law). This means our laser is being shone at a flat surface at a 90 degree angle from the surface (straight on).
The next assumption is that the laser outputs 3W continuously. Lastly, we will assume our surface has an albedo of 0.1 at 445nm (or whatever the laser's wavelength is). In other words, the surface is pretty dark, but nowhere near as dark as something like Vantablack.
Because our normal is aligned with our light direction, the angle of incidence is 0. Lambert's cosine law tells us that the diffusely reflected light is proportional to the cosine of the angle of incidence; in our case that is cos(0), or 1. Next, we can estimate the brightness reaching our eyes with an inverse-square attenuation function (the distance being the distance from our eyes to the laser's dot). I chose inverse-square falloff instead of a linear attenuation model because a small laser dot can be treated as a point light source, and point lights have inverse-square falloff. This is another assumption, because a laser dot is not exactly a point light, but fuck it, it's probably close enough.
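The inverse-square assumption is easy to sanity-check with a two-line sketch (a generic point source here, no laser-specific numbers):

```python
def falloff(intensity, distance):
    # Inverse-square law: received intensity drops with the square of distance.
    return intensity / distance ** 2

# Doubling the distance quarters the received intensity:
print(falloff(1.0, 1.0), falloff(1.0, 2.0))  # 1.0 0.25
```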
So, our formula is:
I = (N · L) * ((Li * kD) / distance^2)
Where I is the irradiance received by our eyes, N is the surface normal, L is the direction towards the light, Li is the incoming radiance, kD is the diffuse reflection coefficient, and distance is, well, the distance. Since our normal is aligned with the incoming light direction (we are aiming the laser straight on), the angle of incidence is 0, so the dot product becomes cos(0), or simply 1. Our function becomes:
I = (Li * kD) / (distance^2)
The variable kD represents the surface albedo, which is just the ratio of reflected to received irradiance. Our assumption is a ratio of 0.1, so our function becomes:
I = (Li * 0.1) / (distance^2).
We can see from this function that the intensity of the laser will quickly drop the further you are standing from the dot.
PS: Sorry for the piss-poor notation; I do graphics programming as a hobby, so I wrote this the way I would write my code.