a_pyro_is said:
Example:
If I had a laser with a diameter at the aperture of 0.05 inches (about 1.3 mm) and a measured divergence of 0.7 mrad, then at 1000 feet it should have a spot size of about 8.45 inches according to pseudonomen137's calculator.
http://www.pseudonomen.com/lasers/calculators/diameterCalculator.html
Now if I sent the 0.05 inch beam through a 10X beam expander/collimator, would I have a 0.5 inch beam with a 0.07 mrad divergence, giving me a spot size of only 1.34 inches at 1000 feet?
Am I understanding this right?
Yes, that's roughly the idea. To be technically correct, it's not the beam diameter at the aperture that matters, but the minimum beam diameter anywhere along the beam (the waist). However, yes, a properly aligned 10x beam expander could take your divergence down 10x.
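If you want to sanity-check those numbers without the web calculator, here's a minimal Python sketch of the same small-angle math it's doing (far-field spot ≈ starting diameter + divergence × distance; the function name is just my own):

```python
def spot_size_in(start_diameter_in, divergence_mrad, distance_ft):
    """Far-field spot size: starting diameter plus the full-angle
    divergence spread accumulated over the distance."""
    distance_m = distance_ft * 0.3048
    spread_in = (divergence_mrad * 1e-3 * distance_m) / 0.0254
    return start_diameter_in + spread_in

print(spot_size_in(0.05, 0.7, 1000))   # ~8.45 in, the bare laser
print(spot_size_in(0.5, 0.07, 1000))   # ~1.34 in, after the 10x expander
```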
Here is the important relationship to remember:
Divergence is proportional to the wavelength, and inversely proportional to the minimum beam diameter.
Therefore, if you want to lower divergence, you either need to increase the beam diameter, or decrease the wavelength (the first being the more practical of the two for our purposes).
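To put a rough number on that relationship: for a Gaussian beam, the diffraction-limited full-angle divergence works out to theta = M^2 * 4 * lambda / (pi * D). A sketch under that assumption (full-angle divergence, 1/e^2 diameter, both explained below), with M^2 = 1 for the perfect-beam case:

```python
import math

def min_divergence_mrad(min_diameter_mm, wavelength_nm=532.0, m_squared=1.0):
    """Diffraction-limited full-angle divergence of a Gaussian beam:
    theta = M^2 * 4 * lambda / (pi * D)."""
    wavelength_mm = wavelength_nm * 1e-6
    theta_rad = m_squared * 4 * wavelength_mm / (math.pi * min_diameter_mm)
    return theta_rad * 1e3  # radians -> milliradians

print(min_divergence_mrad(1.0))    # ~0.68 mrad for a perfect 1 mm green beam
print(min_divergence_mrad(10.0))   # ~0.068 mrad: 10x the diameter, 1/10 the divergence
```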
Also, keep in mind that there is a theoretical perfect beam. The M^2 attribute is a measure of how many times worse than a perfect beam (at that wavelength) the laser is. For green DPSS lab lasers, you'll find the M^2 beam spec is usually <1.2, and the perfect beam would be exactly 1 (M^2 can never be less than 1 because of diffraction). With multi-mode red diodes, your M^2 is often closer to 20 (hence why I complain about them so much).
The M^2 is basically a factor based on the divergence, minimum beam diameter, and wavelength. So, for instance, if you're considering the full-angle divergence and 1/e^2 beam diameter (don't worry about what that means, I'm just trying to be correct so no smart-ass complains) of a 532nm laser, the best possible beam would have a (divergence) x (minimum diameter) product of roughly 0.678 mm*mrad.
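Turning that around, you can estimate a laser's M^2 from a measured waist diameter and divergence by dividing their product by that diffraction limit. Another sketch, same full-angle/1/e^2 assumptions, with an example measurement I made up:

```python
import math

def m_squared(min_diameter_mm, divergence_mrad, wavelength_nm=532.0):
    """M^2 = (measured divergence x waist diameter) / (4*lambda/pi)."""
    limit_mm_mrad = 4 * wavelength_nm * 1e-3 / math.pi  # ~0.678 for 532 nm
    return min_diameter_mm * divergence_mrad / limit_mm_mrad

# e.g. a 1 mm waist measured at 0.8 mrad full-angle divergence:
print(m_squared(1.0, 0.8))   # ~1.18, inside the <1.2 spec of a good DPSS lab laser
```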