Scubajoel527 (New member, joined Mar 29, 2021)
Hello everyone! I've been researching this problem and came across a few threads on this forum that had a lot of information, so I figured I would pose my specific question and see what you think.
Problem Statement: I would like to build a system using a near-infrared laser, preferably around 940 nm, to transmit a 2-inch diameter collimated beam at least 1/4 mile and produce a spot size no larger than 6 inches in diameter at the other end. If less divergence than this is achievable, that would be excellent. I believe this comes out to a divergence of approximately 0.12 mrad, or ~25 arc seconds. The beam needs to have an intensity at the target of at least 2 mW/m^2. Finally, I need to be able to modulate the beam using an on/off scheme at ~38 kHz.
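For what it's worth, here's a quick Python sanity check of that divergence number (pure geometry, no losses):

```python
import math

# Geometry from my problem statement
L = 0.25 * 1609.344            # 1/4 mile in meters
d_tx = 2 * 0.0254              # 2-inch transmit beam diameter, m
d_rx = 6 * 0.0254              # 6-inch max spot diameter, m

# Half-angle divergence: growth in beam *radius* over the path length
half_angle = ((d_rx - d_tx) / 2) / L                  # radians
print(round(half_angle * 1e3, 3))                     # mrad, ~0.126
print(round(half_angle * (180 / math.pi) * 3600, 1))  # arc seconds, ~26
```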
I haven't found many laser diodes that operate at this wavelength, and I've been kind of disappointed in the information available in their datasheets, but here's an example of one potential laser diode I have been considering. I guess I can't post links, but here's the manufacturer and p/n.
ThorLabs L904P010
Question 1: Is it possible to achieve divergence this low? If so, how would you do it? From what I've read, beam expanders help lower divergence, and I want a pretty fat beam, so if I started with a small laser diode like the one above and expanded it out to 2 inches, it seems like it could work. But I don't know how to quantify this, and I would like to do some level of design before just buying and testing things. I am reading Hecht's "Optics", but it doesn't seem like the best resource for this problem. If anyone knows how this design would be performed, or has any book recommendations, that would be great!
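In case it helps frame Question 1: assuming an ideal Gaussian beam (a big assumption for a bare diode, whose fast/slow axes would need circularizing first), the diffraction-limited half-angle divergence is λ/(π·w0). A 2-inch beam at 940 nm sits well under that limit's constraint, so the ~0.12 mrad target looks optically plausible if the collimation optics are good:

```python
import math

wavelength = 940e-9          # m
w0 = 1 * 0.0254              # beam waist radius = half the 2-inch diameter, m

theta = wavelength / (math.pi * w0)   # diffraction-limited half-angle, rad
print(round(theta * 1e3, 4))          # ~0.0118 mrad, roughly 10x below the target
```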
Question 2: What is the lowest transmit power I can use to still achieve this? I would like to use the lowest power possible, for safety reasons. From what I've read, lasers primarily lose intensity through divergence, and don't tend to lose much actual power at all. So here's my train of thought: a 2-inch diameter beam has an area of approximately 0.002 m^2, and if it expands to a 6-inch diameter, the area grows to about 0.018 m^2. So to get an intensity of, say, 5 mW/m^2 at 1/4 mile, assuming no losses other than the beam diameter increasing, I would only need to transmit about 0.09 mW. This seems like a crazy low amount of power; is that calculation right? Are there other power losses I am not considering?
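And the same kind of check for the power budget (same assumption: divergence is the only loss, which ignores atmospheric absorption and scintillation at 940 nm):

```python
import math

spot_radius = 3 * 0.0254                 # 6-inch-diameter spot, m
area_rx = math.pi * spot_radius ** 2     # receive spot area, m^2
target_intensity = 5e-3                  # W/m^2 (margin over the 2 mW/m^2 spec)

p_tx = target_intensity * area_rx        # required transmit power, W
print(round(area_rx, 4))                 # ~0.0182 m^2
print(round(p_tx * 1e3, 3))              # ~0.091 mW
```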
Question 3: I am already experimenting with a similar design using an NIR LED rather than a laser, but from what I can tell this kind of divergence is basically impossible without a laser. I am an EE, so I'm pretty familiar with driving LEDs, and I already designed a variable current driver board that blinks the LED at 38 kHz, logically ANDed with some serial data, at any current desired (up to approximately 1 amp). This works pretty well and lets me control the brightness of the LED while using it to transmit data. I would like to do something similar with a laser, but I haven't found much information at all on actually driving laser diodes. Can I treat it the same as an LED? Can the laser brightness be adjusted by adjusting the current, or is this a bad idea? I could also adjust the brightness with an adjustable aperture in the optics section and still produce a beam of the desired size; would that be better? Either way, I have to be able to blink the laser at 38 kHz ANDed with my data for my detection circuitry to work, so if I can't do that, this is dead in the water.
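To make the modulation scheme concrete, here's a sketch of what I mean by "blinking ANDed with data": a 38 kHz square carrier gated by the serial bits. The bit rate and sample rate here are made-up numbers just for illustration:

```python
fs = 1_000_000        # simulation sample rate, Hz (arbitrary)
f_carrier = 38_000    # carrier frequency, Hz
bit_rate = 1_000      # hypothetical serial bit rate, bits/s
bits = [1, 0, 1, 1]   # example data
spb = fs // bit_rate  # samples per bit

tx = []               # laser on/off state per sample
for i, b in enumerate(bits):
    for n in range(spb):
        t = (i * spb + n) / fs
        carrier_high = int(2 * f_carrier * t) % 2 == 0  # 38 kHz square wave
        tx.append(bool(b) and carrier_high)             # carrier AND data
```

During a 1-bit the output toggles at 38 kHz; during a 0-bit it stays off, which is the same gating my current LED detection circuitry relies on.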
Sorry about the novel, guys; I appreciate any insights you might have here.
Thanks!