Voltage regulation of laser diodes? (Instead of current regulation)

Is it possible to use voltage regulation instead of current regulation when driving laser diodes? The reason I ask is that using the LM317 in voltage-regulation mode would lower the dropout voltage compared to using it in current-regulation mode (as in the DDL driver). In current regulation, the LM317 drops its 1.25 V Vref across a resistor on top of the output. Wiring the LM317 for voltage regulation would fold the 1.25 V Vref into the total output voltage, resulting in a lower overall dropout voltage.
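To put rough numbers on the headroom argument, here's a quick sketch. The forward voltage, dropout figure, and set current in it are assumptions for illustration, not measured values.

```python
# Rough headroom comparison: LM317 as a constant-current source vs. as a
# plain voltage regulator. All figures below are assumptions for illustration.

VREF = 1.25       # LM317 reference voltage (V), from the datasheet
DROPOUT = 2.0     # assumed LM317 dropout at moderate current (V)
VF_DIODE = 3.0    # assumed laser diode forward voltage (V)
I_SET = 0.200     # assumed drive current (A)

# Current regulation: the full 1.25 V reference sits across the set resistor,
# on top of the diode voltage and the regulator's own dropout.
r_set = VREF / I_SET
vin_min_cc = VF_DIODE + VREF + DROPOUT

# Voltage regulation: the reference becomes part of the programmed output,
# so no extra 1.25 V is stacked on top of it.
vin_min_cv = VF_DIODE + DROPOUT

print(f"Current mode: R_set = {r_set:.2f} ohm, min input = {vin_min_cc:.2f} V")
print(f"Voltage mode: min input = {vin_min_cv:.2f} V")
```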

For that matter, why exactly do we use current regulation anyway? What are the advantages of current regulation, and what are the consequences of using voltage regulation?
 
From what I know, we use current regulation because LDs are very stupid and very hungry: as they heat up they draw more and more current, so regulating the current keeps the LD from overpowering itself. (Please, anyone correct me if I am wrong, I wish to learn too :) )
 
The voltage drop across the diode can vary: a fixed voltage may work one day but blow the diode the next. Take a look at a U-I graph; voltage regulation just isn't the way to go, it's very unstable. A constant-current driver is much better and quite stable, but photodiode feedback would be best.

An LM317-based driver is indeed quite crude, dissipating quite a lot in a resistor for higher-power diodes because of the required 1.25 V drop. I typically have 0.1 V across the sense resistor.
I use MOSFETs for high currents, but I have more ordinary transistors around, so I use those mostly. They typically have about a 0.1 V minimum voltage drop. I only build full-size drivers, though, so I worry more about the design than about waste heat.
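To see why a fixed voltage is so touchy, here's a toy exponential diode model; the saturation current and emission coefficient are invented just to get a plausible-looking curve, not real laser diode parameters.

```python
import math

# Toy exponential diode model to show how steep the I-V curve is above
# threshold. The saturation current and emission coefficient are invented
# to give a plausible-looking curve, not real laser diode parameters.

I_S = 1e-12   # saturation current (A), assumed
N = 2.0       # emission coefficient, assumed
V_T = 0.026   # thermal voltage at room temperature (V)

def diode_current(v):
    """Current through the toy diode at forward voltage v (series R ignored)."""
    return I_S * (math.exp(v / (N * V_T)) - 1)

v0 = 1.30                      # an operating point on the steep part of the curve
i0 = diode_current(v0)
i1 = diode_current(v0 * 1.02)  # the same diode with 2% more voltage

print(f"{i0 * 1000:.1f} mA at {v0:.2f} V")
print(f"{i1 * 1000:.1f} mA after a 2% voltage increase ({i1 / i0:.2f}x the current)")
```

With these made-up numbers, a 2% bump in voltage gives roughly 65% more current.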
 
Thanks for the responses,

From what I've read, I'm guessing voltage regulation isn't as good for regulating laser diodes. But will it still be able to power LDs? I'm not looking to perfectly regulate my diode; I'm fine with something that's "good enough". I've read about people using direct drive (i.e. no driver) to power their diodes; while this probably isn't the best way to power a laser, it seems to at least work to some extent. That gives me some confidence that voltage regulation may be usable with laser diodes. I could be wrong though, maybe there are some long-term consequences of voltage regulation that I'm overlooking?

@Bluefan

You're right, each and every diode has a different voltage drop, but does it really vary on a day-to-day basis? If it doesn't, then the voltage can be set to correspond to the desired current. Granted, the current may increase as the diode heats up, but I'm hoping the effect will be relatively minimal. Additionally, setting the voltage to a conservative value would ensure the diode never gets over-driven by the current increase from heating.

BTW, what driver are you using that gets a 0.1 V dropout? And what U-I graph are you referencing? I can't seem to find it :thinking:

Thanks
 
It's just the instability of using a set voltage. The diode will run perfectly well from either a voltage source or a current source. But with a little instability, a tiny change in voltage is a HUGE change in current, while a small change in current is an even smaller change in voltage.

So yeah, it'll work perfectly well to drive with a voltage source; our diode testing lab here is set up with a voltage source for all diode testing. But temperature and a whole host of other things can change the diode's behavior on a day-to-day basis, and with a voltage source that small change can give a huge change in current, so it's "safer" to drive with a current source, especially over longer periods of time. A few percent variation in current is safe, but a few percent variation in voltage is deadly.
 
Well, current regulation is there because, once the diode is past its threshold voltage, the main factor in the output power is the current. The diodes can, in theory, take as much voltage as you can give them. Physically they may burn out from too large a voltage drop acting across the resistive properties of the diode, or from other voltage-related effects, but the voltage itself really isn't the issue.

If you want greater efficiency, which is probably your actual goal, you should look into boost-based drivers like the Lavadrivers, which provide a set current at whatever voltage is needed, with efficiencies around 90% or so. You could also try something like a Joule Thief, but current-regulated.
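As a rough illustration of the efficiency difference between a linear regulator and a ~90% switcher (all voltages and currents below are assumed, not measured):

```python
# Back-of-the-envelope comparison of a linear (LM317-style) regulator and a
# roughly 90%-efficient switching driver. All values are assumptions.

VF_DIODE = 4.5   # assumed diode forward voltage (V)
I_DRIVE = 0.300  # assumed drive current (A)
V_BATT = 7.4     # assumed battery voltage, e.g. two Li-ion cells (V)

p_diode = VF_DIODE * I_DRIVE

# Linear: the same current flows from the battery, and everything above the
# diode voltage is burned off as heat in the regulator and resistor.
p_in_linear = V_BATT * I_DRIVE

# Switcher: assume ~90% conversion efficiency regardless of input voltage.
p_in_switcher = p_diode / 0.90

print(f"Diode power:    {p_diode:.2f} W")
print(f"Linear input:   {p_in_linear:.2f} W ({p_diode / p_in_linear:.0%} efficient)")
print(f"Switcher input: {p_in_switcher:.2f} W (90% efficient)")
```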
 
It's just the instability of using a set voltage. The diode will run perfectly well from either a voltage source or a current source. But with a little instability, a tiny change in voltage is a HUGE change in current, while a small change in current is an even smaller change in voltage.

So yeah, it'll work perfectly well to drive with a voltage source; our diode testing lab here is set up with a voltage source for all diode testing. But temperature and a whole host of other things can change the diode's behavior on a day-to-day basis, and with a voltage source that small change can give a huge change in current, so it's "safer" to drive with a current source, especially over longer periods of time. A few percent variation in current is safe, but a few percent variation in voltage is deadly.

LM317s have line regulation of ±0.01% as listed on their data sheet. That seems pretty stable to me, but do you think that is stable enough to drive a laser diode?

Well, current regulation is there because, once the diode is past its threshold voltage, the main factor in the output power is the current. The diodes can, in theory, take as much voltage as you can give them. Physically they may burn out from too large a voltage drop acting across the resistive properties of the diode, or from other voltage-related effects, but the voltage itself really isn't the issue.

If you want greater efficiency, which is probably your actual goal, you should look into boost-based drivers like the Lavadrivers, which provide a set current at whatever voltage is needed, with efficiencies around 90% or so. You could also try something like a Joule Thief, but current-regulated.

I'm not too concerned about efficiency actually. It's just that achieving a lower dropout voltage allows me to use a lower input voltage (and thus use fewer batteries).
 
LM317s have line regulation of ±0.01% as listed on their data sheet. That seems pretty stable to me, but do you think that is stable enough to drive a laser diode?

I'm talking about the instability of the laser diode. The LM317 is going to give it the same voltage every time, I know that. But one day, giving a laser diode 5V may only result in 100mA. Giving the same diode 5V on a different day could easily wind up giving it 200mA.

These numbers are made up and on the extreme side, but they convey the point: above threshold, the voltage-versus-current curve of a laser diode is very flat, and that curve also moves a bit from time to time, with temperature and the general finicky-ness of electronics.

It's very possible to drive with a voltage source; people do it all the time. But with handheld drivers that are set-it-and-forget-it, and laser diodes that have very flat voltage-versus-current curves and are being overdriven to the very edge, it's just safer to use a current source, because a small change in voltage is a big change in current.
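To make the "same voltage, different current" point concrete, here's a toy exponential diode model with its curve shifted slightly to mimic a warmer diode; the parameters and the -2 mV/°C shift are generic assumptions, not real laser diode data.

```python
import math

# Toy exponential diode model with the curve shifted slightly to mimic a
# warmer diode. The -2 mV/°C figure is a generic junction tempco, assumed
# here rather than measured on any real laser diode.

I_S = 1e-12      # saturation current (A), assumed
N = 2.0          # emission coefficient, assumed
V_T = 0.026      # thermal voltage (V)
TEMPCO = -0.002  # assumed forward-voltage shift per °C (V)

def current_at(v_drive, delta_t):
    """Current at a fixed drive voltage after the curve shifts by delta_t °C."""
    v_effective = v_drive - TEMPCO * delta_t  # Vf falls, so effective V rises
    return I_S * (math.exp(v_effective / (N * V_T)) - 1)

V_DRIVE = 1.30  # fixed "set it and forget it" drive voltage (V)
for dt in (0, 10, 20):
    print(f"{dt:2d} °C warmer: {current_at(V_DRIVE, dt) * 1000:6.1f} mA")
```

With these made-up numbers, 20 °C of warming roughly doubles the current at the same fixed voltage.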
 
A typical diode's current rises very steeply with voltage above threshold, and that steep region is exactly the area of interest. A fixed voltage around Vd only has to deviate a tiny bit to change the current by a lot. And once temperature comes into play, this can turn into a runaway situation: Vd drops because of the temperature, raising the current, which heats the diode further.

So voltage regulation is only safe at the very beginning of the curve, where the current doesn't rise that fast with a changing voltage. Current regulation, however, works very well; the output of a diode is reasonably constant with current, although temperature plays a role here too. Photodiode feedback would be very good in terms of stability.
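To put rough numbers on that runaway loop, here's a crude self-heating simulation using the same toy diode model as above; the thermal resistance and tempco are assumed values.

```python
import math

# Crude self-heating loop at a fixed drive voltage: dissipation warms the
# junction, Vd falls, the current rises. Same toy diode as above; the
# thermal resistance and tempco are assumed values.

I_S = 1e-12      # saturation current (A), assumed
N = 2.0          # emission coefficient, assumed
V_T = 0.026      # thermal voltage (V)
TEMPCO = -0.002  # assumed Vd shift per °C (V)
R_TH = 50.0      # assumed junction-to-ambient thermal resistance (°C/W)
V_DRIVE = 1.30   # fixed drive voltage (V)

temp_rise = 0.0
for step in range(6):
    v_eff = V_DRIVE - TEMPCO * temp_rise  # curve shifts as the diode warms
    i = I_S * (math.exp(v_eff / (N * V_T)) - 1)
    print(f"step {step}: dT = {temp_rise:4.1f} °C, I = {i * 1000:5.1f} mA")
    temp_rise = R_TH * V_DRIVE * i        # heating from the new dissipation
```

With these particular numbers the current creeps up from about 72 mA and settles roughly 25% higher; with worse heatsinking or a steeper curve it doesn't have to settle at all.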

I build my own drivers, but those are usually lab-style, not that small. The lowest input voltage I think I could reach would be 1.8 V; below that there are no op-amps available. And the 1.8 V op-amps are not the cheapest, about $2.50 apiece in somewhat larger quantities. 2.7 V would be easier, 3 V plain easy. After about a 0.1 V drop across a single transistor, that would leave 1.7 to 2.9 V minimum for the diode.

The challenge would be getting this into a small package and soldering all the SMD components, but it would make a nice clean linear driver, with no switchmode spikes. That's my trouble with all the flashlight mods; I just build lab-style stuff.

EDIT: a 0.1 V voltage drop limits the choice of transistor more than I thought. It is perfectly possible, but it would be a bit difficult for higher currents. 0.3 to 0.6 V is more common, although those are typically maximum ratings, so the real drop is probably below that. How much current do you need?
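For comparison with the LM317 numbers earlier in the thread, here's the headroom and dissipation math for that kind of op-amp + pass-transistor driver with a 0.1 V sense resistor; the diode voltage, current, and transistor saturation drop below are assumptions, not a worked design.

```python
# Headroom and dissipation for an op-amp + pass-transistor linear driver with
# a 0.1 V sense resistor, compared against the LM317's fixed 1.25 V reference
# drop. Diode voltage, current, and transistor drop are assumptions.

I_SET = 0.250   # assumed drive current (A)
V_SENSE = 0.1   # chosen drop across the sense resistor (V)
V_SAT = 0.1     # assumed minimum drop across the pass transistor (V)
VF_DIODE = 3.0  # assumed diode forward voltage (V)

r_sense = V_SENSE / I_SET
p_sense = V_SENSE * I_SET     # heat in the 0.1 V sense resistor
p_lm317 = 1.25 * I_SET        # heat a 1.25 V reference drop would make

vin_min = VF_DIODE + V_SENSE + V_SAT

print(f"Sense resistor: {r_sense:.2f} ohm, {p_sense * 1000:.0f} mW "
      f"(vs {p_lm317 * 1000:.0f} mW for the LM317's 1.25 V)")
print(f"Minimum input voltage: about {vin_min:.2f} V")
```

The op-amp still needs its own minimum supply on top of this, of course, per the 1.8 V parts mentioned above.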
 
It's possible but dangerous. Anyway, to do it with an LM317 you'll need two resistors instead of one, so what's the upside?
 
I just thought of something: it's possible to reduce the needed 1.25 V drop in a similar way to how an LM317 is used in voltage mode, with the same restriction that the output voltage can't go below 1.25 V. I'll make a circuit diagram tomorrow, unless I find some time today.
 

