Constant Current vs. Constant Voltage, Heatsinking and Thermal Runaway - Explained

Benm
Joined: Aug 16, 2007 | Messages: 7,896 | Points: 113
Re: Constant Current vs. Constant Voltage - Explained

It's a bit of a slow video, but it demonstrates the issue well.

I'm surprised the LED survived the last experiment; in some cases they get hot enough to stop emitting (much) light, possibly even de-soldering themselves from the circuit board.

For a laser diode the ordeal would probably have been fatal: a 30% increase in current is often enough to kill one that is already operating above spec (as all of us do here ;) ).
 
Joined: Jan 12, 2008 | Messages: 3,290 | Points: 83
Re: Constant Current vs. Constant Voltage - Explained

Yeah, if I didn't have the current sense resistor the LED probably would have died, and it was just 0.1 ohm! And in the Kipkay hack there is NOTHING between the diode and the batteries...
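To put rough numbers on how much ballast a series resistor actually adds, here is a quick Python sketch. All of the component values in it are made-up, typical-ish figures for a roughly 1 A white power LED (a slope resistance around 0.4 ohm and a 120 mV thermal sag in forward voltage), not measurements from the video:

```python
# Rough sketch: how much a series resistor limits the current rise when an
# LED's forward voltage sags with temperature. All values are hypothetical.

R_DYNAMIC = 0.4  # ohm, assumed slope (dynamic) resistance of the LED near 1 A
VF_SAG = 0.12    # volt, assumed drop in forward voltage as the junction heats up

for r_series in (0.0, 0.1, 1.0):
    # With a simple knee-plus-slope LED model, the current rise for a given
    # forward-voltage sag is just delta-V divided by the total series resistance.
    extra_current = VF_SAG / (R_DYNAMIC + r_series)
    print(f"R_series = {r_series:3.1f} ohm -> current climbs about "
          f"{extra_current * 1000:3.0f} mA for a {VF_SAG * 1000:.0f} mV sag")
```

Even 0.1 ohm takes a visible bite out of the current rise; a bigger resistor (or the battery's own internal resistance, as the next post points out) takes a much bigger one.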
 

Benm
Joined: Aug 16, 2007 | Messages: 7,896 | Points: 113
Re: Constant Current vs. Constant Voltage - Explained

That relies on the internal resistance of the batteries for the same purpose. The downside is that this is always an unknown factor: it can be as little as 0.1 ohm for a AA battery, and it goes up as the battery wears down.

If you were to wire a red LD to a 3.0V hard voltage source (lab supply with current limiting off), it would probably result in an instant diode poof ;)
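To put hedged numbers on that: the sketch below uses a made-up red LD characteristic (knee around 1.9 V, slope resistance around 1.5 ohm, absolute maximum around 250 mA; check a real datasheet for a real part) and compares a stiff 3.0 V supply against the same 3.0 V behind various amounts of battery internal resistance:

```python
# Sketch of why a "hard" 3.0 V source kills a red laser diode while batteries
# sometimes get away with it. The LD is modeled as a knee voltage plus a slope
# resistance; every number here is a hypothetical value, not from a datasheet.

V_SUPPLY = 3.0    # volt
V_KNEE = 1.9      # volt, assumed turn-on knee of the red LD
R_SLOPE = 1.5     # ohm, assumed slope resistance of the LD above the knee
I_ABS_MAX = 0.25  # amp, assumed absolute-maximum current for the diode

loop_resistances = {
    "lab supply, current limit off": 0.0,
    "two fresh AA cells (about 0.25 ohm each)": 0.5,
    "two tired AA cells": 2.0,
    "well-worn cells plus thin wiring": 4.0,
}

for label, r_loop in loop_resistances.items():
    current = (V_SUPPLY - V_KNEE) / (R_SLOPE + r_loop)
    verdict = "poof" if current > I_ABS_MAX else "survives (for now)"
    print(f"{label:40s}: {current * 1000:3.0f} mA -> {verdict}")
```

Which row you land on depends entirely on a resistance nobody specifies, and it changes as the cells discharge; that is exactly why relying on it is a gamble.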
 
Joined: Jan 12, 2008 | Messages: 3,290 | Points: 83
Re: Constant Current vs. Constant Voltage - Explained

Oh, and I might make another video showing what happens with a high-power LED with just a current-limiting resistor instead of a driver. Like Kipkay's phaser...
 

Benm
Joined: Aug 16, 2007 | Messages: 7,896 | Points: 113
Re: Constant Current vs. Constant Voltage - Explained

You could... but it would be less dramatic than the constant-voltage example shown.

Running LEDs off a fixed voltage source with a series resistor is pretty common practice, and it often works out fine as long as there is enough voltage drop across the resistor (typically something like a 3V LED fed off a 5V supply through 100 ohms for 20 mA). The deviation in current isn't that big if the forward voltage drops by 100 mV or so.

The problem starts when people push that principle a bit too far, trying to run a 3.3V LED off a 3.6V battery and expecting everything to behave the same as with the little LEDs they are used to.

The same goes for laser diodes: you could run a red LD from a 12 volt supply with only a series resistor just fine, though that would waste a shitload of power in the resistor.
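Putting those three cases into numbers (the red-LD operating point of roughly 2.3 V at 250 mA is an assumed typical value, not from a datasheet):

```python
# Worked versions of the three cases above, straight Ohm's law.
# The red-LD operating point (2.3 V at 250 mA) is an assumed typical value.

def series_current(v_supply, v_forward, r_series):
    """Current through an LED/LD fed from v_supply via a series resistor."""
    return (v_supply - v_forward) / r_series

# Case 1: 3 V LED off a 5 V supply through 100 ohm -> plenty of headroom.
i_nominal = series_current(5.0, 3.0, 100.0)   # 20 mA by design
i_hot = series_current(5.0, 2.9, 100.0)       # Vf sags 100 mV when warm
print(f"5 V / 100 ohm : {i_nominal*1e3:.1f} mA -> {i_hot*1e3:.1f} mA "
      f"({(i_hot/i_nominal - 1)*100:+.0f}%)")

# Case 2: 3.3 V LED off a 3.6 V cell, resistor chosen for 20 mA -> tiny headroom.
r_tiny = (3.6 - 3.3) / 0.020                  # 15 ohm
i_nominal = series_current(3.6, 3.3, r_tiny)
i_hot = series_current(3.6, 3.2, r_tiny)      # same 100 mV sag
print(f"3.6 V / {r_tiny:.0f} ohm : {i_nominal*1e3:.1f} mA -> {i_hot*1e3:.1f} mA "
      f"({(i_hot/i_nominal - 1)*100:+.0f}%)")

# Case 3: red LD (assumed ~2.3 V at 250 mA) off 12 V with only a resistor.
i_ld = 0.250
r_ld = (12.0 - 2.3) / i_ld                    # about 39 ohm
p_res = i_ld**2 * r_ld                        # power burned in the resistor
p_ld = i_ld * 2.3                             # power delivered to the diode
print(f"12 V / {r_ld:.0f} ohm : {p_res:.1f} W in the resistor vs "
      f"{p_ld:.2f} W in the diode ({p_res/(p_res + p_ld)*100:.0f}% wasted)")
```

Same 100 mV of forward-voltage sag: about +5% current in the 5 V case and about +33% in the 3.6 V case; and at 12 V roughly four fifths of the input power just heats the resistor.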
 
Joined: Jan 12, 2008 | Messages: 3,290 | Points: 83
Re: Constant Current vs. Constant Voltage - Explained

Yes, it will work fine at low currents, but it will start to drift at higher currents as the temperature builds up, much like in the last experiment, though not quite as dramatically.
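That drift is a feedback loop: more current makes more heat, heat lowers the forward voltage, and a lower forward voltage pulls yet more current from a voltage source. The sketch below iterates that loop for two hypothetical setups (every component and thermal value is a made-up illustration): one with plenty of headroom and a decent heatsink, and one with tiny headroom, only 0.1 ohm of ballast and poor heatsinking:

```python
# Thermal-drift sketch for an LED on a constant-voltage supply with a series
# resistor. The LED is modeled as a knee voltage (with a negative temperature
# coefficient) plus a slope resistance. All parameter values are hypothetical.

V_KNEE_25 = 2.85    # volt, assumed knee voltage at 25 C
R_SLOPE = 0.4       # ohm, assumed slope resistance of the LED
TEMPCO = -0.002     # volt per degree C, assumed shift of the knee voltage

def settle(v_supply, r_series, r_thermal, steps=30):
    """Iterate the current -> heat -> forward-voltage loop until it settles."""
    t_junction = 25.0
    for _ in range(steps):
        v_knee = V_KNEE_25 + TEMPCO * (t_junction - 25.0)
        current = (v_supply - v_knee) / (R_SLOPE + r_series)
        v_forward = v_knee + R_SLOPE * current
        t_junction = 25.0 + r_thermal * v_forward * current  # heating from LED power
    return current, t_junction

for label, v_supply, r_series, r_thermal in [
    ("5.0 V, 2 ohm series, good heatsink (15 C/W)", 5.0, 2.0, 15.0),
    ("3.3 V, 0.1 ohm series, poor heatsink (40 C/W)", 3.3, 0.1, 40.0),
]:
    i_cold = (v_supply - V_KNEE_25) / (R_SLOPE + r_series)
    i_hot, t_final = settle(v_supply, r_series, r_thermal)
    print(f"{label}:")
    print(f"  cold: {i_cold*1e3:4.0f} mA   settled: {i_hot*1e3:4.0f} mA "
          f"at roughly {t_final:3.0f} C junction")
```

In the second case the numbers "settle" around 1.8 A and a junction near 250 C, which in reality means a dead LED well before that point; that is the thermal runaway in the thread title.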
 



