VERY Simple Driver Theory Question

I have a basic understanding of how current regulating drivers work but I have a specific question about them.

I know that a laser requires a certain current and voltage from the driver to operate properly. In current regulating drivers the amount of current output is controlled. Does this mean that the voltage is determined by the diode itself running at this preset current?


Basically I'm asking: if the driver outputs, say, 1000mA, do I need to worry about the voltage, or will the diode dictate that itself?

I'm thinking of ordering some LED drivers online and messing around with using those instead of laser drivers, and I need to know what specs are necessary.
 





The diode pulls the voltage necessary to operate. The current is what's critical, and the drivers used for LDs have an operating range that limits the voltage available to the diode. AFAIK the diode will not pull excessive voltage over what's required to operate. Hope this helps, and if I'm in error someone will chime in to correct me.
 
So any driver that's outputting the correct current, and has a range of voltage that includes what the diode needs, will work?
 
So, in theory, suppose you stacked (wired in series) 30 lithium 3V button cell batteries that can supply about 10mA of current. You would have a power source supplying 90V. However, your current wouldn't be high enough to destroy a 445nm 1W diode.

Would that diode really be able to draw only its required 4 or 5 volts from the 90V supplied?

The theory of this just seems so backwards and difficult to become comfortable with.
 
Yes, that is correct. The closer you can get the input voltage to the requirement of the diode, the better. Any voltage in excess of what the diode needs will be converted to heat by the driver. For a DDL driver you need approximately 2.5 V over the forward voltage of the diode.
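To put rough numbers on that, here is a quick sketch; the 4.5 V forward voltage, 1 A set current, and 8.4 V supply are example figures, not from any particular datasheet:

```python
# Rough headroom / heat estimate for a linear (DDL-style) current driver.
# Example numbers only -- check your own diode's datasheet.

vf = 4.5        # diode forward voltage at the set current (volts, assumed)
i_set = 1.0     # driver set current (amps, assumed)
dropout = 2.5   # approximate extra voltage the driver needs above Vf

v_in_min = vf + dropout            # minimum supply voltage for stable regulation
v_in = 8.4                         # e.g. two Li-ion cells in series (assumed)
p_heat = (v_in - vf) * i_set       # everything above Vf is burned off as heat

print(f"Minimum input voltage: about {v_in_min:.1f} V")
print(f"Heat dissipated in the driver at {v_in} V in: about {p_heat:.1f} W")
```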
 
So, in theory, suppose you stacked (wired in series) 30 lithium 3V button cell batteries that can supply about 10mA of current. You would have a power source supplying 90V. However, your current wouldn't be high enough to destroy a 445nm 1W diode.

Would that diode really be able to draw only its required 4 or 5 volts from the 90V supplied?

The theory of this just seems so backwards and difficult to become comfortable with.

Understandably so.

Normally, current is what we control with a circuit, so we're used to providing the correct parameters, i.e. voltage and resistance, for our needed current to flow.

But since a diode has very low resistance when it's forward biased, we need a new way to control the current through the diode, since by nature a diode will take whatever current you give it before frying itself.

The depletion layer in the diode is what determines the voltage that the diode must drop if a current is to pass through it. That's the thin layer where there are few "holes" and few "non-localized" electrons. The atoms in this layer are chemically happy, so they refuse to become as "p" or "n" type as they are supposed to be. So, to push a required current through this layer, you need a little more energy to do so - thus a small voltage is dropped.

An ideal diode has a voltage drop of zero. Just remember that, and you'll remember that we're dealing with LASER diodes, which contain weird doping materials and strange crystal structures. It's a SEMIconductor laser. It "kind of" conducts well. It just needs a little bit of a "push" to get that current through!
 
Ehm ..... be careful with these premises, sorry.

It's true that an LD, same as LEDs, works on current and not on voltage, but it's wrong to think that the LD will automatically not pull excessive voltage over what's required to operate .....

Also, the current that flows in a load is determined by the resistance (or, more exactly, the equivalent series resistance) of the load, and by the voltage that you apply across this load .....

Let me give an example: if you have a load with a resistance of 1 ohm, you will have a current flowing through it of 1 A for each volt that you apply to the load ..... if your load has 0.1 ohm resistance, the current is 10 A for each volt, and so on .....

Now, imagine a laser diode as "a low resistance with a voltage threshold". That basically means the resistance is high below a certain voltage value, and above this (the FV) it becomes very low for the part of the voltage that is over the FV (a bit like a zener diode) ..... The current drivers that we use act by regulating the voltage at the LD terminals so that the current flowing through it remains stable, and this is done by placing a sense resistor in the current path (basic Ohm's law: the current in the circuit is the same for all the elements in series), measuring the voltage drop caused across this sense resistor by the flowing current, and using this as feedback for the regulation circuit ..... When for some reason (junction heating, for example) the current starts to change, the driver modifies the voltage to keep it constant.
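As a concrete illustration of that sense-resistor idea, here is a minimal sketch assuming a linear regulator wired as a current source (the classic DDL arrangement, which holds roughly 1.25 V across the sense resistor); the target current is just an example:

```python
# Sense-resistor sizing for a linear current regulator (DDL-style).
# The regulator adjusts its output so the voltage across the sense
# resistor stays at its reference (about 1.25 V for an LM317-type part),
# which fixes the current at I = Vref / Rsense.

v_ref = 1.25      # regulator reference voltage (volts)
i_target = 0.5    # desired diode current (amps, example value)

r_sense = v_ref / i_target    # required sense resistance (ohms)
p_sense = v_ref * i_target    # power dissipated in the sense resistor (watts)

print(f"Rsense = {r_sense:.2f} ohm, dissipating {p_sense:.2f} W")
# -> Rsense = 2.50 ohm, dissipating 0.62 W, so use a resistor rated well above that.
```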

But, with the low series resistance of the LD in its working state, a small increase of the voltage is usually enough to give you a big increase of the current. For this reason the drivers work in current regulation, because you cannot get a voltage regulation stable enough to guarantee that the current doesn't kill your diode (and, if you give it enough voltage, it takes all the current that it can, until it blows up).
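To see just how steep that voltage-to-current relationship is, here is a toy calculation using the ideal (Shockley) diode relation; the ideality factor is a made-up textbook value, and real laser diodes also have some series resistance that softens the effect:

```python
import math

# In an ideal diode the current grows as exp(V / (n * Vt)), so a small
# voltage step of dV multiplies the current by exp(dV / (n * Vt)).
# n is an assumed ideality factor; Vt is the thermal voltage at room temp.

n = 2.0       # ideality factor (assumed)
vt = 0.026    # thermal voltage, ~26 mV at room temperature

for dv in (0.05, 0.10, 0.20):
    factor = math.exp(dv / (n * vt))
    print(f"+{dv * 1000:.0f} mV -> current multiplied by roughly {factor:.1f}x")
```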

In the example made by rhd, your 90V battery will not blow up your diode ONLY if your batteries cannot give more than the maximum current that the chip can hold (about the 10mA in your example, yes, IF the batteries can give ONLY a maximum of 10mA, because of their internal resistance) ..... It's the same as for some of the small keychain flashlights that use 3xAG13 button batteries and an LED without a resistor ..... the LED works at 3V, and the batteries give 4.5V ..... it doesn't burn ONLY because those batteries cannot give it more than those few mA that the LED can hold ..... but try to use 3 normal batteries, and you will burn out your LED in seconds.

Same for an LD: if you use a battery that can give only less than the maximum current that the LD can hold, then even a higher voltage won't burn it, but use a battery that CAN give more current, and you kill it.
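A rough way to put numbers on the 90 V button-cell example: the stack's internal resistance is what limits the current. The resistance and forward voltage below are assumed purely for illustration:

```python
# Why a high-voltage but weak battery can't push dangerous current:
# the battery's total internal resistance limits what it can deliver.
# All values below are assumed for illustration only.

v_stack = 90.0       # 30 x 3 V button cells in series (volts)
r_internal = 8500.0  # assumed total internal resistance of the stack (ohms)
vf = 4.5             # assumed forward voltage of a 445 nm diode (volts)

i_max = (v_stack - vf) / r_internal   # most current the stack can deliver
print(f"Roughly {i_max * 1000:.0f} mA available")   # about 10 mA, far below 1 A
```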
 
Essentially, a small increase in voltage results in a large increase in current. If you consider the many charge levels of a battery and their respective voltages, you have a large current range within the small voltage range of the battery. At best, this makes power inconsistent. At worst, it ruins components.

If you hold the voltage and temperature constant, the current will not change much.
 
I have a basic understanding of how current regulating drivers work but I have a specific question about them.

I know that a laser requires a certain current and voltage from the driver to operate properly. In current regulating drivers the amount of current output is controlled. Does this mean that the voltage is determined by the diode itself running at this preset current?

Basically I'm asking: if the driver outputs, say, 1000mA, do I need to worry about the voltage, or will the diode dictate that itself?

I'm thinking of ordering some LED drivers online and messing around with using those instead of laser drivers, and I need to know what specs are necessary.

Brad;

The LED drivers will output a set current . . .

throughout a fixed range of voltages.

They usually specify an upper voltage limit for full current.

They are happiest operating just under this upper voltage limit.

So, pick one with a voltage range slightly above the voltage your laser diode requires.
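As a quick sanity check when shopping, something like the sketch below captures that rule of thumb; all the numbers are hypothetical placeholders, not a specific driver's specs:

```python
# Quick check: does a constant-current LED driver suit a given laser diode?
# All figures are hypothetical -- substitute the real datasheet numbers.

diode_vf = 4.5       # diode forward voltage at the set current (volts)
diode_i_max = 1.0    # maximum safe operating current (amps)

driver_i_out = 1.0   # driver's fixed output current (amps)
driver_v_min = 3.0   # bottom of the driver's output voltage range (volts)
driver_v_max = 6.0   # top (compliance limit) of that range (volts)

ok = (driver_i_out <= diode_i_max) and (driver_v_min <= diode_vf <= driver_v_max)
print("Driver looks suitable" if ok else "Pick a different driver")
```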

LarryDFW
 