erdabyz said:
For example, if you were driving a Blu-ray laser diode from a 5.5 V supply, you'd want a 0.5 V drop across the resistor to feed the diode the required 5 V. If the voltage drop through the diode decreases by 0.1 V, you might be screwed if you are already driving the laser diode near its limit.
That is an excellent example! It shows what can go wrong, and why.
I am going to sound like I'm trying to sell the idea of resistors as current regulators or something, and I'm not. It's more that I'm selling the idea that with a little creativity and an understanding of the fundamentals, you can make things work when other people tell you they won't.
So with your example, let's say you pick a resistor that gives you 150 mA from a 5.5 V source when the diode is dropping 5 V. Then, as you say, the voltage drop through the diode decreases by 0.1 V, to 4.9 V. How much current does the diode get then? The resistor voltage went from 0.5 V to 0.6 V. Since the resistor is linear, it now passes 150 * 0.6 / 0.5 = 180 mA, and your diode will probably go poof in short order.
Now suppose you use a 9 V power source. Under the same scenario, where the current is initially set to 150 mA and the diode voltage then drops 0.1 V, the resistor voltage goes from 4.0 V to 4.1 V, so the diode current only rises to 150 * 4.1 / 4.0 = 153.75 mA. This is not so bad, and with an even higher power supply voltage, the situation gets better still.
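If you want to play with the arithmetic, here's a minimal Python sketch of the same calculation. The supply voltages, the 5 V nominal diode drop, and the 150 mA set point are just the numbers from the two scenarios above:

```python
def pick_resistor(v_supply, v_diode_nominal, i_set_ma):
    """Resistor (ohms) that sets i_set_ma through the diode at its nominal drop."""
    return (v_supply - v_diode_nominal) * 1000.0 / i_set_ma

def diode_current_ma(v_supply, v_diode, r_ohms):
    """Current (mA) through the series resistor for a given diode drop."""
    return (v_supply - v_diode) * 1000.0 / r_ohms

for v_supply in (5.5, 9.0):
    r = pick_resistor(v_supply, 5.0, 150.0)          # set 150 mA at 5 V drop
    i_after = diode_current_ma(v_supply, 4.9, r)      # diode drop falls 0.1 V
    print(f"{v_supply} V supply: R = {r:.2f} ohm, "
          f"current after the 0.1 V shift = {i_after:.2f} mA")
```

Running it reproduces the numbers above: 180 mA from the 5.5 V supply, 153.75 mA from the 9 V supply.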
If you were actually going to do this with a 9 V battery, you would want to choose a resistor that keeps your laser diode safe when the diode voltage is at its lowest and the battery voltage is at its highest.
Some other things to be aware of would be:
- Batteries have their own series resistance, and you can subtract that from the value of the current-limiting resistor you use (see the sketch after this list).
- Batteries have different voltages and series resistances depending on their chemistry (e.g. alkaline, carbon-zinc, NiMH, lithium), their state of charge, and their temperature. For example, a common, cheap carbon-zinc 9 V battery actually starts out life closer to 10 V.
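Putting those two cautions together, here's a sketch of the worst-case resistor selection. The specific numbers (a fresh battery at 10 V, a diode drop as low as 4.8 V, and 1.5 ohms of battery internal resistance) are illustrative assumptions, not measured values; you'd substitute your own worst-case figures:

```python
# Size the resistor for the worst case: highest battery voltage,
# lowest diode drop. The battery's internal resistance already
# limits some current, so subtract it from the external resistor.
V_BATT_MAX = 10.0      # fresh carbon-zinc 9 V battery (assumed)
V_DIODE_MIN = 4.8      # low end of the diode's drop range (assumed)
R_BATT_INTERNAL = 1.5  # battery series resistance in ohms (assumed)
I_MAX_SAFE_MA = 150.0  # maximum safe diode current

r_total = (V_BATT_MAX - V_DIODE_MIN) * 1000.0 / I_MAX_SAFE_MA
r_external = r_total - R_BATT_INTERNAL
print(f"total series resistance needed: {r_total:.1f} ohm")
print(f"external resistor after subtracting battery: {r_external:.1f} ohm")
```

With these assumed numbers you'd need about 34.7 ohms total, so roughly a 33 ohm external resistor.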
It's also more difficult to set a precise current this way without risking the diode, because you have to calibrate each resistor for a specific diode (the voltage drop is always within a range, not a fixed value for every diode), so you can't get a precise current reading with a dummy load.
That's also a good point if your power supply voltage is low and you are trying to drive the diode to the max. If your power supply voltage is up around 9 or 10 volts and you are driving your violet laser diode at, say, 10% under its safe limit, you actually can use a fixed-value resistor for every one.
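As a sanity check on that claim, here's a sketch of the current spread you'd see with one fixed resistor across a batch of diodes. The 4.8 V to 5.2 V drop range and the 150 mA limit are illustrative assumptions; the target is 10% under the limit, i.e. 135 mA at the nominal 5 V drop:

```python
V_SUPPLY = 9.0
V_NOMINAL, V_LOW, V_HIGH = 5.0, 4.8, 5.2  # assumed diode drop range
I_TARGET_MA = 135.0                        # 10% under a 150 mA limit

# One resistor, chosen for the nominal diode, used for every diode.
r = (V_SUPPLY - V_NOMINAL) * 1000.0 / I_TARGET_MA
for v_diode in (V_LOW, V_NOMINAL, V_HIGH):
    i = (V_SUPPLY - v_diode) * 1000.0 / r
    print(f"diode drop {v_diode} V -> {i:.1f} mA")
```

Under these assumptions the current ranges from about 128 mA to about 142 mA across the whole drop range, so even the lowest-drop diode stays under the 150 mA limit with the same resistor.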
Again, I'm not saying anyone should do this, but it can be nice to know what is possible and why it works.
Thanks for your good ideas!