I do. I know enough to be a good consumer and "buy this driver for this diode".
But I really have no idea how they work. Well... OK, I know roughly how they work, but information on what you can and cannot do is vague and lacking. It seems to be assumed knowledge by people with obviously greater electrical experience.
So it seems simple enough: use a power source (battery) that gives the ideal input voltage for the driver, hook it up, and then connect that to your diode.
But what if I don't want to use batteries? What if I want to plug it into a 12 V source? If the circuit was designed to output 900 mA at 4.5 V to a diode from an input of 7.5 V, then an input of, say, 12 V would not scale properly. It would go through the resistors and still arrive at a higher voltage than the diode was designed for.
Or is this all irrelevant? Will driver circuits "sort it out"? It seems they are just purpose-built with specific power requirements in mind and no room for flexibility.
Even if I turn the pot on such a thing to "adjust the current", I can't just assume the output voltage to the diode will stay the same when the input voltage is higher than designed.
Or, again, will it sort itself out and still give the same output voltage and current, just with more heat? Because turning the pot adjusts both values by increasing resistance: if one value (the input) is uncalibrated, it will stay wrong relative to the other no matter how much I play with the pot. I could get the right current to the diode, but the voltage would always be wrong, or vice versa.
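If it helps to see the arithmetic: assuming the driver is a simple linear constant-current type (an LM317-style circuit, which is a common guess for these, not something confirmed in this thread), the driver holds the current fixed, the diode's forward voltage sets the output voltage, and all the excess input voltage is burned off in the driver as heat. A small sketch using the figures above:

```python
# Sketch: linear constant-current driver behavior (assumed topology).
# The driver regulates current; the diode's forward voltage (VF) at
# that current decides the output voltage. Whatever input voltage is
# left over drops across the driver and becomes heat.

def driver_heat(v_in, v_diode, i_diode):
    """Power dissipated in a linear driver, in watts."""
    return (v_in - v_diode) * i_diode

I_SET = 0.9   # 900 mA, set by the driver's resistor/pot, not the supply
VF = 4.5      # diode forward voltage at that current (from the post)

for v_in in (7.5, 12.0):
    p = driver_heat(v_in, VF, I_SET)
    print(f"Vin = {v_in:4.1f} V -> diode still sees {VF} V @ {I_SET} A, "
          f"driver dissipates {p:.2f} W")
```

So under that assumption the 12 V supply does "sort itself out" at the diode, but the driver goes from dissipating about 2.7 W to about 6.75 W, which is why the heatsinking matters more than the pot setting.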