1 ohm resistors....

I keep seeing people recommend a 1 ohm resistor to measure current. It's a good way to get in the ballpark, but the tolerance of most resistors is 5 percent. If it's off enough, you can fry your diode.

I recommend you set your DMM to its lowest resistance range, short the leads, and note the reading. Then measure your 1 ohm resistor and subtract the lead reading to get its true value.

The lead resistance of most cheap DMMs is at least 2-10 ohms.

Keep this in mind.
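As a rough illustration of that correction (not from the post above, and all numbers are made up), here's how you might subtract the lead reading and then work out the drive current from the voltage measured across the sense resistor:

```python
# Hypothetical readings for illustration only -- substitute your own measurements.
leads_reading    = 0.3    # ohms shown with the probes shorted together
resistor_reading = 1.3    # ohms shown with the 1 ohm resistor between the probes
v_sense          = 0.250  # volts measured across the resistor while the driver is running

r_sense = resistor_reading - leads_reading   # lead-compensated sense resistance
current = v_sense / r_sense                  # Ohm's law: I = V / R

print(f"Sense resistor: {r_sense:.2f} ohm")
print(f"Drive current:  {current * 1000:.0f} mA")
```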
 





Some multimeters have a setting to do just that and take a relative reading. But most multimeters are not very accurate down at 1 ohm, so it can be difficult to get a precise value. Even if I take lead resistance into account (mine is typically 0.2-0.4 ohm), the reading will still fluctuate, and since most meters only resolve 0.1 ohm on that range, you can still be off by 10% or more on a 1 ohm resistor.

Also, consider that 5% of 100mA is 5mA, 5% of 200mA is 10mA, 5% of 300mA is 15mA, and so on... even in the worst case of +5%, the difference is not HUGE and will likely not instantly kill a diode, unless you're already pushing it to the limit.
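To put rough numbers on that, a quick sketch of the worst-case error at a few set points (the set points and the simple percentage approximation are just for illustration):

```python
# Error in the inferred current if the 1 ohm sense resistor is off by its full 5% tolerance.
# (Strictly, current scales as 1/R, so a resistor 5% low reads about 5.3% high, but the
#  ballpark figure below matches the simple approximation in the post.)
tolerance = 0.05

for set_point_ma in (100, 200, 300):
    error_ma = set_point_ma * tolerance
    print(f"{set_point_ma} mA set point: worst-case error about {error_ma:.0f} mA")
```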

A way to confirm your value is to use an ammeter (multimeter set to amps) and either short the output of the driver through the meter, or put the meter in series with a load (like your laser diode), to measure the current directly. However, if you're using a linear-type driver (like mine), be sure you have a decent surplus of input voltage, since the meter's current shunt adds its own voltage drop on top of what the output needs.
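If you do put the meter in series, a rough headroom check might look like the sketch below. The burden-voltage and dropout figures are assumptions for the example; check your own meter's and driver's specs:

```python
# Rough check that a linear driver still has enough input headroom with a
# series ammeter in the loop. All figures below are assumed example values.
v_supply         = 7.4   # volts from the battery pack
v_diode          = 4.5   # forward voltage of the laser diode at the set current
v_meter_burden   = 0.3   # voltage dropped across the meter's current shunt
v_driver_dropout = 1.5   # minimum voltage the linear driver needs across itself

headroom = v_supply - (v_diode + v_meter_burden + v_driver_dropout)
print(f"Headroom: {headroom:.1f} V " + ("(OK)" if headroom >= 0 else "(not enough input voltage)"))
```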
 




