M140 build wasted watts

Hello all,
I've been experimenting with an M140 build (SF501B, 1.55W output). It's designed to run on 2x 18350 cells. It gets quite warm almost immediately, so I measured the current draw, and I got this: with a total voltage of 8.12 Volts (4.06 V per cell), there are 1.2 amperes flowing into the laser. That makes about 9.7 Watts of input power. Since the optical output of the laser is 1.5W... say about 2W if we account for optical losses in the lens and such... there are still almost 8 Watts of power that get lost as heat.
I was wondering if this is consistent with what should be expected - wasting 80% seems like a lot! :)
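
For reference, here's the back-of-the-envelope arithmetic I'm doing (Python; the 2W "light plus lens losses" figure is just my rough guess):

[code]
# My measurements plus a rough guess for optical losses
v_batt = 8.12       # total battery voltage, V
i_batt = 1.2        # current drawn from the cells, A
p_optical = 2.0     # ~1.5W measured output + guessed lens losses, W

p_in = v_batt * i_batt               # ~9.7 W electrical in
p_heat = p_in - p_optical            # ~7.7 W ends up as heat
print(p_in, p_heat, p_heat / p_in)   # -> 9.744, 7.744, ~0.79 (about 80% wasted)
[/code]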

Cristian
 
Sounds about right. Lasers are not an efficient way of transforming energy. I was reading about the Osram 445nm 1.4W diodes; they are about 27% efficient, not much better than the numbers here. Exactly how fast does it get warm? Is it very hot, or just warm? That's a lot of power in a fairly small host. My 1500mW 445 is in a larger host, but it feels warm after only about a minute, which is the duty cycle I stick to.

Still, diode lasers are more efficient than DPSS lasers
 
Just think, though. It beats the heck out of an argon. Thousands of watts in and mW out. Good beam
quality comes at a terrible price! :beer:
 
Exactly how fast does it get warm? Is it very hot, or just warm? That's a lot of power in a fairly small host.

It's hard to define "how warm and how quickly"... Surely way less than a minute (more like 15 seconds); but it's not "hot", more like "definitely warmer than my hand". I'll try to make a sort of graph with temperature/time if I can.

What got me wondering is that I know the diode is run at about 1.5A, but surely at a lower voltage (I'd guess about 4V? Does anyone know the exact voltage?)
At 4V, the diode would get 6W of electrical power, which would still mean a lot of heat produced by the diode; but it would also mean that there are about 4W wasted somewhere else, not in the diode. The driver should be a buck driver; is it likely to waste 4W? Or maybe my guess of 4V for the diode is way off?
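
To make the question concrete, here's the split I'm imagining if the diode really does sit at about 4V (the voltage is pure guesswork on my part):

[code]
# Assuming a 4V forward voltage for the diode (please correct me!)
p_in = 9.7       # total electrical input from the cells, W
v_diode = 4.0    # guessed diode forward voltage, V
i_diode = 1.5    # diode drive current, A

p_diode = v_diode * i_diode    # 6.0 W going into the diode itself
p_elsewhere = p_in - p_diode   # ~3.7 W that would have to be lost in the driver
print(p_diode, p_elsewhere)
[/code]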
 
I think it's about 5V for that diode, which at 1.5A means about 7.5W of power used by the diode itself. I'm not sure how efficient your buck driver is either. If you're putting in about 10W of power, you've got about 75% efficiency, which might be about right. The driver may not have been designed and tested to maximize efficiency. That 75% isn't that bad either.
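
Rough numbers, assuming ~5V across the diode at 1.5A:

[code]
# Assuming roughly 5V forward voltage at 1.5A drive current
p_diode = 5.0 * 1.5     # ~7.5 W delivered to the diode
p_in = 9.7              # what the OP measured at the batteries, W
print(p_diode / p_in)   # ~0.77, i.e. roughly 75% driver efficiency
[/code]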
 
I think it's about 5V for that diode, which at 1.5A means about 7.5W of power used by the diode itself. I'm not sure how efficient your buck driver is either. If you're putting in about 10W of power, you've got about 75% efficiency, which might be about right. The driver may not have been designed and tested to maximize efficiency. That 75% isn't that bad either.

You probably mean 25% efficiency and 75% lost, which is quite normal for laser diodes.

Edit: Your 75% most likely refers to the buck driver's efficiency; as for laser diodes themselves, most of them have about 25% wall-plug efficiency.

So at 7.5W into the diode you get about 1.8W of pure light (roughly 1.5W after the optics), and the remaining ~75% turns into heat.
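
Putting the whole budget together (all of these are ballpark assumptions: ~77% driver efficiency, ~25% diode wall-plug efficiency, and a guessed lens transmission):

[code]
# Rough overall power budget; every figure here is an estimate
p_batt = 9.7          # from the cells, W
driver_eff = 0.77     # assumed buck driver efficiency
diode_wpe = 0.25      # typical 445nm diode wall-plug efficiency
lens_t = 0.80         # guessed lens/optics transmission

p_diode = p_batt * driver_eff    # ~7.5 W into the diode
p_light = p_diode * diode_wpe    # ~1.8-1.9 W raw light out of the diode
p_out = p_light * lens_t         # ~1.5 W out of the aperture
p_heat = p_batt - p_out          # everything else (~8 W) becomes heat
print(p_diode, p_light, p_out, p_heat)
[/code]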
 
Yeah, I was talking about the driver. Lasers and LEDs aren't the most efficient light sources in terms of absolute wall-plug efficiency, but they're still better than many other light sources. It seems to get worse the shorter the wavelength, too.
 

