Welcome to Laser Pointer Forums - discuss green laser pointers, blue laser pointers, and all types of lasers




Tornado Laser Power Meter From The Laser Pointer Store

BobDiaz

Active member
Joined
Mar 4, 2014
Messages
135
Likes
32
Points
28
I wouldn't immediately blame the power level, as you have measured wavelengths from 405nm all the way up to 808nm. This could also be part of the problem with this meter. What meter was used as the standard? How well does it measure these powers at these different wavelengths, and what is its tolerance? There are too many variables here to blame it all on one thing. It might be better to test a single wavelength at different power levels first, then look at other wavelengths next.
I don't have the meter, but I just copied the data from two other people who have the Pocket Power Meter. I assume the Tornado Laser Power Meter is based on the same circuit as the Pocket Power Meter. The data I can look at, from two different tests done by two different people, is VERY limited. So, yes, I'm guessing as to what might be the cause. If I sort the data by wavelength, I see the following:

[Attached image: Readings-2.png - readings sorted by wavelength]

You can make a case that because these are two different meters, with two different sensor coatings, the data is questionable at best. I'd agree, but it's the only data I have. Still, I think I can safely say that the reading might be okay as a "ballpark reading", but it isn't really that accurate. By July of 2019 I'll have the LaserBee meter and, if all goes well, the Tornado Laser Power Meter. At that point I can run some more reasonable tests.
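To put a number on "ballpark", a symmetric percent difference between two meters reading what should be the same beam is one simple yardstick. This is a generic helper sketched for illustration, not anything taken from the meters or the posted data:

```python
def percent_difference(reading_a_mw, reading_b_mw):
    """Symmetric percent difference between two readings of the
    same beam, relative to their mean."""
    mean = (reading_a_mw + reading_b_mw) / 2
    return abs(reading_a_mw - reading_b_mw) / mean * 100
```

Two meters reporting 110 mW and 90 mW for the same beam disagree by 20% of their mean, which is about where "ballpark" starts to feel generous.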
 

paul1598419

Well-known member
Joined
Sep 20, 2013
Messages
13,832
Likes
1,983
Points
113
In that case, these measurements are meaningless. You will have to measure them yourself, and your measurements will depend on the linearity and accuracy of your LaserBee. If you have a 50% beam splitter, that would be the best way to compare the two: read them both at the same time.
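Paul's simultaneous-reading idea can be sketched numerically. Reading both meters at once makes laser drift cancel out of their ratio; the swap-and-geometric-mean trick below is an illustrative extension for an imperfect splitter, not something described in this thread:

```python
def relative_calibration(meter_a_mw, meter_b_mw):
    """Ratio of two meters' simultaneous readings behind a beam
    splitter. Since both see the beam at the same instant, laser
    drift cancels; only splitter ratio and calibration remain."""
    return meter_a_mw / meter_b_mw

def split_independent_ratio(a_arm1, b_arm2, a_arm2, b_arm1):
    """Swap the meters between the two splitter arms and take the
    geometric mean of the two ratios: an imperfect (non-50/50)
    split ratio cancels out, leaving only the calibration ratio."""
    return ((a_arm1 / b_arm2) * (a_arm2 / b_arm1)) ** 0.5
```

For example, with a 60/40 splitter and meter A reading 10% high, the single ratio is skewed by the splitter, but the swapped geometric mean recovers the true 1.1 calibration factor.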
 

Benm

Well-known member
Joined
Aug 16, 2007
Messages
8,082
Likes
688
Points
113
If you have a proper mains-powered laser it should output a constant amount of light and be able to run indefinitely, so you can switch between several meters without ever turning the laser off... and if the power is not constant you should see that when using the same meter again during testing - not much to worry about. If you only have undercooled battery-powered lasers it would be a problem, since you don't have a stable source. But that instability could be detected with just a single meter if it was due to overheating, output power depending on battery voltage, etc.

I'm not sure how this meter works internally, but it just seems darn inaccurate for power levels under 100 mW or so.

Above that the accuracy seems to be within 10% or so across the visible and NIR spectrum, which is really not -that- bad for something costing only $55. It would be usable to compare lasers to each other, or to see what effect changes in lenses and such have on output power.

I guess the sensor is a thermal one, since it doesn't require inputting the wavelength, but that also makes me wonder how true the 10 watt capability claim could be in such a small form factor. It also doesn't say anything about maximum power density on the sensor, so I wonder what would happen if you focused a 5 watt laser to a pinprick on the sensor... I guess it'll damage the coating, which is understandable, but something that should at least be mentioned.

It's so cheap I'd almost buy one just to do a teardown, and to test how well that sensor handles high power densities, since no maximum is stated at all (whereas it is for the Hyperion etc.).
 

paul1598419

Well-known member
Joined
Sep 20, 2013
Messages
13,832
Likes
1,983
Points
113
Most of my lasers are stable. Even the handhelds are, at least for several minutes. But that is far from the case with lasers purchased from sellers in China. Even well-regarded companies like Sanwu aren't stable out to several minutes. So, if you are testing the same laser twice using different meters, one has to ask whether the power on the second reading is lower because of the laser itself.
 

lasersbee

Well-known member
Joined
Sep 20, 2008
Messages
17,493
Likes
1,594
Points
113
Benm said:

If you have a proper mains-powered laser it should output a constant amount of light and be able to run indefinitely, so you can switch between several meters without ever turning the laser off... and if the power is not constant you should see that when using the same meter again during testing - not much to worry about. If you only have undercooled battery-powered lasers it would be a problem, since you don't have a stable source. But that instability could be detected with just a single meter if it was due to overheating, output power depending on battery voltage, etc.

I'm not sure how this meter works internally, but it just seems darn inaccurate for power levels under 100 mW or so.

Above that the accuracy seems to be within 10% or so across the visible and NIR spectrum, which is really not -that- bad for something costing only $55. It would be usable to compare lasers to each other, or to see what effect changes in lenses and such have on output power.

I guess the sensor is a thermal one, since it doesn't require inputting the wavelength, but that also makes me wonder how true the 10 watt capability claim could be in such a small form factor. It also doesn't say anything about maximum power density on the sensor, so I wonder what would happen if you focused a 5 watt laser to a pinprick on the sensor... I guess it'll damage the coating, which is understandable, but something that should at least be mentioned.

It's so cheap I'd almost buy one just to do a teardown, and to test how well that sensor handles high power densities, since no maximum is stated at all (whereas it is for the Hyperion etc.).
I'm convinced it uses the same circuitry as in the original Pocket LPM: a simple 10-turn trim pot across the TEC, with the wiper going to an off-the-shelf millivolt meter, as I showed in my original review. There is no linearity compensation for the sensor coating or for sensor power deviations as it heats up.

Notice that after testing high-power lasers the reading falls to zero very fast, rather than slowly creeping toward zero as the sensor reaches equilibrium.

The meter's zero point is set with the sensor cold. As more readings are taken, the heat buildup inside the unit drags that zero point well below zero, which can't be seen on the display since it does not read below zero. There is no place for the heat to escape and bring the "heatsink" back to room temperature. The worst possible design for an LPM.... IMO
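The clamped-at-zero effect described here is easy to sketch with a few lines of code. The numbers are purely illustrative, not measurements from the meter:

```python
def displayed_mw(incident_mw, baseline_drift_mw):
    """A display that cannot show negative values hides a drifted
    (negative) baseline: small signals simply vanish to zero."""
    return max(0.0, incident_mw + baseline_drift_mw)

# Freshly zeroed with a cold sensor, a 25 mW beam reads correctly.
# Once heat buildup has dragged the zero point 30 mW-equivalent
# below zero, the same 25 mW beam reads 0, and a 100 mW beam
# reads 30 mW low.
```

This would also explain small beams disappearing entirely after a few high-power readings, while large readings merely come out low.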

As you stated, we use a mains-powered test laser with an adjustable laser power supply to calibrate our LaserBee products, and when we set the laser to, say, 1000mW it stays there with no drift.

Jerry
 

Benm

Well-known member
Joined
Aug 16, 2007
Messages
8,082
Likes
688
Points
113
I would not be surprised if that was the case, and they just display '0' for values that should technically be negative (i.e. the sensor being colder than the heatsink).

Since it's so tiny I would not expect the heatsinking to be able to do a proper job - it doesn't look like the outside of this thing could remain near room temperature if you fired a 5 watt laser into it for a while.

As for corrections in linearity and such, I really cannot tell. It could be just a standard mV meter with a pot; it could also be a uC that has a lookup table stored in memory and possibly some calibration values in EEPROM. There is no way to tell unless you take it apart, really - either solution would easily fit the form factor, as they are all chips with tiny footprints. They could also perform the same task: a uC with enough I/Os could easily drive that display multiplexed.

Anything like an Arduino could do all this and hold the reference values and lookup tables in flash memory, and it would not add more than $1 to the bill of materials for this thing. The interesting bit to me is mostly how they do the actual sensing.
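The lookup-table correction mentioned above boils down to piecewise-linear interpolation between calibration points. Here is a minimal sketch with made-up calibration data; a real table would come from measuring the sensor against a reference meter:

```python
# Hypothetical calibration table: (raw ADC counts, known power in mW).
# The non-proportional spacing models a sensor coating whose response
# is not perfectly linear.
CAL_TABLE = [(0, 0.0), (100, 90.0), (400, 410.0), (800, 900.0), (1023, 1200.0)]

def counts_to_mw(counts):
    """Piecewise-linear interpolation over the calibration table,
    clamping below the first and above the last entry."""
    if counts <= CAL_TABLE[0][0]:
        return CAL_TABLE[0][1]
    for (c0, p0), (c1, p1) in zip(CAL_TABLE, CAL_TABLE[1:]):
        if counts <= c1:
            return p0 + (p1 - p0) * (counts - c0) / (c1 - c0)
    return CAL_TABLE[-1][1]
```

A table like this plus the multiplexed display driver fits comfortably in any small uC's flash, which is the point being made about the sub-$1 part.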
 
BobDiaz

Active member
Joined
Mar 4, 2014
Messages
135
Likes
32
Points
28
Jerry,

Thanks for your comments. That zero offset could explain this message I spotted:

mine can't read anything lower than 25mw...


Benm,

I was thinking the same thing about how it could have been built. A small PIC microcontroller with a built-in A/D could read the sensor, apply corrections, and drive the 7-segment display. That's 7 pins for the segments, plus 4 to select (multiplex) each digit, 1 pin for the A/D input, 2 pins for power, and a few more pins for who knows what; the PIC16(L)F184XX seems to fit the requirements.

Microchip offers 8-bit, 10-bit, and even 12-bit A/Ds built into their chips. The 10-bit would only count from 0 to 1023 and the 12-bit from 0 to 4095. With 1mW resolution, the 10-bit would allow for up to about 1W and the 12-bit up to about 4W. If they let the resolution drop to 2mW, 2.5mW, ... or more, higher readings are possible.
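The range arithmetic above is worth spelling out: full scale is simply the ADC's maximum count times the milliwatts assigned per count. A quick check of the figures:

```python
def full_scale_mw(adc_bits, mw_per_count):
    """Maximum displayable power for an ADC of the given width
    when each count represents mw_per_count milliwatts."""
    return ((1 << adc_bits) - 1) * mw_per_count

# 10-bit at 1 mW/count tops out just above 1 W (1023 mW);
# 12-bit at 1 mW/count reaches about 4 W (4095 mW);
# coarsening a 12-bit converter to 2.5 mW/count stretches
# the range past 10 W, at the cost of resolution.
```

So the posted 10W capability would force either a wider ADC or a coarser mW-per-count scaling, exactly as suggested.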

PIC Summary

I really wish I knew more about how the sensors work and where the error comes from. Is it wavelength absorption differences, non-linear response from the sensor, differences in room temperature, or what?
 

diachi

Well-known member
Joined
Feb 22, 2008
Messages
9,508
Likes
1,353
Points
113
Could easily just be an op-amp and an off-the-shelf panel-mount digital voltage meter. For $55 I wouldn't be surprised.

I would never expect any sort of precision from a meter that costs $55 new; it's fine for a ballpark. Is my 1W laser actually 1W, or is it 100mW? That sort of thing.
 

Benm

Well-known member
Joined
Aug 16, 2007
Messages
8,082
Likes
688
Points
113
BobDiaz said:

Jerry,

Microchip offers 8-bit, 10-bit, and even 12-bit A/Ds built into their chips. The 10-bit would only count from 0 to 1023 and the 12-bit from 0 to 4095. With 1mW resolution, the 10-bit would allow for up to about 1W and the 12-bit up to about 4W. If they let the resolution drop to 2mW, 2.5mW, ... or more, higher readings are possible.
Possibly - regardless of whether they went with a Microchip or an Atmel solution, the built-in ADCs could do this.

I'm not sure what they are doing, but to get a 9999-count meter you would need a resolution of 13 bits or so. If you add a bit of noise to the input pin you can get that from a 10-bit ADC by averaging over 64 measurements. I have no idea if they are doing this, or whether they're using a uC with 12- or 16-bit ADCs to get there.

On something like an Arduino (Atmel) processor an analogRead is not -that- slow, so you could actually average over a few dozen samples and still get a decent refresh rate if you wanted to. I think they could get away with that just fine, especially since they cite times of 25 seconds or so to obtain an "accurate" measurement.
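The oversampling-and-decimation trick being described follows the standard rule of thumb: each extra bit of effective resolution costs 4x as many samples, and the summed result is shifted right by the number of extra bits. A deterministic sketch (it assumes roughly 1 LSB of dither noise is present on a real input; with the constant inputs below it just shows the arithmetic):

```python
def samples_needed(extra_bits):
    """Oversampling rule of thumb: 4**n samples for n extra bits."""
    return 4 ** extra_bits

def oversampled_read(raw_samples, extra_bits):
    """Sum 4**n ADC samples and shift right by n, scaling a 10-bit
    reading up to a (10+n)-bit result."""
    assert len(raw_samples) == samples_needed(extra_bits)
    return sum(raw_samples) >> extra_bits
```

Sixty-four 10-bit samples (4**3) thus yield a 13-bit result, which matches the earlier point that 64 averaged measurements are enough for a 9999-count display.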

The thing I wonder more about, though: what is the sensor? It seems to be fairly consistent in wavelength response, but very bad at low power levels. This would point to the sensor being thermal, not optical.

I guess the only way to a full answer would be to buy one and tear it apart to see what they use for sensing, as well as for processing, though I suspect there will be some uC inside with the part number filed off...
 

BobDiaz

Active member
Joined
Mar 4, 2014
Messages
135
Likes
32
Points
28
diachi said:

Could easily just be an op-amp and an off-the-shelf panel-mount digital voltage meter. For $55 I wouldn't be surprised.

I would never expect any sort of precision from a meter that costs $55 new; it's fine for a ballpark. Is my 1W laser actually 1W, or is it 100mW? That sort of thing.
lasersbee (Jerry) said: "I'm convinced it uses the same circuitry as in the original Pocket LPM. A simple 10-turn trim pot across the TEC with the wiper going to the off-the-shelf millivolt meter, as I showed in my original review. ..."

My comments to Benm were about how it could have been built, not how it really is built. I don't have access to the costs of things at manufacturing quantities, so I can't be sure if they picked the millivolt meter because it's just cheaper OR because it's just easier to design and build.

EDIT ADDITION BELOW:

I was thinking about how I would design it if the boss said, "Make it as cheap as possible." The PIC16F677 has only a 10-bit A/D, but if one does 16 reads of the A/D pin in a row, with a slight delay between each read, and adds them all together, you get a 14-bit number. This does NOT turn the A/D into a 14-bit converter, but it produces a result that looks like a 14-bit conversion. It's really 10 bits that are true; the extra 4 bits are questionable. Assuming an 8W maximum, the true bits would give a resolution down to about 8mW.
The other approach would be to use 2 A/D pins, with one set to a high range, like 0W to 10W, and the other set to a lower range of 0mW to 1,000mW. This would require two different adjustment pots, but it would at least read lower values more accurately.
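The two-pin idea can be sketched as follows. The scaling constants and the 10-bit width are hypothetical values chosen to match the ranges named above, not anything known about the actual meter:

```python
# Hypothetical two-range front end: one ADC pin scaled for 0-10 W,
# another for 0-1000 mW, each trimmed by its own calibration pot.
ADC_MAX = 1023  # 10-bit converter
LOW_FULL_SCALE_MW = 1_000.0
HIGH_FULL_SCALE_MW = 10_000.0

def read_power_mw(low_counts, high_counts):
    """Use the fine low range while it isn't saturated; fall back
    to the coarse high range above 1 W."""
    if low_counts < ADC_MAX:
        return low_counts * LOW_FULL_SCALE_MW / ADC_MAX
    return high_counts * HIGH_FULL_SCALE_MW / ADC_MAX
```

Below 1 W this resolves to about 1 mW per count instead of about 10 mW per count, which is exactly where the meter is reported to struggle.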
 

Benm

Well-known member
Joined
Aug 16, 2007
Messages
8,082
Likes
688
Points
113
I'm not excluding that it actually is just some random voltage meter with op-amps to scale the sensor signal; I also just meant what I'd do to get it built on a small budget.

Throwing in a microcontroller that costs under $1 and can do the conversion, the linearity correction, and drive the display might be the cheaper choice here, compared to having an op-amp, several resistors, and the voltage meter unit.

As for getting more resolution out of an ADC: it's not that simple. You can induce a small amount of noise (if it's not already there) and then take multiple measurements to add resolution. To do this correctly, though, you need to quadruple the number of samples for every extra bit: 4 samples for the first extra bit, 16 for the second, 64 for the third, 256 for the fourth, etc.

This makes it totally impractical to use a 10-bit ADC to get a 16-bit accurate value, but if you needed to display a range of 0000 to 9999 with a 10-bit ADC you'd only need 3 bits and change extra. To play it somewhat safe you might use 64 samples, which would be fairly slow in response, but you could display the rolling average of those samples, so your update speed could be as fast as doing a single analog read. This would give a bit of lag on sudden changes, but it would be good enough in most circumstances. As they allow something like 20 seconds to get to a 95% accurate measurement, it would not be a problem.
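The rolling-average display trick described here is a fixed-window moving average: every new sample updates the display immediately, but a sudden jump in input takes a full window to settle. A minimal sketch:

```python
from collections import deque

class RollingAverage:
    """Fixed-window rolling average over the most recent samples.
    Each update returns a fresh displayed value at single-read
    speed, at the cost of lag when the input changes suddenly."""
    def __init__(self, window):
        self.buf = deque(maxlen=window)  # oldest sample drops out

    def update(self, sample):
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)
```

With a 64-sample window this gives the claimed behavior: per-read refresh rate, with step changes settling over 64 reads, well inside a 20-second measurement window.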

As for the idea of using two ADC inputs with different scaling factors: that would work fine if your device were range-switching; it's more or less what your average multimeter does on its voltage settings. This thing does not seem to switch ranges at all, though; it just displays from 1 to 9999 mW in a single range.

Any approach could be used here, but it seems to be terrible at low power levels, and I presume that is inherent to the sensor design, not the firmware. I could be completely wrong about that, though - perhaps it's just a mV meter wired through some op-amps, and the unreliable measurements at lower powers could just be caused by electrical noise.

If anyone has this meter, some images of what's inside could clarify a lot. Since it's so small it may also be worth sending it back and forth for analysis: it looks like it's just screwed shut, so I could open it, have a look at the inside without doing any damage, and send it back.
 



