Welcome to Laser Pointer Forums - discuss green laser pointers, blue laser pointers, and all types of lasers


Is there such a thing as a "hobbiest grade" spectrometer?

Joined
Mar 26, 2010
Messages
3,220
Points
0
You misinterpret what I'm saying. My gripe is with people whining about prices, not with the idea that spectrometers should be cheap. Whining about prices accomplishes nothing. Period.

What I'm arguing, however, is that spectrometers are pretty much out of the DIY realm due to the complexity. Building one from a raw linear CCD (which is what the Science-Surplus units, as well as the Ocean Optics units, use) is pretty much the equivalent of building a USB camera from scratch starting with a bare CCD. It's possible for an individual to do, yes, but the time and materials cost make that $500 Science-Surplus unit look a whole lot more attractive. Unless you're retired, or rich with nothing else to do with your time, I guess. I unfortunately fall into neither of those categories.

Some people do have that kind of time, finances, and skill, yes. But the vast majority of people on this forum don't.

Though the people with the kind of time, finances, and skill to DIY a spectrometer also AREN'T the types to bitch about how much the ones available cost, either..
 





rhd

0
Joined
Dec 7, 2010
Messages
8,475
Points
0
I can actually start to see (in my head) some realistically DIY-able approaches to a spectrometer.

You'd be doing what was done with LPMs - simplifying, at the expense of an acceptable amount of accuracy loss.

For example, you could have:

- A slit the laser shines through.
- Cheap diffraction grating, projecting dots onto a white translucent sheet a known distance from the grating.
- Opposite side of the white sheet, place a cheap webcam focussed on the sheet.
- Snap a shot of the white sheet. Measure the pixels between the two centre-most dots.
- Calibrate the pixel to mm ratio by using a known wavelength first.
- From then on, just turn your pixel spacing info into mm spacing info, and apply fun math to determine the wavelength of the light.
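The "fun math" in that last step is just the first-order grating equation. A minimal sketch of both steps, assuming an illustrative 500 line/mm grating and a 532nm calibration laser (all numbers here are made up for the example, not from an actual build):

```python
import math

# First-order diffraction: wavelength = d * sin(theta),
# where tan(theta) = dot spacing / grating-to-sheet distance.
def wavelength_nm(dot_spacing_mm, screen_dist_mm, lines_per_mm=500):
    d_nm = 1e6 / lines_per_mm                       # slit spacing in nm
    theta = math.atan(dot_spacing_mm / screen_dist_mm)
    return d_nm * math.sin(theta)

# Calibrate the pixel-to-mm ratio from one known wavelength:
# work out where its dot *must* land, then divide by the measured pixels.
def mm_per_pixel(known_nm, known_spacing_px, screen_dist_mm, lines_per_mm=500):
    d_nm = 1e6 / lines_per_mm
    theta = math.asin(known_nm / d_nm)
    return (screen_dist_mm * math.tan(theta)) / known_spacing_px
```

From then on a measured pixel spacing just gets multiplied by `mm_per_pixel` and fed to `wavelength_nm`.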

It would take some work, and some software to make it user friendly. But I can already envision a < $100 spectrometer, and I'm not nearly as smart as many here ;)
 
Joined
Mar 26, 2010
Messages
3,220
Points
0
Possible, but your optical path has to be rigid and fixed. If the beam hits a different spot on the diffraction grating, that's going to make your calibration useless. There's a reason SS's and OO's spectrometers are fiber coupled: any component in the optical path moving would throw it off by a lot.

You'll also have to find a way to sort out power differences between lasers. If you overload the camera and the dot 'blooms', as tends to happen with cheap cameras given too much light, you could get a reading 100nm wide. The blooming happens due to the nature of how the cameras work; the bleedover to adjacent vertical pixels in a normal camera sensor would be a severe detriment. In the spectrometers the linear CCD doesn't have that issue because, well, it's only a single row of pixels.

Linear CCDs are actually easy to find, and everywhere. Pretty much every scanner and copier made in the last decade will have one. However, these are a bit... large... for this application.
 
Last edited:
Joined
Feb 7, 2009
Messages
201
Points
0
The OP, for example. Even when pointed to $200 align-it-yourself spectrometers, he pretty much said "thanks, but I'll keep looking"


Perhaps I should have been more clear. When I said I needed to do more research, I mostly meant that I wanted to be damn sure I could calibrate such a device and get it working before I went and dropped $200 of my currently very limited disposable income on it. I once heard from a mod on another forum, "Buy quality and only cry once," and I agree wholeheartedly. However, at this time, it's kind of a big decision to spend that kind of money on what is, to me at least, pretty much a nerd-toy. It would serve to scratch an intellectual itch, and that's about it.

Honestly, I'd feel a lot "safer" if I read a thread here by someone who said "I bought one of these, and this is what I had to do to get it calibrated". I'm sorry if that sounds kind of cheap, but I honestly can't afford to pioneer much in this hobby. To those of you who can and do, you have my gratitude (actually, "gratitude" is a pretty insipid word for it. If y'all lived next door, I'd have already baked you some cookies. At least.)

So, in the meantime, I'm going to hit up Google and see if I can find someone who has had experience with such a task. That's all I was sayin'.

Oh, and I totally appreciate the gripe about low-ballers. Even with the very limited amount of online selling I do, I get 'em. I can't even bring myself to haggle in a venue where it's expected. Just feels scummy to me. . .
 

Ablaze

0
Joined
Oct 19, 2011
Messages
462
Points
0
I can't even bring myself to haggle in a venue where it's expected. Just feels scummy to me. . .
I know what you mean, when I go to Mexico I gladly pay twice as much for the privilege of not haggling.

I like your sig, btw. How much do you want for it? 3.5 cents? I may be willing to go up to 4....
 

rhd

0
Joined
Dec 7, 2010
Messages
8,475
Points
0
Possible, but your optical path has to be rigid and fixed. If the beam hits a different spot on the diffraction grating, that's going to make your calibration useless. There's a reason SS's and OO's spectrometers are fiber coupled: any component in the optical path moving would throw it off by a lot.

You'll also have to find a way to sort out power differences between lasers. If you overload the camera and the dot 'blooms', as tends to happen with cheap cameras given too much light, you could get a reading 100nm wide. The blooming happens due to the nature of how the cameras work; the bleedover to adjacent vertical pixels in a normal camera sensor would be a severe detriment. In the spectrometers the linear CCD doesn't have that issue because, well, it's only a single row of pixels.

Linear CCDs are actually easy to find, and everywhere. Pretty much every scanner and copier made in the last decade will have one. However, these are a bit... large... for this application.

But you know, those problems ^ are pretty small really. Considering the fact that I spent 15 minutes thinking up the idea, it actually looks fairly feasible - in fact, a lot more feasible than I was thinking it would be two days ago before this thread popped up ;)

It also occurred to me that the scanning head from a flatbed scanner might be a great resource. It's linear, and has a fairly high resolution. You probably wouldn't want the light to shine directly from the grating into the head (or the CCD), but you should be able to have some sort of intermediate substance (like a semi-translucent paper) that would pass light through, but also serve to reduce bloom.

Again, a lot of the logistics and technical minutiae are for R&D to take care of. However, the principle, the basic underlying concept of using a grating and then a sensor to measure the distance between dots, *should* be doable in the $100 range at some point.
 
Joined
Mar 26, 2010
Messages
3,220
Points
0
I just think the accuracy will likely be worse than with cheaper methods, like the thread Jerry linked, especially if using an intermediate layer between the grating and the sensor.

It's one of those 'do it right or there's no point' deals. If you can't break the +/- 5nm barrier, then it's no better than what you can already do more cheaply without fancy electronics. I'd be happy to be proven wrong, however I think it's unlikely I will be with the current sensor technology these use.
 

rhd

0
Joined
Dec 7, 2010
Messages
8,475
Points
0
I think that tool Jerry linked to is total BS. I don't trust the creator of it. He's a sketch bag ;)

On a serious note though, let me propose something -

I think you could set up a reasonably high-resolution DSLR pointed at a clean wall. Somewhere (probably off to the left of the DSLR) position a stationary diffraction grating. Then, on the wall toward the left-most edge of your DSLR frame, put a little dot.

Shine a known laser (a 532 would be perfect) through the center of the diffraction grating. Aim the laser, through the grating, at the dot on your wall. Scale the camera's zoom so that the dot one to the right of your center dot is over near the right-hand side of the frame.

With everything stationary, shoot a frame.

Then swap the 532 with the laser you want to test. Shine the unknown laser through the center of the grating, with the central dot lined up with the same point on the wall. Obviously you need the dot one to the right of your center dot to also be on frame.

With everything stationary, shoot a frame.

I believe that with 4000 horizontal pixels, you would be able to apply the diffraction grating math to your two photos, and determine the unknown wavelength with a +/- of no more than 2nm.

You'd have 3 or 4 pixels to a nm, so even if you lose some precision in the process of determining the center of a dot, you could have a 10-pixel-wide dot and still maintain a pretty good ~2nm precision. If you hugely underexposed the photo, you would minimize dot bleed.
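Finding the center of a wide dot to sub-pixel precision is one place software could claw back that precision. A minimal sketch using an intensity-weighted centroid over one row of pixels (the pixel values here are made up for illustration):

```python
def centroid(pixels):
    """Intensity-weighted center of a 1-D strip of pixel values."""
    total = sum(pixels)
    return sum(i * v for i, v in enumerate(pixels)) / total

# A symmetric 5-pixel-wide "dot" centered on index 2:
print(centroid([0, 1, 2, 1, 0]))   # 2.0
```

Even a 10-pixel-wide bloomed dot resolves to a fractional-pixel center this way, as long as the bloom is roughly symmetric.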

Ultimately, this could be automated in software, i.e. feed in the 532 calibration photo, feed in the unknown photo, calculate the unknown wavelength.

Because both photos would use a consistent grating-to-wall distance, and a consistent grating lines/mm figure, you wouldn't actually need to measure either. You could use a single constant for both figures, and by virtue of the fact that you have a known (532) photo for calibration, you'd never need to identify the value of those variables.

You might actually require two known wavelengths in order to correct for an off-angle camera plane, but even then, many members here have a 532 and a 473.
 
Last edited:
Joined
Mar 26, 2010
Messages
3,220
Points
0
You'd need at least two calibration wavelengths, not just one. You have to define both the width of your scale and where the known wavelengths fall.

So you'd need two stable lasers with known wavelengths to define your scale.

For instance, if you had a 532nm and a 633nm HeNe, and on your camera there were 300 pixels separating them, that would define your scale as 1 pixel = 0.337nm.
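That two-point calibration is trivial to code up. A sketch using the 532/633 example, under the (simplifying) assumption that wavelength really is linear in pixel position:

```python
# Scale from two known wavelengths and their pixel positions.
def nm_per_pixel(nm_a, px_a, nm_b, px_b):
    return (nm_b - nm_a) / (px_b - px_a)

# Map an unknown laser's pixel position onto the calibrated scale.
def to_wavelength(px, nm_a, px_a, scale):
    return nm_a + (px - px_a) * scale

scale = nm_per_pixel(532.0, 0, 633.0, 300)
print(round(scale, 3))   # 0.337
```

A dot landing halfway between the two calibration dots would then read back as roughly 582.5nm.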

You'd have to find a cheaper camera though. People who can afford DSLRs can afford $500 spectrometers as well. ;)
 

rhd

0
Joined
Dec 7, 2010
Messages
8,475
Points
0
You'd need at least two calibration wavelengths, not just one. You have to define both the width of your scale and where the known wavelengths fall.

So you'd need two stable lasers with known wavelengths to define your scale.

For instance, if you had a 532nm and a 633nm HeNe, and on your camera there were 300 pixels separating them, that would define your scale as 1 pixel = 0.337nm.

You'd have to find a cheaper camera though. People who can afford DSLRs can afford $500 spectrometers as well. ;)

Actually, it's even more complicated than that. Even with perfect data, there wouldn't be a constant pixel-to-wavelength ratio. For example, with a 500 line/mm grating, 1m from the wall, a distance between the dots of:

20 cm = 392nm
30 cm = 574nm (182nm higher, for the addition of 10cm)
40 cm = 742nm (168nm higher, for the addition of 10cm)
50 cm = 894nm (152nm higher, for the addition of 10cm)

You would actually have to backtrack the equation, subbing in an "X" for the grating lines per mm, and a "Y" for the distance from the wall.

Then you'd need to keep those X & Y constant and build the equation out for the changed variable of the distance between dots. Math is not my strength - especially not algebra. But if the photograph were a perfect representation of 1 cm = *some consistent number* of pixels, then the approach above is all we need, and the solution is workable. And you actually shouldn't need more than one known wavelength, because we're relying on the constancy of the other two variables in the grating/wavelength equation.

The task exceeds my abilities when you start to recognize that 1 cm on the wall does not = *some consistent number* of pixels, because a photograph isn't a scan of the wall. Perspective, lens construction, and angle relative to the wall create a non-constant relationship between actual distance on the wall, and pixels in the photo.

This is why I say you'd need a second known wavelength. The purpose here, would be to determine a factor to correct for the camera being off angle (not perfectly 90 degrees to the wall). For example, you would do all of the above with a 532. You could use those figures to calculate the expected dot positions for a 473. Then, using a photo of your setup with a 473, compare your expected dot positions, with the actual data on the dots from your 473 calibration shot. This would allow calculation of a correction factor for the off-angle camera.

The math there is beyond me, but I know that it's just a matter of algebra. All the required info is available.

Regarding DSLR availability: I have one; I don't have a spectrometer. I bet there are 20x as many people on this forum who have DSLRs sitting around as have spectrometers. And regardless of cost, a lot of people have friends or family with one. Even a point-and-shoot with decent resolution and the ability to under-expose a shot should work.
 
Joined
Mar 26, 2010
Messages
3,220
Points
0
It's the same math you need to calibrate one of SS's uncalibrated spectrometers I believe. ;)
 
Joined
Sep 12, 2007
Messages
9,399
Points
113
You can also hold a grating to your camera lens like so:

25812d1267301661-white-lasers-dscf1192.jpg


It doesn't seem like too much of a stretch, provided you have a slit to sharpen the points:

Spectrum.jpg
 
Last edited:
Joined
Feb 1, 2008
Messages
2,894
Points
0
Actually, it's even more complicated than that. Even with perfect data, there wouldn't be a constant pixel-to-wavelength ratio. For example, with a 500 line/mm grating, 1m from the wall, a distance between the dots of:

20 cm = 392nm
30 cm = 574nm (182nm higher, for the addition of 10cm)
40 cm = 742nm (168nm higher, for the addition of 10cm)
50 cm = 894nm (152nm higher, for the addition of 10cm)

You would actually have to backtrack the equation, subbing in an "X" for the grating lines per mm, and a "Y" for the distance from the wall.

Then you'd need to keep those X & Y constant and build the equation out for the changed variable of the distance between dots. Math is not my strength - especially not algebra. But if the photograph were a perfect representation of 1 cm = *some consistent number* of pixels, then the approach above is all we need, and the solution is workable. And you actually shouldn't need more than one known wavelength, because we're relying on the constancy of the other two variables in the grating/wavelength equation.

Um... can anyone here do non-linear regression analysis? That would be the kind of "mathing" technique that would be most helpful right now: build a polynomial function given that some n points are known to be true.
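Full non-linear regression may be overkill here: since the grating curve is smooth, an exact polynomial through a handful of calibration points already gets very close. A pure-Python sketch using Lagrange interpolation, with calibration points generated from the 500 line/mm, 1m-from-the-wall example earlier in the thread (in practice you'd measure the points instead of computing them):

```python
import math

def lagrange(points):
    """Return the polynomial through the (x, y) calibration points, as a callable."""
    def f(x):
        total = 0.0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return f

# Calibration points from the grating equation (500 line/mm, 1 m to wall):
# wavelength = 2000 nm * sin(atan(spacing_cm / 100))
cal = [(x, 2000.0 * math.sin(math.atan(x / 100.0))) for x in (20, 30, 40, 50)]
nm = lagrange(cal)
```

With four points the interpolant reproduces each calibration wavelength exactly, and between them it tracks the true grating curve to well under a nanometer over this range.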
 



