HIMNL9
Joined: May 26, 2009 · Messages: 5,318 · Points: 0
Auto white balance is easy enough to disable in the drivers; the worse problem is, as said, the result of mixed colors. I mean, if the CCD gets red and blue on the same pixels, it reads "magenta", which is not a wavelength but the combination of two different colors.
Maybe the easier and more accurate way to avoid those errors is the "pseudo-monochromator" approach. I built a setup like that in the past (but then I broke the prisms, and by now it's lost somewhere): the light was dispersed by the prisms and projected onto a linear CCD taken from an old HP scanner, and the CCD was simply driven to output a data stream with a trigger signal at the start, so it could be displayed on an oscilloscope.
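For the readout side, something like the minimal Python sketch below could stand in for the oscilloscope. It assumes the raw CCD samples have already been captured into an array and that each frame starts with a trigger pulse above some level; the file name, trigger level and pixel count are hypothetical placeholders, not values from the original setup.

```python
import numpy as np

def split_frames(samples, trigger_level=3.0, pixels_per_frame=2048):
    """Cut a raw capture of the CCD output into individual frames.

    `samples` is a 1-D array of ADC readings; each frame is assumed to
    start with a trigger pulse that rises above `trigger_level`
    (hypothetical values, adjust to the real sensor and ADC range).
    """
    above = samples > trigger_level
    # indices where the signal crosses the trigger level upward
    starts = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    frames = []
    for s in starts:
        # skip the trigger sample itself, then take one frame of pixels
        frame = samples[s + 1 : s + 1 + pixels_per_frame]
        if len(frame) == pixels_per_frame:
            frames.append(frame)
    return np.array(frames)

# Example usage (file name is hypothetical):
# capture = np.load("ccd_capture.npy")
# frames = split_frames(capture)
# spectrum_raw = frames.mean(axis=0)   # average a few frames to reduce noise
```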
There were some real problems, though. The output was not linear (each type of sensor has its own sensitivity curve, and I had no way to build a correction amplifier that follows an irregular curve). Also, the spectrum was a bit compressed on the red side, due to the usual nonlinearity of prisms in dispersing light. Diffraction gratings are more linear, but they have their own issue: they compress the BLUE side a bit more, the opposite of prisms (I wonder if this could be corrected by sending the light through a prism first and then through a grating). Anyway, I suppose it can be done with a little optomechanical work; for me the main problem is not the hardware part, it's the software part.
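On the software side, both issues can in principle be handled after capture instead of with a correction amplifier: divide by a flat-field measurement of a known broadband source to flatten the sensitivity curve, and fit a low-order polynomial from pixel index to wavelength using a few known reference lines (laser lines, for instance) to absorb the nonlinear dispersion. The sketch below is only an illustration under those assumptions; the function name, reference points and array sizes are made up for the example.

```python
import numpy as np

def correct_spectrum(raw, flat, ref_pixels, ref_wavelengths, n_points=600):
    """Sensitivity correction + wavelength calibration for a linear-CCD spectrum.

    raw, flat       : 1-D arrays, one value per CCD pixel
                      (`flat` is a capture of a known broadband source)
    ref_pixels      : pixel indices of a few known spectral lines (>= 3)
    ref_wavelengths : their wavelengths in nm
    All names and values here are assumptions for illustration.
    """
    # 1. Sensitivity correction: divide by the normalized flat-field response
    response = flat / flat.max()
    corrected = raw / np.clip(response, 1e-3, None)

    # 2. Wavelength calibration: fit pixel -> wavelength with a low-order
    #    polynomial, which also absorbs the prism/grating nonlinearity
    coeffs = np.polyfit(ref_pixels, ref_wavelengths, deg=2)
    wl = np.polyval(coeffs, np.arange(len(raw)))

    # 3. Resample onto a uniform wavelength axis so the plot is linear in nm
    #    (assumes wavelength increases with pixel index; flip the arrays otherwise)
    wl_uniform = np.linspace(wl.min(), wl.max(), n_points)
    spectrum = np.interp(wl_uniform, wl, corrected)
    return wl_uniform, spectrum
```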
Hmm, I think I have to start doing some experiments once I get a decent glass grating (or, anyway, a decent grating with the maximum possible lines per mm).
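To see why more lines per mm helps, the grating equation at normal incidence, d sin(theta) = m * lambda, already tells the story: a smaller groove spacing d spreads the same wavelength range over a wider angle. The short sketch below just prints that spread for a few example groove densities (illustrative numbers, not measurements).

```python
import numpy as np

# Grating equation at normal incidence: d * sin(theta) = m * lambda
def first_order_angle(wavelength_nm, lines_per_mm):
    d_nm = 1e6 / lines_per_mm            # groove spacing in nm
    s = wavelength_nm / d_nm             # sin(theta) for m = 1
    if s >= 1.0:
        return None                      # that diffraction order does not exist
    return np.degrees(np.arcsin(s))

# Angular spread of the visible range for a few example groove densities
for lpm in (300, 600, 1200):
    a_blue = first_order_angle(450, lpm)
    a_red = first_order_angle(650, lpm)
    print(f"{lpm:4d} lines/mm: 450 nm at {a_blue:5.1f} deg, "
          f"650 nm at {a_red:5.1f} deg, spread {a_red - a_blue:4.1f} deg")
```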
@Leo: sorry, but those are secondary colors, made from the addition of two different "primary" colors. So, for example, magenta is not a wavelength; it is just the interaction of two different wavelengths in our eyes.