Review: 510nm Direct Green Diodes / Build Photos (DGH-N1, DGH-N2)

Just got a spectrometer reading. Looks like these are 515, not 510.

[Attachment: DG.png (spectrometer plot, 15.3 KB)]





the 520s will look as different from 532 as the 510s do from 520. ~10nm is ~10nm..

Nope.

The visual impact of a change in wavelength is not linear (or, more accurately, is not consistent per nm).

The human-eye-perceived difference between say 490 and 500 would be substantially greater than the difference between, for example, 525 and 535, even though it's a difference of 10 nm in either case.

Or look at 473 vs 490 compared to 520 vs 540. In the latter you have an even larger difference in terms of nm, but a much much less perceptible colour difference to our eyes.

If you look at spectral charts, you'll see that there's a relatively larger portion of the green spectrum (as compared to, say, the blue or red areas) that appears relatively homogeneous to us.
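
If you want to put rough numbers on that, here's a quick Python sketch. It assumes the third-party colour-science package and uses CIE 1976 u'v' chromaticity separation as a crude stand-in for perceived colour difference, so treat it as an illustration rather than a precise model:

Code:
# Rough sketch: compare chromaticity steps for equal (or larger) nm gaps in
# different parts of the spectrum. Assumes the `colour-science` package.
import numpy as np
import colour  # pip install colour-science

def uv_prime(nm):
    """CIE 1976 u'v' chromaticity of a monochromatic source at `nm` nanometres."""
    X, Y, Z = colour.wavelength_to_XYZ(nm)
    denom = X + 15 * Y + 3 * Z
    return np.array([4 * X / denom, 9 * Y / denom])

# Same (or larger) nm gap, very different chromaticity separation:
for lo, hi in [(490, 500), (525, 535), (473, 490), (520, 540)]:
    gap = np.linalg.norm(uv_prime(hi) - uv_prime(lo))
    print(f"{lo} nm vs {hi} nm: u'v' separation ~ {gap:.3f}")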
 
Actually, according to Will, these will vary. They'll be between 510nm and 515nm, unless you want to pay his premium for 515-520nm diodes.
 
Nope.

The visual impact of a change in wavelength is not linear (or, more accurately, is not consistent per nm).

The human-eye-perceived difference between say 490 and 500 would be substantially greater than the difference between, for example, 525 and 535, even though it's a difference of 10 nm in either case.

Or look at 473 vs 490 compared to 520 vs 540. In the latter you have an even larger difference in terms of nm, but a much much less perceptible colour difference to our eyes.

If you look at spectral charts, you'll see that there's a relatively larger portion of the green spectrum (as compared to, say, the blue or red areas) that appears relatively homogeneous to us.


I'll let you know how they look to my eyes.. those are all that matter to me. Subjective science is garbage..

I'm aware that eye sensitivity is different depending on the wavelength, but a 10nm difference in green is a visible difference to my eyes. Just the same as 10nm in red is and 10nm in blue is... to my eyes. And yes, this is experience talking. It's this same experience that makes me thoroughly skeptical of any charts claiming to quantify human eye sensitivity variation on any level. To my eyes, the charts are quite wrong. To someone else's eyes they might be perfectly right. And to a third person's eyes the charts might be off by only a little. So which of the three sets of eyes is "right"? That's the question..

Is the 10nm difference as pronounced in green as it is in blue and red to my eyes? IDK, it's difficult to quantify that too.. but I'd say it's close. I can only rely on spectrometer readings and my own eyeballs when estimating wavelength since they're the only resources available.. hence my previous reply. There is a very strong likelihood (nearly 100% if you ask me) that another person's response will be different, which I should have stated as well.
 
You know it's interesting.. I probably should ask Karl Guttag directly, because so far he seems like he might be the only person I've seen who understands color space, what it is and, most importantly, what it isn't.

After reading a good bit on color space etc, I've put the question of why almost all human responses I've solicited over the years disagree so much with the CIE ideas of sensitivity out there in several venues, both forums and web groups. NO ONE has EVER been able to give me a good, solid answer. I suspect I've been looking in the wrong places. I also highly suspect that the color space charts are being used incorrectly much of the time. When I think of the CIE charts, I think of a reference that is used, for example, to set the color mix on a TV set to something that will be perceived as balanced across a wide range of people, or to set the RGB mix in a whitelight laser projector for the same reasons. But very often in the laser community I see people quoting numbers out to 2 or 3 decimal places, citing the CIE charts as a rock-solid reference that precisely pinpoints each and every human being's color sensitivity. I strongly disagree with that usage.
 
The "number of decimal places" used, is a pretty silly thing to attack. The level of precision you find behind a decimal is just a factor of what you're using as your base reference. If your base reference is "100" and you need to fit 800 measurements into the scale from 1 to 100, then you've got no choice but to use *some* decimal points. If your "base" is 1.0, then you've got to use a whole bunch more. If instead you're using a base of "1,000,000 neural impulses for each 1mW of 555nm photons", then you might not need *any* decimal points of accuracy at all.

I understand your argument about subjectivity, and the lack of consistent perception across the population. I even agree with your argument based on my own more limited anecdotal evidence. However, I think arguing (essentially) "look, there is a lot of variance from person-to-person, yet we're claiming X decimal points of accuracy, and that's a conflict" doesn't make logical sense.

Let me illustrate my point another way:

On the whole, you may think the CIE charts don't hold up well if comparing 473 to 488, or 515 to 532. But at much smaller increments, how could you compare 403nm to 404nm, without using a whole lot of decimal points? By one particular chart, the relative (to 555nm) eye response of 403nm and 404nm light is:

403nm: 0.0005196
404nm: 0.0005796

Now, you may have your thoughts on how 403 or 404 stack up relative to 555, and you may think it's crazy to compute these figures beyond, say, ONE decimal point. But surely you wouldn't argue that our eyes see 403 and 404 exactly the same way? Even rounded to 3 decimal points, we would end up with the absurd proposition that our eyes perceived BOTH 403 and 404nm light as 0.001x as bright as 555.

So while it may indeed be impossible to create any sort of chart that captures everyone's perception accurately, that doesn't mean that there's a problem with calculating a reasonably high degree of accuracy in their relative measures. 0.0005196 vs 0.0005796 may not be perfectly accurate, but it's a *hell of a lot* more accurate than saying 0.001 vs 0.001 or even worse, rounding to one decimal place (0.0 vs 0.0) which would give the impression that we couldn't see it at all.
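
If it helps, here's a trivial Python check of the rounding point, using the two chart values quoted above:

Code:
# A quick illustration of the rounding argument with the chart values above.
v403, v404 = 0.0005196, 0.0005796

print(round(v403, 3), round(v404, 3))   # 0.001 0.001 -> the 403/404 distinction vanishes
print(round(v403, 1), round(v404, 1))   # 0.0 0.0     -> now it looks like we can't see either one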
 
The "number of decimal places" used, is a pretty silly thing to attack.

Not a silly thing to attack at all, even though "attack" is a harsh term.. Answer me this: What is the point of calculating to 5 decimal places if the thing you're calculating can't possibly be quantified to that degree of accuracy? Not only is it a waste of time, but it misleads objective viewers into believing that subjective things that cannot be accurately quantified, in fact can be. I understand your point about needing so many intervals between extremes, hence the use of several decimal places, but I don't think that something as imprecise as this really needs to be calculated in reference to that many intervals. With a change of x nm, there's really no telling precisely how everyone will perceive that change. One thing I can say for absolute certain is that NO human being can consciously perceive a difference of 1nm, even side by side, so there's little point in such a calculation from my view.
 
Not a silly thing to attack at all, even though "attack" is a harsh term.. Answer me this: What is the point of calculating to 5 decimal places if the thing you're calculating can't possibly be quantified to that degree of accuracy? Not only is it a waste of time, but it misleads objective viewers into believing that subjective things that cannot be accurately quantified, in fact can be. I understand your point about needing so many intervals between extremes, hence the use of several decimal places, but I don't think that something as imprecise as this really needs to be calculated in reference to that many intervals. With a change of x nm, there's really no telling precisely how everyone will perceive that change. One thing I can say for absolute certain is that NO human being can consciously perceive a difference of 1nm, even side by side, so there's little point in such a calculation from my view.

It's not "5 degrees of accuracy".

Again, look at:

403nm: 0.0005196
404nm: 0.0005796

What that data tells us is that we're looking at roughly a 10.4% difference in terms of perceived intensity between 403nm and 404nm:
(1-(0.0005196 / 0.0005796))*100

The fact that the original numbers are expressed to seven decimal places DOES NOT MEAN that we're saying a comparison between the perceived brightness of 532nm and 473nm light is determinable to some insane degree of precision.

Your argument is a "whole to part" logical fallacy. Just because at a grand scale we may question the comparisons present in this model between vastly different wavelengths (532nm vs 473nm, for example), it doesn't mean that the smaller components (minute differences between 403nm and 404nm, for example) are inaccurate, even when expressed to great precision.
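
In case anyone wants to check the arithmetic, the exact figure just depends on which of the two values you take as the reference; a couple of lines of Python:

Code:
v403, v404 = 0.0005196, 0.0005796

print(f"{(1 - v403 / v404) * 100:.1f}%")   # ~10.4%, the difference relative to the 404nm value
print(f"{(v404 / v403 - 1) * 100:.1f}%")   # ~11.5%, the same step expressed relative to the 403nm value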
 
It's not "5 degrees of accuracy".

Again, look at:

403nm: 0.0005196
404nm: 0.0005796

What that data tells us is that we're looking at roughly a 10.4% difference in terms of perceived intensity between 403nm and 404nm:
(1-(0.0005196 / 0.0005796))*100

The fact that the original numbers are expressed to seven decimal places DOES NOT MEAN that we're saying a comparison between the perceived brightness of 532nm and 473nm light is determinable to some insane degree of precision.

Your argument is a "whole to part" logical fallacy. Just because at a grand scale we may question the comparisons present in this model between vastly different wavelengths (532nm vs 473nm, for example), it doesn't mean that the smaller components (minute differences between 403nm and 404nm, for example) are inaccurate, even when expressed to great precision.


I highlighted the core of your post. The rest, all those numbers are almost pointless as far as I can tell. Look, you can do as you see fit. I'm not here to stifle you.. all I am is confused as to why it matters what the difference is between 402 and 403nm, calculated to the 5th decimal place, when no human could perceive it and even if they could it could not possibly be quantified to that degree? I'm just confused is all :o. It's discussions like this and the learning experiences that they represent that are what I enjoy most about forums.. I'm not trolling you.. at least not today lol..
 
Anyone with any experience on the blue side of the spectrum will tell you the CIE charts are off. I mean... 7 watts of violet is as bright as 5mW of green? Really? :crackup: It seems to work pretty well with cyan and above though.
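
For what it's worth, that is roughly what the photopic curve predicts. Here's a rough Python check; it assumes the third-party colour-science package, takes "violet" as 405nm, and uses the y-bar column of the CIE 1931 colour matching functions as the luminosity function:

Code:
# Rough check of the "7 W of violet ~ 5 mW of green" figure against the photopic curve.
import colour  # pip install colour-science

def V(nm):
    # Y component of the CIE 1931 CMFs = photopic luminosity function V(lambda)
    return colour.wavelength_to_XYZ(nm)[1]

green_mW = 5
violet_W = green_mW * V(532) / V(405) / 1000
print(f"Per the chart, about {violet_W:.1f} W of 405nm matches {green_mW} mW of 532nm")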
 
It works great when used as intended, absolutely. I'm not arguing against the idea that the charts and other data are a very good tool.

It occurs to me that my argument could be somewhat application dependent, but I can't think of any application where human color perception must be calculated across a sub-nanometer range..

EDIT: To try to clarify, I'm not suggesting that human color perception can't be approximated using mathematics. All I've been saying is that I can't think of a reason in the world to attempt to calculate to that degree of precision. And that's after a LOT of thinking. This specific topic interests me a bit.. Regardless, if a person wants to crunch some numbers just for the sake of crunching numbers I see nothing wrong with that.
 
I highlighted the core of your post. The rest, all those numbers are almost pointless as far as I can tell. Look, you can do as you see fit. I'm not here to stifle you.. all I am is confused as to why it matters what the difference is between 402 and 403nm, calculated to the 5th decimal place, when no human could perceive it and even if they could it could not possibly be quantified to that degree? I'm just confused is all :o. It's discussions like this and the learning experiences that they represent that are what I enjoy most about forums.. I'm not trolling you.. at least not today lol..

I completely understand your complaint (re the accuracy of these measures). However, framing that concern as one that relates to the "degree" to which we express the numbers is incorrect/illogical - it doesn't make sense.

You're misunderstanding what these numbers represent. They are RELATIVE values, not ABSOLUTE values. So "degree of accuracy", or "calculated to the Xth decimal place" isn't a compatible notion with the type of numbers we're evaluating. It just doesn't make sense to look at these numbers that way. We're not talking about "measurements".

EXAMPLE:

Relative to the distance light can travel in 1 hour, here are the distances that my friend Bob and I can travel in the same amount of time:

ME: 0.000000004470442
BOB: 0.000000005960589

Now, that DOES NOT mean that we have "5 degrees of precision". What those numbers are actually expressing, is that I can run 3 miles, and Bob can run 4 miles. It looks like a great deal of precision if you just do a basic "count" of the digits behind a decimal point - but that's really just an arbitrary side effect of how I've expressed the same data.

I could JUST AS EASILY express the same thing, the relative distance Bob and I can travel in the same amount of time, as:

ME: 6
BOB: 8

In both examples, we're using THE SAME data, with the same level of precision. When you're talking about RELATIVE measures like these, it's nonsensical to talk about "decimal levels of precision", because they're just an arbitrary side effect of the way you choose to express the SAME comparisons.
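
Here's the same re-expression done in a few lines of Python, just to show that the digit count is an artifact of the chosen base (the light-speed figure is approximate):

Code:
# Same comparison (3 miles vs 4 miles in an hour) against two different reference scales.
LIGHT_MILES_PER_HOUR = 186_282 * 3600   # miles light travels in one hour (approximate)

me, bob = 3, 4   # miles each of us runs in that hour

for label, base in [("a light-hour", LIGHT_MILES_PER_HOUR), ("half a mile", 0.5)]:
    print(f"Relative to {label}: ME = {me/base:g}, BOB = {bob/base:g}, BOB/ME = {bob/me:.4f}")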
 
I understand that; I just said so a couple of posts ago.

OK, let's refer to your example:

403nm: 0.0005196
404nm: 0.0005796

Could you not have stated the difference with sufficient accuracy by using these numbers:

403nm: 0.00052
404nm: 0.00058

And even more importantly, why does this small a difference matter? What is the purpose of a 1nm calculation, given that no human being can perceive that small a difference and this whole thing is about human perception? Could not the scale be graduated in a coarser manner and still accomplish the task with more than enough precision, given the natural variation across the whole population? That's all I'm asking. You say this is illogical, but it doesn't seem so to me.
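
For what it's worth, the coarser two-significant-figure values land in almost exactly the same place; a quick Python check:

Code:
# Percentage comparison from the full-precision chart values vs the coarser versions above.
full   = (1 - 0.0005196 / 0.0005796) * 100
coarse = (1 - 0.00052 / 0.00058) * 100
print(f"full precision: {full:.2f}%   two significant figures: {coarse:.2f}%")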
 
Lol.. Yeah I'd imagine that about sums it up.

Maybe someday I'll get it.. but thus far, with lasers anyway, it's been good enough for me to look at two wavelengths at the same power and say "well that one's about twice as bright looking..". By referring to various lasers I can estimate pretty well how visible a different wavelength laser will be to me. By referring to those charts in a general manner one can select wavelengths that will give an optimum palette. Is it necessary to differentiate between two wavelengths that are 1nm (or less) apart to achieve an optimum palette when viewed by a large population? Is it necessary to differentiate between two wavelengths that are 1nm (or less) apart to estimate with sufficient accuracy how bright any single wavelength will be to viewers across a large population using the chart? I don't think it is myself.

Different strokes they say..

EDIT: Well, we've ended up a little off topic from where we started.. suffice it to say my original statement that 10nm is 10nm to the eye is inaccurate even by my own understanding of color perception. A 10nm difference might be obvious to one person and not to another.
 