expect to be harvesting 520s (which visually will look almost indistinguishable from DPSS 532).
The 520s will look as different from 532 as the 510s do from 520. ~10nm is ~10nm.
Nope.
The visual impact of a change in wavelength is not linear (or, more accurately, is not consistent per nm).
The human-eye-perceived difference between, say, 490nm and 500nm would be substantially greater than the difference between, for example, 525nm and 535nm, even though it's 10nm in either case.
Or compare 473nm vs 490nm with 520nm vs 540nm. The latter is an even larger difference in terms of nm, but a much, much less perceptible colour difference to our eyes.
If you look at spectral charts, you'll see that a relatively large portion of the green spectrum (as compared to, say, the blue or red areas) appears relatively homogeneous to us.
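If anyone wants to sanity-check this, here's a rough Python sketch that compares those same wavelength pairs as chromaticity distances in the CIE 1976 u'v' diagram, which is designed to be approximately perceptually uniform. It assumes the third-party colour-science package (pip install colour-science) for the CIE 1931 observer data; the helper names uv_prime and chromaticity_gap are just for illustration.

```python
# Rough sanity check, not a rigorous psychophysics model.
# Assumes the "colour-science" package for CIE 1931 2-degree observer data.
import numpy as np
import colour

def uv_prime(wavelength_nm):
    """CIE 1976 u'v' chromaticity of a monochromatic stimulus."""
    X, Y, Z = colour.wavelength_to_XYZ(wavelength_nm)
    denom = X + 15 * Y + 3 * Z
    return np.array([4 * X / denom, 9 * Y / denom])

def chromaticity_gap(wl_a, wl_b):
    """Distance between two wavelengths in u'v' space -- a rough
    proxy for how different the two hues look."""
    return float(np.linalg.norm(uv_prime(wl_a) - uv_prime(wl_b)))

# The wavelength pairs from the argument above: equal (or larger)
# gaps in nm, very different perceptual gaps.
for a, b in [(490, 500), (525, 535), (473, 490), (520, 540)]:
    print(f"{a}nm vs {b}nm: delta u'v' = {chromaticity_gap(a, b):.4f}")
```

If perceived colour spacing were uniform per nm, the four distances would simply scale with the nm gap; instead, the green pairs come out several times smaller than the blue-green pairs.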
The "number of decimal places" used, is a pretty silly thing to attack.
Not a silly thing to attack at all, even though "attack" is a harsh term. Answer me this: what is the point of calculating to five decimal places if the thing you're calculating can't possibly be quantified to that degree of accuracy? Not only is it a waste of time, but it misleads objective viewers into believing that subjective things which cannot be accurately quantified in fact can be.

I understand your point about needing so many intervals between the extremes, hence the use of several decimal places, but I don't think something as imprecise as this really needs to be calculated against that many intervals. With a change of x nm, there's really no telling precisely how everyone will perceive that change. One thing I can say for absolute certain is that NO human being can consciously perceive a difference of 1nm, even side by side, so from my view there's little point in such a calculation.
It's not "5 degrees of accuracy".
Again, look at:
403nm: 0.0005196
404nm: 0.0005796
What that data tells us is that we're looking at roughly a 10.4% difference in perceived intensity between 403nm and 404nm:
(1 - (0.0005196 / 0.0005796)) * 100 ≈ 10.35
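For anyone who wants to verify it, the same arithmetic as a couple of lines of Python:

```python
# Quick check of the calculation above, using the two quoted
# V(lambda) values (photopic luminous efficiency).
v_403 = 0.0005196
v_404 = 0.0005796

# Relative drop in perceived brightness going from 404nm down to 403nm.
diff_pct = (1 - v_403 / v_404) * 100
print(f"{diff_pct:.2f}%")  # prints 10.35%
```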
The fact that the original numbers are expressed to seven decimal places DOES NOT MEAN that we're saying a comparison between the perceived brightness of 532nm and 473nm light is determinable to some insane degree of precision.
Your argument is a "whole to part" logical fallacy. Just because at the grand scale we may question this model's comparisons between vastly different wavelengths (532nm vs 473nm, for example), it doesn't follow that the smaller components (the minute difference between 403nm and 404nm, for example) are inaccurate, even when expressed to great precision.
I highlighted the core of your post. The rest, all those numbers, are almost pointless as far as I can tell. Look, you can do as you see fit; I'm not here to stifle you. All I am is confused as to why it matters what the difference is between 403nm and 404nm, calculated to the fifth decimal place, when no human could perceive it, and even if they could, it could not possibly be quantified to that degree. I'm just confused is all. Discussions like this, and the learning experiences they represent, are what I enjoy most about forums. I'm not trolling you.. at least not today lol..
A 10nm difference might be obvious to one person and imperceptible to another.