It can also be due to the aforementioned stop-down: once the eye meters a certain level of brightness, any additional brightness comes after your eye's light meter has already been pegged.
I.e., the lower the intensity, the more range is left before the needle is pegged.
Looking at the beam from the side avoids much of the bounce-back that increases glare (as when looking directly at the dot), similarly leaving some meter range for the eyes.
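To put a toy model behind that meter metaphor, here's a minimal sketch (my own illustration, assuming a roughly logarithmic, Weber-Fechner-style eye response; the floor value and log base are made up) showing why doubling the output near the top of the range barely moves the needle:

```python
import math

def perceived_steps(lumens, adaptation_floor=0.01):
    # Toy Weber-Fechner-style model: perceived brightness grows with the
    # log of intensity relative to an adaptation floor. Illustrative only;
    # the floor value and log base are arbitrary assumptions.
    return math.log10(lumens / adaptation_floor)

for lumens in (10, 100, 1000, 2000):
    print(f"{lumens:5d} lm -> ~{perceived_steps(lumens):.2f} perceived steps")

# 10 -> 100 lm (a 10x jump) adds a full perceived step, but
# 1000 -> 2000 lm (a 2x jump) adds only ~0.3 of a step: the
# "meter" is nearly pegged, so the extra lumens barely register.
```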
Eyes are lousy at subjectively judging brightness in general. According to my tests at least, they are least challenged when the sources are not TOO bright and they can do side-by-side comparisons.
If I show a light to a subject, ask them to rate its brightness, and then show them the same light the next day and ask the same question, the answers are all over the place, especially if they are conditioned differently beforehand (different night-adaptation level, etc.).
If I simply make the same light brighter or dimmer while they watch its intensity change, as little as a 5% change is noticeable. But if one intensity is on, there's a pause, and then a different intensity is shown, even with a gap of mere seconds, the required change widens by a LOT: closer to a 25% change is needed before they can be sure it went brighter or dimmer.
If the time gap widens, so does the margin of error, to the point that even order-of-magnitude differences can be misjudged.
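Putting rough numbers on those observations: below is a sketch of the detectable-change threshold as a function of the pause between the two brightness levels. The 5% (live change) and ~25% (few-second gap) anchors come from the tests described above; the shape of the curve beyond that is a guess for illustration, not measured data.

```python
import math

def jnd_percent(gap_seconds):
    """Rough minimum percent change in output an observer can reliably
    call brighter or dimmer, given the pause between the two levels.
    The 5% and ~25% anchors come from the informal tests above; the
    exponential rise and slow logarithmic drift are guessed shapes."""
    live = 5.0                                       # change watched live
    pause = 20.0 * (1 - math.exp(-gap_seconds))      # jumps toward ~25% within seconds
    drift = 30.0 * math.log10(1 + gap_seconds / 10)  # memory fades with time
    return live + pause + drift

for gap, label in [(0, "live change"), (3, "3 s pause"),
                   (300, "5 min pause"), (86400, "next day")]:
    print(f"{label:>11}: need ~{jnd_percent(gap):.0f}% change to be sure")
```

In reality, as noted above, day-long gaps can produce order-of-magnitude misjudgments, far beyond what this toy curve suggests; the point is only that the threshold grows with the gap.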