Welcome to Laser Pointer Forums - discuss green laser pointers, blue laser pointers, and all types of lasers

Could you see a laser from the moon?

In an idle moment at work I was wondering about this so I did some calculations to work out how bright a beam would appear for various lasers at various distances. Here are the answers I got (probably all wrong but fun anyway):

A 250 mW pointer with a yellow beam, as viewed "in beam", would have the apparent brightness of a very faint (magnitude +5) star at a distance of ~20,000 miles. I used yellow because the sun was my reference magnitude. Green would be a bit brighter, and so visible from a bit further out, but not enough to materially affect the answer. In order to be seen from the moon with the apparent brightness of a very faint star, a yellow laser would need a power of ~50 watts.

So sadly you would not be able to see the beam of a laser pointer from the moon (unless your pointer were outputting tens of watts!).

However, if you were an astronaut in orbit at 250 miles, it's a different story. A 1 mW yellow pointer would appear like a dimmish star (mag +2.5). A 250 mW pointer would appear as bright as Venus (which is the brightest celestial object in the sky after the sun and the moon). Green would seem even brighter.

So an astronaut could easily see you shining your pointer at him, even if it were a feeble little 1 mW job!

Note that these are figures for what the observer could see if they were viewing the beam directly (i.e. looking into the laser). They are not of course figures for the distance at which the spot could be viewed by the person holding the pointer. Those distances are orders of magnitude less.

[Assumptions: solar irradiance 1 mW per mm², solar mag -26, Venus mag -4, faintest visible star mag +5, beam divergence 1 mrad, point-source retinal image 9 microns diameter, sun retinal image 200 microns diameter, pupil diameter 7 mm, no allowance for atmospheric attenuation]
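The arithmetic behind these figures can be sketched in a few lines of Python. This is a point-source approximation built only from the stated assumptions (solar irradiance 1 mW/mm² = 1000 W/m², solar mag -26, divergence 1 mrad); it ignores the retinal-image sizes in the full calculation, so the numbers come out close to, but not exactly, the figures quoted above:

```python
import math

def apparent_magnitude(power_w, distance_m, divergence_rad=1e-3,
                       solar_irradiance=1000.0, solar_mag=-26.0):
    """Apparent magnitude of an in-beam laser at a given distance.

    Treats the laser as a point source: its irradiance at the observer
    is the power spread over the beam spot, and magnitude follows
    m = m_sun - 2.5 * log10(E_laser / E_sun).
    """
    spot_radius = divergence_rad * distance_m / 2   # beam radius at distance, m
    irradiance = power_w / (math.pi * spot_radius ** 2)  # W/m^2 at the observer
    return solar_mag - 2.5 * math.log10(irradiance / solar_irradiance)

MILE = 1609.34  # metres per mile

# 250 mW pointer from ~20,000 miles: a very faint star
print(apparent_magnitude(0.25, 20000 * MILE))   # ~ +5.3
# 1 mW pointer from 250 miles: a dimmish star
print(apparent_magnitude(0.001, 250 * MILE))    # ~ +1.8
# 250 mW pointer from 250 miles: about as bright as Venus (mag -4)
print(apparent_magnitude(0.25, 250 * MILE))     # ~ -4.2
```

The point-source version lands within about a magnitude of the quoted figures, which is good enough for a "could you see it" question.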
 





Nice calculations ^^
I have no idea if this is right, but it's fun to think of. =)

How hard could you push a green laser if you had it all cooled with liquid nitrogen? I mean, you do get IR lasers doing several hundreds of watts (is that IR, in the CO2 tubes?). I guess you would fry the green crystals, but those could be cooled too.. :)
 
Stianbl said:
Nice calculations ^^
I have no idea if this is right, but it's fun to think of. =)

How hard could you push a green laser if you had it all cooled with liquid nitrogen? I mean, you do get IR lasers doing several hundreds of watts (is that IR, in the CO2 tubes?). I guess you would fry the green crystals, but those could be cooled too.. :)


You wouldn't get anything. If you cool lasers too far, they stop lasing. I cooled my green labby down to the point where it would only lase at <0.5 mW, and all I was using was a TEC.

-Adam
 
Those are convincing calculations, but they only work in ideal conditions (i.e., perfect darkness). In the real world, the light from the laser would quickly be drowned out by atmospheric interference and light pollution. There would be no way for a 1 mW laser to be visible from 250 miles away.
 
Of course I am assuming that there are no clouds between the notional observer (in space or on the moon) and the surface!

Atmospheric attenuation would actually make little difference to the figures. You can test it this way: sunlight in space is about 1,300 watts per square metre, while on the earth's surface at the equator in clear, dry conditions it is about 1,000 watts per square metre. So the attenuation is less than 30%. That is less than the effect of green versus yellow on brightness perception, so I think the figures in my calcs are broadly accurate.
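On the magnitude scale, a sub-30% loss is a small correction. A one-liner makes the point, using the quoted space vs surface irradiances:

```python
import math

def attenuation_mag(transmission):
    """Magnitude penalty for a given atmospheric transmission fraction.

    A factor-of-T dimming corresponds to -2.5 * log10(T) magnitudes.
    """
    return -2.5 * math.log10(transmission)

# ~1000 W/m^2 at the surface vs ~1300 W/m^2 in space => ~77% transmission
print(attenuation_mag(1000 / 1300))  # ~ 0.28 magnitudes
```

A ~0.3 magnitude penalty barely moves a star between brightness classes, so it does not change any of the conclusions above.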

Of course beam visibility in the horizontal plane is a different matter. The curvature of the earth's surface imposes limits and of course there is far more attenuation - hence the low brightness of a setting sun. Calculations for in-beam viewing of a near horizontal beam on earth would be much more complicated and would give much shorter viewing distances.

But my calculations are for a beam that is shone vertically upwards towards a notional observer in space or on the moon - and that beam only has to pass through a few miles of increasingly thin and dry air.

Light pollution is a different matter. I am of course assuming that the beam is shone upwards from a dark area (countryside, sea, desert etc) and not from the middle of a city!
 
This wouldn't work at all. You forgot something: Rayleigh scattering. The same phenomenon that allows us to see the beam is what limits its visibility. Basically, the atmosphere scatters the beam until there's nothing left of it. It would take a multiwatt laser to be seen from space.

-Mark
 
davidgdg said:
would have the apparent brightness of a very faint (magnitude +5) star at a distance of ~ 20,000 miles.

Any star at 20,000 miles would be anything but faint. ;)

Stianbl said:
How hard could you push a green laser if you had it all cooled with liquid nitrogen?

There are many factors to laser performance in addition to temperature.
 
A star at 20,000 miles would make this conversation very difficult :)

My point was that Rayleigh scattering is inefficient, irrespective of the apparent brightness of the source or its distance. The sun loses less than 30% shining through the atmosphere, and not all of that is Rayleigh. The same is true for any source, so there is no reason to think that a laser beam would lose more than 30% (and if green, less).
 
I do see some practical problems with observing an earth-based laser from the moon:

In order not to be overwhelmed by the sunlight reflecting off the earth, you would have to be looking at the totally dark half of the planet. This is possible from the moon at some points in time, though not very often.

Additionally, you would probably have to be standing in the dark on the moon not to be overwhelmed by ambient light.

This combination gives a problem: being both in the dark on the moon and looking at only the dark side of the earth is only possible during a lunar eclipse (viewed from the earth). I am not sure any astronaut has ever been on the moon during one, though that might prove interesting even without lasers ;)
 
davidgdg said:
In an idle moment at work I was wondering about this so I did some calculations to work out how bright a beam would appear for various lasers at various distances. Here are the answers I got (probably all wrong but fun anyway):

What was the beam divergence you used for these calculations ?
 
Benm said:
I do see some practical problems with observing an earth-based laser from the moon:

In order not to be overwhelmed by the sunlight reflecting off the earth, [highlight]you would have to be looking at the totally dark half of the planet[/highlight]. This is possible from the moon at some points in time, though not very often.

Additionally, [highlight]you would probably have to be standing in the dark on the moon [/highlight]not to be overwhelmed by ambient light.

This combination gives a problem: being both in the dark on the moon and looking at only the dark side of the earth is only possible during a lunar eclipse (viewed from the earth). I am not sure any astronaut has ever been on the moon during one, though that might prove interesting even without lasers ;)

I would think that if you are seeing the darkest part of the earth, then the moon is totally dark, as it's in the middle of the earth's shadow.

But even in the shadow some light is refracted around the earth.

I believe this is a real image taken from the moon as the earth was in transit in front of the sun. If the camera could still image the dark area of the earth (assuming it's imaging in visible light), then there is never total darkness.

[Image: tsemoon_gartstein_720cropped.jpg (earth transiting the sun, seen from the moon)]
 
Also consider: at a divergence of 1 mrad over 20,000 miles (in a vacuum) your spot size is about 20 miles in diameter, and that is a lot of energy spread out. The intensity would be roughly 1/10^14 that of the original 0.125-inch spot.

Imagine a laser outputting only ~0.00000000000001 mW; that's about how bright it would be.

These calculations were thrown together as I was leaving work, so like everything I do when I'm rushing, they are prone to inaccuracy.
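The dilution factor is just the square of the ratio of spot diameters. A quick sketch, using the 1 mrad divergence and 0.125-inch reference spot from the posts above:

```python
MILE_IN = 63360.0  # inches per mile

def dilution(distance_miles, divergence_rad=1e-3, aperture_in=0.125):
    """Ratio of the beam's area at distance to a small reference spot.

    Spot diameter grows linearly with distance (divergence * distance),
    so intensity falls as the square of the diameter ratio.
    """
    spot_d_in = divergence_rad * distance_miles * MILE_IN  # spot diameter, inches
    return (spot_d_in / aperture_in) ** 2

d = dilution(20000)
print(f"area ratio ~ {d:.2e}")                 # ~ 1.03e14
print(f"1 mW dilutes to {1.0 / d:.1e} mW")     # ~ 1e-14 mW per 0.125-in spot
```

At 20,000 miles the 20-mile spot is about 10 million times the reference diameter, hence the ~10^14 area ratio.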
 
That's cool! I was looking into optical free space data transmission (OFSDT) to make a P2P network connection from a friend's house to mine, but instead I used focused microwaves because it was cheaper.

But these things are still awesome! Capable of up to gigabit speeds at over 5 miles LOS. They look pretty cool too.

[Image: FreeSpaceImage1.jpg (free-space optical link hardware)]
 
They did measure the earth-moon distance incredibly accurately with a laser, using the retroreflectors placed on the moon during the Apollo missions....
 