why do lasers seem to burn "better" than sunlight?

ixfd64
Joined: Sep 12, 2007 | Messages: 1,174 | Points: 48
From what I've read, the intensity of sunlight that reaches the Earth's surface is about 1 kW/m², or 100 mW/cm². For instance, a 500 mW laser would supposedly have the same burning power as a ~2.5 cm convex lens.

But one thing I've noticed is that laser light seems to burn better than focused sunlight at the same power. Case in point, I have a 6 cm magnifying glass (which would focus π * (6 / 2)² ≈ 28.3 cm² = 2.83 W of sunlight) that could barely ignite newspaper except maybe on a very sunny day. In comparison, even 500 mW lasers seem to easily light things on fire.

I know sunlight and laser light are very different, but what makes lasers so "good" at burning things? For example, does coherent light transfer energy more efficiently than sunlight? The only thing I can think of is the difference in irradiance: my 6 cm lens can hardly focus the spot down below 4 mm, while a laser beam can easily be focused to under 1 mm. Am I missing any other factors?
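To put rough numbers on the comparison, here's a quick back-of-the-envelope script (a minimal sketch; the 1 kW/m² figure and the 6 cm lens / 4 mm / 1 mm spot sizes are just the values quoted above, not measurements):

Code:
import math

SOLAR_IRRADIANCE_W_CM2 = 0.1  # 1 kW/m^2 = 100 mW/cm^2, idealized "full sun"

def lens_collected_power_w(diameter_cm):
    """Sunlight collected by a lens of the given diameter, ignoring losses."""
    area_cm2 = math.pi * (diameter_cm / 2) ** 2
    return area_cm2 * SOLAR_IRRADIANCE_W_CM2

def spot_irradiance_w_cm2(power_w, spot_diameter_mm):
    """Average irradiance in a focused spot."""
    spot_area_cm2 = math.pi * (spot_diameter_mm / 20) ** 2  # mm diameter -> cm radius
    return power_w / spot_area_cm2

print(lens_collected_power_w(6.0))         # ~2.83 W collected by the 6 cm lens
print(spot_irradiance_w_cm2(2.83, 4.0))    # ~22 W/cm^2 for that sunlight in a 4 mm spot
print(spot_irradiance_w_cm2(0.5, 1.0))     # ~64 W/cm^2 for 500 mW in a 1 mm spot

So even in this idealized comparison, the 500 mW laser focused to 1 mm already beats the 6 cm lens at a 4 mm spot on irradiance, despite delivering less than a fifth of the total power.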
 





Joined: Mar 13, 2013 | Messages: 315 | Points: 0
I don't think laser light inherently "burns" better.

It's all about power density, and for the magnifying glass you're right that you can't focus it to a dot as small as your laser. That's the most important factor for burning: at the same power, a 4 mm dot has only 1/16 the power density of a 1 mm dot.
Also, your standard magnifying glass isn't broadband AR coated, so you may lose maybe 30-50% of the sunlight to reflection. And sunlight isn't monochromatic; it has a wide spectrum, unlike your laser, so it may not be absorbed as well as a 445 nm beam if you're burning white paper.
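The 1/16 figure is just the ratio of spot areas, and any reflection loss scales the power before it ever reaches the focus. A minimal sketch (the 30% loss used below is only the low end of the range guessed above, not a measured value):

Code:
import math

def power_density_ratio(d_small_mm, d_large_mm):
    """How much more concentrated the smaller spot is, for equal power."""
    return (d_large_mm / d_small_mm) ** 2

def delivered_power_w(collected_w, lens_transmission):
    """Power actually reaching the focus after reflection/absorption losses."""
    return collected_w * lens_transmission

print(power_density_ratio(1.0, 4.0))   # 16.0 -> the 4 mm dot has 1/16 the power density
print(delivered_power_w(2.83, 0.7))    # ~1.98 W left of 2.83 W if the lens loses 30%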
 

DrSid
Joined: Jul 17, 2010 | Messages: 1,506 | Points: 48
What he said ^^

Could be a nice experiment with an LPM; that should measure the losses in the optics.
 
Joined: Feb 5, 2008 | Messages: 6,252 | Points: 83
ixfd64 said:
"But one thing I've noticed is that laser light seems to burn better than focused sunlight at the same power. Case in point, I have a 6 cm magnifying glass (which would focus π * (6 / 2)² ≈ 28.3 cm² = 2.83 W of sunlight) that could barely ignite newspaper except maybe on a very sunny day. In comparison, even 500 mW lasers seem to easily light things on fire."

Uhh man, not sure what alien-technology lasers you're using :D but from experience, only the transition from 1.5 W to 2 W in 445 nm power can be described as "now it's easy-ish to set a piece of paper or cardboard on fire".

500 mW will not set anything on actual fire from the business end, except maybe a match. It'll leave marks on pretty much everything, though.

Anyway, like it's already been said, it's all about power density. Total output power isn't actually nearly as important as we make it out to be.
 
Joined: Mar 27, 2013 | Messages: 2,416 | Points: 63
Exactly. If you take a 5 W laser and put a diffuser in front of the aperture, you won't be able to burn anything unless you stuff the thing down the aperture.
 
Joined: Sep 12, 2007 | Messages: 9,399 | Points: 113
1) 1 kW/m² is very optimistic. You might get under 300 W/m² depending on your latitude, season, time of day, atmospheric conditions, etc.

2) A lot of that energy is in the IR and some in the UV. Since most common optics only pass visible light and some NIR, the power loss through the lens can be quite high.

3) The sun is not a point source, so it cannot be focused to a terribly sharp point. This means the power density can be quite a bit lower than with a high-power pointer.
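To put a number on point 3 (a rough sketch; the 20 cm focal length and the 0.2 mm laser spot are illustrative assumptions, not anyone's measured values): the Sun subtends about 0.53°, so a simple lens of focal length f cannot image it into a spot smaller than roughly f times that angle.

Code:
import math

SUN_ANGULAR_DIAMETER_RAD = 0.53 * math.pi / 180  # ~9.3 mrad; the Sun is an extended source

def min_solar_spot_mm(focal_length_cm):
    """Smallest solar image a simple lens can form: roughly f * angular size."""
    return focal_length_cm * 10 * SUN_ANGULAR_DIAMETER_RAD  # cm -> mm

def spot_irradiance_w_cm2(power_w, spot_diameter_mm):
    """Average irradiance in the focused spot."""
    return power_w / (math.pi * (spot_diameter_mm / 20) ** 2)

spot_mm = min_solar_spot_mm(20.0)             # ~1.9 mm even with a perfect lens
print(spot_mm)
print(spot_irradiance_w_cm2(2.0, spot_mm))    # ~74 W/cm^2 if ~2 W actually gets through
print(spot_irradiance_w_cm2(0.5, 0.2))        # ~1600 W/cm^2 for 500 mW focused to 0.2 mm

That geometric limit, stacked on top of the atmospheric and transmission losses from points 1 and 2, is why a hand lens never reaches laser-like power density.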
 



