I have done a search for this but could not find the answers I wanted. It would be great to get some help from anyone willing to explain something that I am sure is quite basic to all you experts, but difficult for someone like me, new to the field, to get my head around.
I am trying to work out the Output Power of lasers and how they translate to photography. As you probably all know, most photographers work in 'stops' to determine how to get the correct exposure (very broadly speaking). I am not looking for a photography lesson btw, just trying to quickly get to my point.
So what I am trying to understand is this: if I need a 532 nm laser (for example) that is one stop less powerful than a 200 mW one (i.e. half as bright, photographically speaking), what would its output power be? Photographic/practical logic would suggest 100 mW. However, Adam from DL mentioned this in recent correspondence: "Normally when optical power increased 6 times then the brightness doubled". So mathematically speaking, would this be a 3.125 mW laser (200 ÷ 2⁶ = 3.125)? That seems too weak/small a figure to me.
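To make the arithmetic behind my question concrete, here is a quick sketch of the three figures I keep arriving at. The 100 mW and 3.125 mW values are my own calculations; the ~33 mW value is just my attempt to apply the quoted "6 times the power per doubling of brightness" rule literally, and may not be what Adam meant.

```python
# Three candidate readings of "one stop less" than a 200 mW laser.

P = 200.0  # mW, the reference output power

# 1) Photographic logic: one stop down = half the power.
half_power = P / 2  # 100.0 mW

# 2) Taking the quoted rule at face value: if brightness doubles
#    only when power rises 6x, then half the perceived brightness
#    would need roughly one sixth the power.
sixth_power = P / 6  # about 33.3 mW

# 3) Where my 3.125 mW figure came from: treating the "6" as
#    six halvings of power, i.e. dividing by 2**6 = 64.
my_figure = P / 2**6  # 3.125 mW

print(half_power, round(sixth_power, 1), my_figure)
# → 100.0 33.3 3.125
```

Which of these (if any) is the right way to think about it is exactly what I am hoping someone can explain.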
Any help understanding laser physics greatly appreciated. Thanks, Mark