If it's a buck driver, feeding it double the voltage will (ideally) make it draw half the current it would from a single cell.
I.e. if your laser driver draws 1 A when fed 4.2 V, then under perfect conditions it would draw 500 mA from 8.4 V.
It's all about power: the driver will draw however much current it needs at the given battery voltage to sustain the diode current you set...
BUT the driver has to be rated to accept an input that high.
Hope this helps you understand it.
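For a concrete picture, here's a minimal sketch of that power balance in Python. The numbers just restate the 1 A @ 4.2 V example above, and the efficiency parameter is a hypothetical knob, not a measured value:

# Buck power balance: V_in * I_in * efficiency = P_diode (hypothetical numbers)
def battery_current(diode_power_w, supply_voltage_v, efficiency=1.0):
    # Current the driver pulls from the supply to deliver diode_power_w to the diode.
    return diode_power_w / (supply_voltage_v * efficiency)

diode_power_w = 4.2 * 1.0  # the "1 A at 4.2 V" example, taken as ~4.2 W into the diode

print(battery_current(diode_power_w, 4.2))  # 1.0 A from one cell (ideal, lossless buck)
print(battery_current(diode_power_w, 8.4))  # 0.5 A from two cells in series (ideal)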
I measured something similar a while ago:
I set a Flexdrive to source 4.8 W of power to a 445 nm diode. I don't remember the exact voltages/currents, but it was 4.8 W into the diode.
Then I measured the current drawn on the battery side (the arithmetic behind the efficiency figures is sketched after the notes below):
@ 4.2 V, it draws 1.28 A, giving 5.38 W in, that's ~89% efficiency.
@ 3.7 V, it draws 1.65 A, giving 6.11 W in, that's ~79% efficiency.
@ 3.3 V, it draws 2 A, giving 6.6 W in, that's ~73% efficiency.
@ 2.92 V, it draws 2.01 A, giving 5.87 W in, that's ~82% efficiency (?).
- During all tests, the power into the diode was 4.8 W.
- I used a bench supply at the school where I study. Note there are no LPM readings; all power figures are electrical measurements.
- These tests were done for a battery-lifetime study of mine, so don't assume 100% accuracy.
- They were done at an electrical engineering school, so the multimeters should be accurate... maybe.
- I remember playing with higher input voltages, and the Flexdrive hit peak efficiency at around 5 or 5.5 V; I can't remember exactly. I'll test again.
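Just to show the arithmetic those efficiency figures come from (plain V x I on the supply side, with the 4.8 W diode power taken as fixed), a quick sketch:

diode_power_w = 4.8  # diode-side power, held constant in all tests

# (supply voltage in V, measured supply current in A) from the list above
readings = [(4.2, 1.28), (3.7, 1.65), (3.3, 2.0), (2.92, 2.01)]

for volts, amps in readings:
    power_in = volts * amps                 # electrical power drawn from the supply
    efficiency = diode_power_w / power_in   # diode power / supply power
    print(f"{volts:5.2f} V  {amps:.2f} A  {power_in:.2f} W in  {efficiency:.0%}")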