I don't know what you mean by 'drawing voltage'. Voltage and current are like pressure and flow in plumbing. A laser diode should be driven with a controlled current; even when a voltage is specified, it's only the voltage typically observed when the correct amount of current flows. A capacitor is a buffer of voltage and can deliver massive current when given the opportunity (that's why you always have to short out, i.e. discharge, the caps before connecting them to the laser diode).
A resistor works as a current limiter because it obeys Ohm's law: when a current of I amps flows through a resistance of R ohms, the voltage across it drops by U = I*R volts. This can be exploited when both the current and voltage of the diode are known. Say the diode is rated 150mA at 2.4V and you have a 6V supply; that leaves 3.6V excess. A resistor drops 3.6V at 150mA when its value is 3.6/0.15 (remember the milli prefix?) = 24 ohm.
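To make that arithmetic concrete, here's a tiny Python sketch of the series-resistor calculation (the 6V supply and the 150mA / 2.4V diode are just the example figures above, not data for any specific diode):

```python
def series_resistor(v_supply, v_diode, i_diode):
    """Resistor that drops the excess voltage at the desired diode current."""
    return (v_supply - v_diode) / i_diode

# Example from above: 6 V supply, diode at 2.4 V and 150 mA (= 0.15 A).
r = series_resistor(6.0, 2.4, 0.15)
print(f"{r:.0f} ohm")  # ~24 ohm
```

You'd then pick the nearest standard resistor value and check its power rating (3.6V * 0.15A = 0.54W, so a quarter-watt resistor would cook).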
A resistor is often considered a very poor driver because it does not hold the current at the value the laser diode specifications require. More specifically, when the input voltage rises, the resistor will NOT absorb all the excess voltage, because that would require a different resistance value; instead, the current rises until a new balance is found. Conversely, when the batteries are only half dead and deliver less voltage, the current drops more drastically than the voltage does. The result: a weak laser, and batteries that deplete ever more slowly as they get emptier. Once the batteries sag to 2.4V, no current flows at all.
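A quick sketch of that effect, under the simplifying assumption that the diode's forward voltage stays fixed at 2.4V (real diodes vary a bit with current) and using the 24 ohm resistor from the example:

```python
V_DIODE = 2.4  # assumed constant forward voltage (a simplification)
R = 24.0       # series resistor from the example above

def diode_current(v_supply):
    """Current through the diode at a given supply voltage; 0 below threshold."""
    return max(0.0, (v_supply - V_DIODE) / R)

for v in (6.0, 4.8, 2.4):
    print(f"{v:.1f} V -> {diode_current(v) * 1000:.0f} mA")
# Dropping the supply 20% (6.0 -> 4.8 V) cuts the current 33% (150 -> 100 mA),
# and at 2.4 V the current is zero.
```

That's the "current drops more drastically than the voltage" behavior in numbers.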
Moreover, new alkaline batteries and fully charged NiMH batteries deliver significantly higher voltages than their rating, which WILL damage your laser if you blindly assume they deliver the rated voltage.
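Same simplified model again (constant 2.4V drop, 24 ohm resistor, both from the example above); the ~1.6V per fresh alkaline cell is a rough typical figure, not a data-sheet value:

```python
V_DIODE = 2.4
R = 24.0
I_RATED = 0.150  # the 150 mA rating from the example

v_fresh = 4 * 1.6  # four fresh alkaline cells at ~1.6 V each (rough figure)
i = (v_fresh - V_DIODE) / R
print(f"{i * 1000:.0f} mA")  # ~167 mA, over the 150 mA rating
```

A ~10% overcurrent may not kill the diode instantly, but laser diodes are notoriously intolerant of running past their rating, which is why a real constant-current driver is preferred over a bare resistor.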