The diode will pass whatever current the driver pulls through it, and the voltage drop across the diode at that current (different diodes have different forward-voltage curves) wastes power. The power lost is the voltage drop multiplied by the current, so a common silicon diode with about a 0.7 V drop at 1 amp burns an extra 700 mW, which is not insignificant. At 2 amps you are losing 1.4 watts through that diode, all of it converted to heat. A lower-drop diode such as a Schottky cuts the loss to roughly half of what a standard silicon diode produces. If I were soldering a diode in series with the power lead to the driver, I would use a Schottky of some kind rated for at least double the current you will be pulling through it. If you can clamp the diode to some kind of heat sink, that will help keep it cool at higher current levels.
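Just to put numbers on it, here is a quick back-of-the-envelope sketch in Python (the 0.7 V and 0.35 V forward drops are assumed typical values, not from any specific datasheet; real parts vary with current and temperature):

```python
# Rough diode power-loss calculator: P = Vf * I.
# Forward voltages below are assumed typical values, not datasheet figures.

def diode_loss_w(forward_v: float, current_a: float) -> float:
    """Power dissipated in the diode as heat, in watts."""
    return forward_v * current_a

SILICON_VF = 0.7    # typical silicon rectifier forward drop
SCHOTTKY_VF = 0.35  # typical Schottky drop, roughly half of silicon

for amps in (1.0, 2.0):
    si = diode_loss_w(SILICON_VF, amps)
    sk = diode_loss_w(SCHOTTKY_VF, amps)
    print(f"{amps:.0f} A: silicon {si:.2f} W, Schottky {sk:.2f} W")
```

At 2 amps that works out to the 1.4 W of heat mentioned above for silicon, versus about 0.7 W for a Schottky.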
Yeah, not having that diode in there is an advantage; you just have to make sure you watch which way you put the battery in, or get a driver designed so it doesn't care if the polarity is reversed.