This typically refers to the current being run through the diode. With the constant-current drivers most people use, the diode will take whatever voltage it needs as long as you control the current running through it. Every diode can handle a different amount of current, but most diodes in common use have been tested enough that there are known "safe" ranges where they can be operated. For instance, Blu-ray diodes are typically considered OK running at 38 mA of continuous current. Different red diodes can be run at different currents, depending on the diode and its application: long open-can diodes are routinely run at 420 mA when well heatsinked, and even higher if actively cooled. Senkat's diodes are not quite as robust, so they are typically run at slightly lower currents for continuous use with no active cooling.
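As a rough illustration of how such a driver sets current: one common hobbyist design is a constant-current regulator built around an LM317, which holds about 1.25 V across a sense resistor, so the set current is roughly I = 1.25 V / R. The sketch below just does that arithmetic for the currents mentioned above; the LM317 choice, and the assumption that your driver works this way, are for illustration only, not a recommendation for any particular diode.

# Sizing the sense resistor for an LM317-based constant-current driver,
# a common hobbyist design (assumed here purely for illustration).
# The LM317 regulates about 1.25 V between its OUT and ADJ pins, so a
# resistor R between them sets the current to roughly I = 1.25 / R.
# (The regulator also needs a couple of volts of headroom above the
# diode's forward voltage, which is why the diode can "take whatever
# voltage it needs" as long as the supply is high enough.)

V_REF = 1.25  # volts, nominal LM317 reference

def sense_resistor(current_a):
    """Resistor value (ohms) that sets the given current (amps)."""
    return V_REF / current_a

def resistor_dissipation(current_a):
    """Power (watts) burned in that resistor: P = V_REF * I."""
    return V_REF * current_a

# Currents from the text: 38 mA (Blu-ray) and 420 mA (open-can red).
for label, amps in [("Blu-ray, 38 mA", 0.038), ("open-can red, 420 mA", 0.420)]:
    print(f"{label}: R ~ {sense_resistor(amps):.1f} ohm, "
          f"dissipating ~ {resistor_dissipation(amps):.2f} W")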
Generally speaking, the more current you run through a diode, the more photons it will emit. There are diminishing returns: at low currents, a bit of extra current can increase output a lot, but the same increase at higher currents will not raise light output nearly as much. And diodes can only take so much current. Not all of the input power comes out as light, since nothing is perfectly efficient; the rest becomes heat inside the diode. This heat can and will damage the diode. One common failure mode is for the cavity mirrors to fail, known as COD, or catastrophic optical damage. This is what turns a laser diode into a simple LED, as it can't lase without mirrors.
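To put rough numbers on the diminishing-returns idea, here is a toy light-output-versus-current model: nothing below a threshold current, a roughly linear rise above it, and a quadratic rollover term standing in for self-heating. Every constant in it is made up for illustration; real diodes have their own threshold current, slope efficiency, and thermal behavior.

# Toy L-I (light output vs. current) curve for a laser diode.
# Below threshold the diode only glows like a dim LED (treated as zero
# here); above threshold output climbs roughly linearly, and a quadratic
# term stands in for thermal rollover. All constants are illustrative.

I_TH = 0.030    # threshold current, amps (made up)
SLOPE = 1.0     # slope efficiency, watts per amp (made up)
ROLLOVER = 3.0  # thermal rollover coefficient, W/A^2 (made up)

def output_power(amps):
    """Optical output (watts) at the given drive current (amps)."""
    if amps <= I_TH:
        return 0.0
    excess = amps - I_TH
    return max(0.0, SLOPE * excess - ROLLOVER * excess ** 2)

# Equal 20 mA steps buy less and less extra light as current rises:
prev_mw = output_power(0.040) * 1000
for ma in range(60, 161, 20):
    mw = output_power(ma / 1000) * 1000
    print(f"{ma} mA -> {mw:5.1f} mW  (gain {mw - prev_mw:+5.1f} mW)")
    prev_mw = mw

In this toy curve each additional 20 mA buys less light than the last, and past the rollover point extra current would only add heat, which is exactly the kind of stress that kills the facets (COD).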