I think this has largely been solved by the availability of these nice, cheap thermal meters that have become so common. Thermal meters are generally wavelength-independent, as long as the calorimeter absorbs all wavelengths evenly, so there's much less room for error. When we started, several people just had photodiode meters and the like, which need a wavelength-specific scaling factor/calibration value to measure 405nm light, and those scaling factors were all over the place, hence this effort.
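To illustrate why those scaling factors matter so much, here's a rough sketch of the arithmetic behind a photodiode meter. The responsivity numbers and function names are purely illustrative, not from any real meter's datasheet; the point is just that the reported power is only as good as the per-wavelength calibration value baked in.

```python
# Illustrative responsivity table (A/W) for a silicon photodiode.
# Real values vary by device; silicon responsivity is typically
# lower in the violet than in the red.
RESPONSIVITY_A_PER_W = {
    405: 0.12,  # hypothetical value at 405nm
    650: 0.42,  # hypothetical value at 650nm
}

def photodiode_power_mw(photocurrent_ma, wavelength_nm):
    """Convert measured photocurrent (mA) to optical power (mW).

    Any error in the responsivity table propagates directly into
    the reported power -- this is the calibration problem that made
    the 405nm photodiode readings so inconsistent.
    """
    r = RESPONSIVITY_A_PER_W[wavelength_nm]
    return photocurrent_ma / r  # mA / (A/W) -> mW

# The same 0.060 mA photocurrent implies very different powers
# depending on which wavelength calibration is applied:
print(photodiode_power_mw(0.060, 405))  # 0.5 mW
print(photodiode_power_mw(0.060, 650))  # ~0.14 mW
```

A thermal head sidesteps all of this: it converts absorbed light to heat, so (given a flat absorber) one calibration covers every wavelength.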
The thermal meters, almost all from the same source and seemingly very self-consistent, appear to have solved this problem quite nicely. So I'd say it was a valiant effort, and one we can now bring to an end. I agree with Dave: donate 'em.
Although I will add, I've learned in the past year+ that meter calibration is a big problem no matter where you go or what you do. Even in a well-funded university lab, we struggle with measurements continuously. Of course we're making more complicated measurements with more complicated equipment, but the point remains: even in high-end labs, the fundamentals are still fundamental.