Another problem-set problem, inspired by an email today:
Imagine measuring the brightness of two stars, one of which is a 100,000 K blackbody, and one of which is a 3,000 K blackbody. You are interested in the flux ratio in the V band. In one experiment, you perform this pair of measurements with a (flat-fielded, calibrated, etc) CCD with a standard V-band filter on it. In the other, you perform this pair of measurements with a bolometer with the same filter on it. Do you expect the flux ratio you measure to be the same in the two cases, and, if not, what is the expected difference? Imagine that both detectors have near-unity quantum efficiency over the wavelength range of interest, and that you are capable of making accurate, calibrated, sky-subtracted measurements in both cases.
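One way to estimate the expected difference numerically: the CCD counts photons while the bolometer measures energy, so the CCD-measured ratio carries an extra weighting of λ/hc inside the bandpass integral. Here is a sketch, under the (hypothetical) simplifying assumptions of a top-hat 505–595 nm stand-in for the V-band filter and ideal Planck spectra:

```python
import numpy as np

h = 6.626e-34  # Planck constant (J s)
c = 2.998e8    # speed of light (m / s)
k = 1.381e-23  # Boltzmann constant (J / K)

def planck_lambda(lam, T):
    """Blackbody spectral radiance B_lambda(T) per unit wavelength."""
    return (2.0 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

# Hypothetical top-hat stand-in for the V band (505-595 nm),
# sampled on a uniform grid so simple sums act as integrals
# (the grid spacing cancels in every ratio below).
lam = np.linspace(505e-9, 595e-9, 2001)

T_hot, T_cool = 1.0e5, 3.0e3  # the two blackbody temperatures (K)

B_hot = planck_lambda(lam, T_hot)
B_cool = planck_lambda(lam, T_cool)

# Bolometer: responds to energy, so integrate B_lambda directly.
energy_ratio = B_hot.sum() / B_cool.sum()

# CCD: counts photons, so weight the integrand by lam / (h c)
# (the constant h c cancels in the ratio).
photon_ratio = (B_hot * lam).sum() / (B_cool * lam).sum()

print("bolometer ratio:", energy_ratio)
print("CCD ratio:      ", photon_ratio)
print("CCD / bolometer:", photon_ratio / energy_ratio)
```

The ratio of ratios reduces to the mean photon wavelength of the hot star in the band divided by that of the cool star; since the hot star's photons within the V band are on average bluer, the CCD-measured flux ratio comes out slightly smaller than the bolometric one, at the few-percent level for a bandpass this wide.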