In a low-research day, I had a short call with Andy Casey about various things Cannon-related. I pitched the very simple project of looking at how our results degrade with spectroscopic resolution and signal-to-noise. We have done signal-to-noise tests, but we have never degraded the spectroscopic resolution, which ought to be very informative. There is folklore that you can't do anything at resolutions less than 20,000 (or 30,000, or 100,000, etc.). Is there a resolution below which we can't extract abundances? Or do things degrade smoothly?
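To make the proposed test concrete, here is a minimal sketch (my own, not Casey's code) of how one would degrade a spectrum from one resolving power to another: convolve with a Gaussian whose width is the quadrature difference of the two line-spread functions. It assumes the wavelength grid is uniform in log wavelength, so a fixed resolving power is a constant width in pixels; the function name and the toy spectrum are purely illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def degrade_resolution(wave, flux, R_in, R_out):
    """Convolve a spectrum down from resolving power R_in to R_out.

    Assumes `wave` is uniformly spaced in ln(lambda), so a fixed
    resolving power corresponds to a constant Gaussian sigma in pixels.
    """
    if R_out >= R_in:
        return flux.copy()
    # kernel FWHM (in d(lambda)/lambda): quadrature difference of the LSFs
    fwhm = np.sqrt(1.0 / R_out**2 - 1.0 / R_in**2)
    dlnlam = np.log(wave[1] / wave[0])          # pixel size in ln(lambda)
    sigma_pix = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / dlnlam
    return gaussian_filter1d(flux, sigma_pix)

# toy spectrum: flat continuum plus one narrow absorption line
wave = np.exp(np.linspace(np.log(15000.0), np.log(15100.0), 2048))
flux = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 15050.0) / 0.3) ** 2)
low = degrade_resolution(wave, flux, R_in=22500.0, R_out=5000.0)
```

The line gets shallower and broader at low resolution, but its equivalent width is (approximately) conserved, which is part of why some abundance information might survive to quite low resolution.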
Another issue with the current versions of The Cannon is that the abundances are not strictly interpretable as pure abundances: Because we let the system learn whatever relationships it wants, it can use (say) titanium lines to help estimate the magnesium abundance. In general it will do this whenever there is an empirical covariance between titanium and magnesium in the training set (and in general there will be). So we have to be careful how we interpret its output. Melissa Ness is working on a solution to this, which is to censor the wavelengths available to some (or all) elements, restricting each element's fit to the wavelengths that we know (from atomic physics) are conceivably relevant. This will lead to much more interpretable results. If the censoring is correct, it should also lead to better results!
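The censoring idea can be sketched in a few lines (this is my own toy version, not Ness's implementation): at each pixel, fit the flux as a function of the labels, but apply a boolean mask that zeroes out any element the atomic physics says cannot affect that wavelength, so that (in this toy) titanium pixels cannot inform the magnesium label. The mask layout, coefficients, and fake data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stars, n_pix = 200, 50
labels = rng.normal(size=(n_stars, 2))        # columns: [Mg/H], [Ti/H]

# censor[j, k] is True if label k is allowed to affect pixel j
censor = np.zeros((n_pix, 2), dtype=bool)
censor[:25, 0] = True                          # Mg may use the first half
censor[25:, 1] = True                          # Ti may use the second half

# fake training spectra: each pixel truly responds to one element only
true_coef = np.where(censor, 0.1, 0.0)
flux = 1.0 - labels @ true_coef.T + 0.01 * rng.normal(size=(n_stars, n_pix))

# train: per-pixel linear least squares, restricted to allowed labels
coef = np.zeros((n_pix, 2))
for j in range(n_pix):
    keep = censor[j]
    A = np.column_stack([np.ones(n_stars), labels[:, keep]])
    sol, *_ = np.linalg.lstsq(A, flux[:, j], rcond=None)
    coef[j, keep] = sol[1:]                    # censored labels stay at zero
```

By construction the trained model cannot pick up titanium-magnesium covariance through disallowed pixels, which is exactly what makes the censored coefficients interpretable as responses to the named element.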
How important resolution is will definitely be a function of the element. I bet you can get away with quite low resolutions for some elements.