iteratively reweighted least squares: Gaussian Process edition

Yesterday and today, Mykytyn and I worked out the generalization of iteratively reweighted least squares (a method for down-weighting outliers that is slightly, slightly better than sigma-clipping) to the case of a Gaussian Process noise model. We take the standard algorithm and replace the residual scaled by the noise with the deviation from the conditional mean prediction scaled by the sum in quadrature of the observational noise and the GP conditional variance. We implemented this simple algorithm on Mykytyn's quasar time-series data and it looks like it works beautifully. Now he is going to see if our inferences look better with the re-weighted uncertainties.
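The update described above might be sketched as follows. This is a minimal NumPy sketch, not the authors' implementation: the squared-exponential kernel, the leave-one-out form of the conditional prediction, and the soft down-weighting function are all assumptions I am filling in, since the post doesn't specify them.

```python
import numpy as np

def rbf_kernel(x1, x2, amp=1.0, scale=1.0):
    """Squared-exponential kernel (an assumed choice; the post names no kernel)."""
    return amp**2 * np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / scale)**2)

def gp_irls_weights(t, y, sigma, n_iter=5, q=3.0):
    """Sketch of iteratively reweighted least squares under a GP noise model.

    Each iteration: condition the GP on the data with the current
    (weight-inflated) per-point noise, form the leave-one-out conditional
    mean and variance at each point, scale the deviation by the quadrature
    sum of the observational noise and the GP conditional variance, and
    update the weights with a soft down-weighting function.
    """
    n = len(y)
    w = np.ones(n)                        # weights in (0, 1]; 1 = fully trusted
    for _ in range(n_iter):
        K = rbf_kernel(t, t)
        C = K + np.diag(sigma**2 / w)     # inflate noise for down-weighted points
        Cinv = np.linalg.inv(C)
        alpha = Cinv @ y
        # leave-one-out conditional prediction (Rasmussen & Williams, sec. 5.4.2)
        var_loo = 1.0 / np.diag(Cinv)     # conditional variance + inflated noise
        mu_loo = y - alpha * var_loo
        # deviation scaled by the quadrature sum of the observational noise
        # and the GP conditional (latent) variance
        latent = np.maximum(var_loo - sigma**2 / w, 0.0)
        z = (y - mu_loo) / np.sqrt(sigma**2 + latent)
        # soft-clipping weight (a hypothetical choice, not from the post)
        w = q**2 / (q**2 + z**2)
    return w
```

With a smooth signal plus Gaussian noise and one planted outlier, the outlier's weight drops well below the typical weight after a few iterations, which is the down-weighting behavior the post describes.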
