2022-06-26

local linear regressions

For some reason, even though I dislike deep learning, I love local linear regressions. My friends tell me that ReLU networks are locally linear, so I am really just a hypocrite. Anyway, today Adrian Price-Whelan and I built a regression in which we find nearest neighbors (among the training-set objects) in the space of ESA Gaia DR3 BP/RP spectral coefficients and, among those neighbors, we fit a locally linear model to predict the parallax of the test object. Technically we use a clever trick called the “schmag” (which should probably be called the reduced parallax), in which we rescale the parallax into a quantity proportional to the inverse square root of the luminosity. Why? It's so we can use the parallax errors fairly, and include training-set objects with negative parallaxes.
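
Here is a minimal sketch of the nearest-neighbor, locally-linear idea in Python (my illustration, not our actual code). It assumes the training-set BP/RP coefficients sit in an array X_train and the labels (say, the reduced parallaxes) in y_train, and it uses scikit-learn for the neighbor search; the neighbor count k and the plain (unweighted) least-squares fit are placeholder choices, whereas in practice the fit should be weighted by the parallax uncertainties.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def local_linear_predict(X_train, y_train, x_test, k=64):
        # Find the k nearest training objects in the space of spectral coefficients.
        nn = NearestNeighbors(n_neighbors=k).fit(X_train)
        _, idx = nn.kneighbors(x_test[None, :])
        Xk, yk = X_train[idx[0]], y_train[idx[0]]

        # Fit a linear model (with intercept) to just those neighbors.
        # In practice this fit would be weighted by the parallax uncertainties.
        A = np.hstack([np.ones((k, 1)), Xk])
        coeffs, *_ = np.linalg.lstsq(A, yk, rcond=None)

        # Evaluate the local linear model at the test point.
        return coeffs[0] + x_test @ coeffs[1:]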

Hill I will die on: If you cut your sample to high-SNR parallaxes or positive parallaxes, you will bias any regressions you do to predict parallaxes or distances or distance moduli!
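
To see why, here is a toy demonstration of my own (not anything from this project): simulate true parallaxes, add Gaussian noise, and then cut on measured signal-to-noise. The objects that survive the cut are preferentially the ones that scattered high, so their measured parallaxes systematically overestimate their true parallaxes, while the full sample shows no such offset; the same logic applies to a cut on positive measured parallaxes. The population and noise level below are made up purely for illustration.

    import numpy as np

    rng = np.random.default_rng(17)

    # Made-up population: true parallaxes (mas) plus homoscedastic Gaussian noise.
    true_parallax = rng.exponential(scale=0.5, size=200_000)
    sigma = 0.3
    obs_parallax = true_parallax + sigma * rng.normal(size=true_parallax.size)

    # The tempting cut: keep only high-SNR (here SNR > 5) measured parallaxes.
    keep = obs_parallax / sigma > 5.0

    # The full sample is unbiased on average; the cut subsample is not, because
    # objects that scattered high are kept preferentially.
    print("full sample: mean(obs - true) = %+.4f mas"
          % np.mean(obs_parallax - true_parallax))
    print("SNR > 5 cut: mean(obs - true) = %+.4f mas"
          % np.mean(obs_parallax[keep] - true_parallax[keep]))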
