Today I finished the zeroth draft of a (first-author; gasp!) paper about linear regression with large numbers of parameters. My co-author is Soledad Villar (JHU). The paper shows that when you are fitting a flexible model, like a polynomial or a Fourier series, you can have more parameters than data with no problem, and in fact you often do better in that regime, even in predictive accuracy for held-out data. It also shows that as the number of parameters goes to infinity, your linear regression becomes a Gaussian process if you choose your regularization correctly. It is designed to be like a textbook chapter, so we are faced with the question: Where to publish (other than arXiv, which is a given)?
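
Here is a minimal numerical sketch of that over-parameterized regime, not the paper's actual experiments: the toy function, the Gaussian taper on the feature amplitudes, and the minimum-norm fit via numpy's lstsq are all illustrative choices. It fits Fourier features to a handful of noisy points and keeps increasing the number of parameters past the number of data.

import numpy as np

rng = np.random.default_rng(17)

def truth(x):
    # smooth "unknown" function we try to recover from noisy samples
    return np.sin(3.0 * x) + 0.5 * np.cos(7.0 * x)

n_train = 21
x_train = np.sort(rng.uniform(0.0, 2.0 * np.pi, n_train))
y_train = truth(x_train) + 0.1 * rng.normal(size=n_train)
x_test = np.linspace(0.0, 2.0 * np.pi, 200)

def features(x, p, width=6.0):
    # constant term plus sine/cosine pairs, with amplitudes that decay
    # with frequency; the decay plays the role of the regularization
    # (prior) that controls the large-p behavior
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < p:
        scale = np.exp(-0.5 * (k / width) ** 2)
        cols.append(scale * np.cos(k * x))
        if len(cols) < p:
            cols.append(scale * np.sin(k * x))
        k += 1
    return np.stack(cols, axis=1)

for p in (5, 15, 101, 1001):  # from under- to heavily over-parameterized
    X = features(x_train, p)
    # minimum-norm least-squares fit: for p > n this is the smallest-norm
    # coefficient vector among the infinitely many exact interpolators
    beta = np.linalg.lstsq(X, y_train, rcond=None)[0]
    y_pred = features(x_test, p) @ beta
    mse = np.mean((y_pred - truth(x_test)) ** 2)
    print(f"p = {p:5d}   held-out MSE = {mse:.4f}")

The frequency taper is what stands in for choosing your regularization correctly: it is what fixes the kernel, and hence the Gaussian process, that the regression converges to as the number of parameters grows.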