The day started at #astrohackny with Foreman-Mackey and me arguing about convolutions of Gaussians. The question is: Consider a model (probability of the data given parameters) with two (linear) parameters of importance and 150 (linear) nuisance parameters. There is a very weak Gaussian prior on the nuisance parameters. How do you write down the marginalized likelihood such that you only have to do a 2x2 least squares, not a 152x152 least squares? I had a very strong intuition about the answer but no solid argument. Very late at night I demonstrated that my intuition is correct, by the method of experimental coding. Not very satisfying, but my abilities to complete squares with high-dimensional linear operators are not strong!
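The "experimental coding" demonstration can be sketched as follows. For a linear model y = A θ + B φ + noise with a Gaussian prior φ ~ N(0, λ²I) on the nuisance parameters, marginalizing φ out is equivalent to folding the prior into the noise covariance, C = σ²I + λ² B Bᵀ, after which only a small generalized least squares in θ remains. This is a minimal numerical check of that identity (my own toy setup, with 10 nuisance parameters rather than 150 for brevity; all variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
N, P, K = 200, 2, 10   # data points, important params, nuisance params
sigma, lam = 0.1, 3.0  # noise std, Gaussian prior std on nuisances

A = rng.normal(size=(N, P))   # design matrix for the important parameters
B = rng.normal(size=(N, K))   # design matrix for the nuisance parameters
theta_true = np.array([1.5, -0.7])
phi_true = lam * rng.normal(size=K)
y = A @ theta_true + B @ phi_true + sigma * rng.normal(size=N)

# Big solve: (P+K)x(P+K) regularized least squares over ALL parameters,
# with the Gaussian prior entering as a ridge term on the nuisances only.
M = np.hstack([A, B])
R = np.diag(np.concatenate([np.zeros(P), np.ones(K) / lam**2]))
p_full = np.linalg.solve(M.T @ M / sigma**2 + R, M.T @ y / sigma**2)

# Small solve: marginalize the nuisances into the noise covariance,
# then do only a PxP generalized least squares for theta.
C = sigma**2 * np.eye(N) + lam**2 * (B @ B.T)
Cinv_A = np.linalg.solve(C, A)                       # C^{-1} A
theta_marg = np.linalg.solve(A.T @ Cinv_A, Cinv_A.T @ y)

print(np.allclose(p_full[:P], theta_marg))  # the two estimates agree
```

Because everything is Gaussian, the θ components of the joint solution coincide exactly with the marginalized solve. The N×N covariance C looks expensive, but it is a low-rank update of a diagonal, so the Woodbury identity keeps its inversion cheap.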
Taisiya Kopytova (MPIA) is visiting NYU for a couple of months, to work on characterizing directly imaged extra-solar planets. We discussed the simultaneous fitting of photometry and spectroscopy, one of my favorite subjects! I, of course, recommended modeling the calibration (or, equivalently, continuum-normalization) issues simultaneously with the parameter estimation. We also discussed interpolation (of the model grid) and MCMC sampling and the likelihood function.
At Pizza Lunch at Columbia, Chiara Mingarelli (Caltech) talked about the Pulsar Timing Array and its project to detect the stochastic background of gravitational waves. The beautiful thing about the experiment is that it detects the motion of the Earth relative to the pulsars, not the individual motions of the pulsars, and it does so using time correlations in timing residuals as a function of angle between the pulsars. The assumption is that the ball of pulsars is far larger than the relevant wavelengths, and that different pulsars are causally unconnected in time. Interesting to think about the "multiple hypotheses" aspects of this with finite data.
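The cross-pulsar correlation as a function of angular separation that makes this detection possible is the standard Hellings–Downs curve. A minimal sketch of it (my addition, not from the talk; normalized so the correlation is 1/2 for distinct pulsars at zero separation):

```python
import numpy as np

def hellings_downs(theta):
    """Expected timing-residual correlation for two distinct pulsars
    separated by angle theta (radians) on the sky."""
    x = (1.0 - np.cos(theta)) / 2.0
    # x * log(x) -> 0 as x -> 0; guard against log(0) at theta = 0.
    xlogx = np.where(x > 0.0, x * np.log(np.maximum(x, 1e-300)), 0.0)
    return 1.5 * xlogx - 0.25 * x + 0.5

# Correlation at a few separations; the curve dips negative near 90 deg
# and recovers to +1/4 for antipodal pulsar pairs.
angles = np.deg2rad([0.0, 60.0, 90.0, 180.0])
print(hellings_downs(angles))
```

The quadrupolar shape of this curve is exactly what lets the array separate an Earth-term gravitational-wave signal from per-pulsar noise, which has no such angular correlation structure.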