I spent the day at Extreme Precision Radial Velocities at Yale. It is a great meeting, because it is very focused on the instrumentation and code that underlie radial-velocity planet search and characterization. Today was a stats-heavy day, with me, Eric Ford (PSU), and Tom Loredo (Cornell) leading off with pedagogical talks. I gave an entirely new (for me) talk about noise modeling, and it was followed by absolutely excellent questions (every question pointed out a talk slide I should have made). Loredo made a nice point, which is that statistics is not a method or tool; it is a language or framework for communicating about quantitative questions. I couldn't agree more!
At lunch, Ana Bonaca organized a gathering of probabilistic reasoners to discuss asteroseismology with Sarbani Basu (Yale). This gave us an opportunity to feel out some of the issues that would arise if we tried to build a probabilistic model (a forward model of the time-domain data) to replace the standard practice of Fourier transformations (or periodograms or the like). That was productive and useful.
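To make the forward-model idea concrete, here is a minimal sketch (not anything we actually wrote at lunch): a single sinusoidal oscillation mode fit to irregularly sampled data by maximizing a Gaussian likelihood, with the linear amplitudes profiled out. All frequencies, amplitudes, and noise levels below are invented for illustration; scanning this likelihood over frequency plays the role of a periodogram, but the generative form makes it straightforward to add more realistic noise terms later.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy asteroseismology-like data: one oscillation mode plus white noise,
# sampled at irregular times. All numbers here are hypothetical.
t = np.sort(rng.uniform(0.0, 30.0, 200))     # observation times (days)
true_freq = 0.7                              # cycles per day (invented)
y = 1.5 * np.sin(2.0 * np.pi * true_freq * t + 0.3) \
    + rng.normal(0.0, 0.5, t.size)
yerr = np.full_like(t, 0.5)

def profiled_lnlike(freq, t, y, yerr):
    """Gaussian ln-likelihood of a single sinusoid at frequency `freq`,
    with amplitude, phase, and offset profiled out by weighted least squares."""
    A = np.column_stack([np.sin(2.0 * np.pi * freq * t),
                         np.cos(2.0 * np.pi * freq * t),
                         np.ones_like(t)])
    w = 1.0 / yerr**2
    ATA = A.T @ (A * w[:, None])
    ATy = A.T @ (w * y)
    resid = y - A @ np.linalg.solve(ATA, ATy)
    return -0.5 * np.sum(w * resid**2)

# Scan the profiled likelihood over a frequency grid; the peak is the
# forward-model analog of a periodogram peak.
freqs = np.linspace(0.1, 2.0, 1901)
lnl = np.array([profiled_lnlike(f, t, y, yerr) for f in freqs])
best_freq = freqs[np.argmax(lnl)]
```

The payoff over a plain Fourier transform is that every ingredient here (the sampling, the per-point uncertainties, the noise model) appears explicitly in the likelihood, so correlated noise or multiple modes can be added without changing the framework.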
In the afternoon, one talk that particularly stood out was by Xavier Dumusque (CfA) about The Keplerian Fitting Challenge. He made fake radial-velocity data, filled with difficult but realistic noise sources, and challenged groups to find and characterize the injected signals. He did a great job describing the successes and failures of the different groups, and even awarded nice bottles of wine to the two top-performing teams. This project, like the GREAT projects for weak lensing, is an important community-building and critical-review exercise for difficult data-analysis challenges.
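The injection-recovery setup behind such a challenge can be sketched very simply. Below is a hypothetical version: a circular-orbit Keplerian signal (a pure sinusoid in radial velocity) injected into white noise, then blindly recovered by chi-squared minimization over a period grid. The period, semi-amplitude, and noise level are made up, not Dumusque's actual values, and the real challenge data also contained correlated stellar noise, which is what made it hard.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical injection: a circular orbit gives a sinusoidal RV signal.
# All parameter values here are invented for illustration.
t = np.sort(rng.uniform(0.0, 365.0, 80))   # observation epochs (days)
P_true, K, phi = 41.0, 3.0, 1.2            # period (d), semi-amplitude (m/s), phase
rv = K * np.sin(2.0 * np.pi * t / P_true + phi) \
     + rng.normal(0.0, 2.0, t.size)        # white noise only; the real
                                           # challenge had stellar noise too

def chi2(P, t, rv, sigma=2.0):
    """Chi-squared at trial period P, fitting the sinusoid's two linear
    amplitudes and a constant offset by least squares."""
    A = np.column_stack([np.sin(2.0 * np.pi * t / P),
                         np.cos(2.0 * np.pi * t / P),
                         np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(A, rv, rcond=None)
    return np.sum(((rv - A @ coeffs) / sigma) ** 2)

# Blind recovery: the period minimizing chi-squared on a grid.
periods = np.linspace(10.0, 100.0, 9001)
P_best = periods[np.argmin([chi2(P, t, rv) for P in periods])]
```

Scoring a challenge entry then reduces to comparing recovered parameters like `P_best` against the injected truth, which is exactly the comparison that decided who got the wine.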