Jake Vanderplas (UW), internet-famous computational data-driven astrophysicist, arrived at NYU today for a couple of days. He showed us some absolutely great results on the objective design of photometric systems for future large imaging surveys (like LSST). His method follows exactly my ideas about how this should be done (it is a scoop, from my perspective): he computes the information delivered by the photometric bandpasses about the quantities of interest for the observed objects, as a function of exposure time. Fadely, Vanderplas, and I discussed which properties of the bandpasses and the survey observing strategy he should permit to vary. Ideally, it would be everything, at fixed total mission cost! He has many non-trivial results, not the least of which is that the bandpasses you want depend on the signal-to-noise ratio at which you expect to be working.
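To make the idea concrete, here is a toy sketch of my own (not Vanderplas's actual code, and with made-up band centers and a blackbody source model that are purely illustrative): the Fisher information that a set of top-hat bandpasses delivers about a single stellar parameter (temperature), as a function of exposure time, assuming pure Poisson photon noise.

```python
import numpy as np

# Hedged toy model: Fisher information about blackbody temperature T
# delivered by top-hat photometric bands, under Poisson photon noise.
# All band definitions and units here are illustrative assumptions.

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def photon_flux(wavelength_nm, T):
    """Relative blackbody photon flux density (arbitrary units)."""
    lam = wavelength_nm * 1e-9
    return 1.0 / (lam**4 * np.expm1(H * C / (lam * K * T)))

def band_counts(T, centers_nm, width_nm, exptime):
    """Expected photon counts in each top-hat band; scales with exptime."""
    counts = []
    for c0 in centers_nm:
        lam = np.linspace(c0 - width_nm / 2.0, c0 + width_nm / 2.0, 64)
        counts.append(np.sum(photon_flux(lam, T)) * (lam[1] - lam[0]) * exptime)
    return np.array(counts)

def fisher_info(T, centers_nm, width_nm, exptime, dT=1.0):
    """One-parameter Fisher information; Poisson noise means var = mean."""
    mu = band_counts(T, centers_nm, width_nm, exptime)
    dmu = (band_counts(T + dT, centers_nm, width_nm, exptime) - mu) / dT
    return np.sum(dmu**2 / mu)

bands = [400.0, 550.0, 700.0, 850.0]  # hypothetical band centers, nm
i1 = fisher_info(6000.0, bands, 100.0, exptime=1.0)
i2 = fisher_info(6000.0, bands, 100.0, exptime=2.0)
# In this Poisson-limited toy, the information grows linearly with
# exposure time; real optimizations would vary the bands themselves.
```

Maximizing a quantity like this over band centers and widths, at fixed total observing time, is one way to read "objective design"; the dependence of the optimum on the noise level is exactly the kind of result he showed.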
In the afternoon, Hou, Goodman, Fadely, Vanderplas, and I had a conversation about Hou's recent work on full marginalization of the likelihood function. In the case of exoplanet radial-velocity data, he has been finding that our simple "multi-canonical" method is faster and more accurate than the much more sophisticated "nested sampling" method he has implemented. We don't fully understand all the differences and trade-offs yet, but since the multi-canonical method is novel for astrophysics, we decided to raise its priority in Hou's paper queue.
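The quantity both methods target, the fully marginalized likelihood Z = ∫ L(θ) p(θ) dθ, can be illustrated on a toy problem. The sketch below is mine, not Hou's code, and it uses plain Monte Carlo sampling from the prior rather than the multi-canonical or nested-sampling machinery; it only makes the target of both methods concrete.

```python
import numpy as np

# Toy illustration (not Hou's implementation) of the fully marginalized
# likelihood: a unit Gaussian likelihood under a flat prior on [-10, 10],
# with Z estimated by simple Monte Carlo averaging over prior draws.
rng = np.random.default_rng(42)

def log_likelihood(theta):
    # unit Gaussian centered at zero, standing in for an RV-fit likelihood
    return -0.5 * theta**2 - 0.5 * np.log(2.0 * np.pi)

theta = rng.uniform(-10.0, 10.0, size=200_000)  # draws from the prior
Z_mc = np.mean(np.exp(log_likelihood(theta)))

# Essentially all of the Gaussian's mass lies inside the prior range,
# so the true value is close to 1/20 = 0.05.
```

Multi-canonical and nested sampling both exist because this naive estimator fails in realistic problems: in higher dimensions the prior almost never lands in the tiny region where the likelihood is large, so cleverer reweighting or compression of the prior volume is needed.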