big models

What a day! There were talks of various lengths by Valluri (Michigan), Thalmann (Amsterdam), Morganson, Bañados, Deacon, and Goldman (all MPIA). So much to report that I will report nothing except that Pan-STARRS calibration is under control; it is only data access that needs work, and high-dynamic-range imaging efforts at MPIA could benefit from the stuff Fergus and I are working on!

At lunch Finkbeiner (Harvard) showed up, and we discussed how to model the dust in the Milky Way in two (projected) and three dimensions. I showed him some technology to deal with the "more parameters than data" non-problem, including regularizations that really only act when the data are failing to inform. I stuck with frequentist approaches, because we still don't understand how to do exceedingly enormous non-parametric (think: infinitely parameterized) problems with full posterior output. There is a huge literature on this, so maybe one day we will have a breakthrough. I also stuck with methods that are fast, because if you want to build models with billions of parameters (and we do), you care.
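Here is a toy sketch of the kind of regularization I mean (an illustrative example, not the actual dust-modeling code): a linear least-squares fit with a very weak ridge penalty. Parameters the data constrain are essentially untouched by the penalty; parameters the data say nothing about get pulled to the regularizer's default instead of blowing up.

```python
import numpy as np

# Two measurements, two parameters, but both measurements touch only x[0];
# x[1] appears in no measurement, so the data cannot constrain it.
A = np.array([[1.0, 0.0],
              [1.0, 0.0]])
y = np.array([2.0, 2.2])
lam = 1e-6  # weak ridge penalty: negligible wherever the data inform

# Ridge solution: minimize ||A x - y||^2 + lam ||x||^2,
# i.e. solve (A^T A + lam I) x = A^T y -- a fast linear solve.
x = np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ y)

# x[0] lands at the data-driven value (~2.1, barely shifted by lam);
# x[1] collapses to zero, where the regularizer puts it.
print(x)
```

At scale the same structure survives: with a sparse design matrix and a sparse iterative solver (conjugate gradient, say), this kind of fit stays tractable even with enormous numbers of parameters.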

1 comment:

  1. I'm curious what the frequentist approaches were. Note that a point estimation procedure is not inherently frequentist -- it becomes so if you start evaluating its performance based on frequency properties.