interferometry, meet probability

Great morning chats with Jahnke (MPIA) about the Euclid survey and calibration strategy, and with Kapala (MPIA) about comparing H-alpha emission to C+ emission in M31. After that, Malte Kuhlmann (MPI-IS) arrived to discuss generative probabilistic approaches to radio interferometry (think: replacing CLEAN). We discussed his compressed-sensing-inspired methods (which produce nicely regularized point estimates) and confronted him with some highly astronomical considerations: How do we create something that is not just righteous but also gets adopted by a significant user base? And: How do we make it so that the output of our code can be used to evaluate significance and propagate uncertainty more responsibly than what is currently industry standard in radio astronomy? On the former, a key idea is that whatever we do has to fit into the standard radio-imaging workflow. On the latter, the key idea is that we need to exercise a justified likelihood function. I have a very good feeling about this project. Kuhlmann's background is in math and statistics, which is a good background for bringing important new ideas into astrophysics. The day ended with a great talk by Watkins (MPIA) about omega Cen, the content of which also inspired a spirited discussion of globular clusters on the 17:48 bus.
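The "justified likelihood function" is standard enough to sketch: a calibrated interferometer delivers complex visibilities, each a sample of the Fourier transform of the sky at one (u, v) point, corrupted by (approximately) independent complex Gaussian noise. A minimal sketch of that likelihood follows; the function name, array shapes, and the direct (non-gridded) Fourier transform are all illustrative assumptions, not any real package's API:

```python
import numpy as np

def visibility_log_likelihood(image, uv, data, sigma):
    """Gaussian log-likelihood of complex visibilities `data` (shape [M])
    given a sky `image` (shape [N, N]), baseline coordinates `uv`
    (shape [M, 2], in cycles per image width), and noise level `sigma`.

    Assumes independent complex Gaussian noise on each visibility,
    the standard model for a calibrated interferometer.
    """
    n = image.shape[0]
    # Pixel grid in image-plane coordinates, centered on the image.
    l = np.arange(n) - n // 2
    ll, mm = np.meshgrid(l, l, indexing="ij")
    # Direct Fourier transform of the image at the sampled (u, v) points.
    phase = -2j * np.pi * (uv[:, 0, None] * ll.ravel()[None, :] / n
                           + uv[:, 1, None] * mm.ravel()[None, :] / n)
    model = np.exp(phase) @ image.ravel()
    resid = data - model
    # Up to a constant; real and imaginary parts each have variance sigma^2.
    return -0.5 * np.sum(np.abs(resid) ** 2 / sigma ** 2)
```

With something like this in hand, significance testing and uncertainty propagation become posterior questions rather than properties of a deconvolution heuristic.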


  1. Forgive my radio astronomy naiveté, but hasn't this been done a lot of times before? Gull and Daniell were doing it in the 70s (albeit with a funny prior based on "MaxEnt" and use of an optimizer instead of sampling).

  2. Yes, many times! But no one has delivered a piece of software that has actually replaced CLEAN in practice, I think. Also, the landscape of "noise propagation" and "significance testing" has evolved substantially since then.

  3. Suggestion: write a new module for AIPY and make a pull request.
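For readers who have not seen them, the "compressed-sensing-inspired" point estimates mentioned above have the general flavor of L1-regularized least squares on the visibilities. A toy sketch using ISTA (iterative soft thresholding) on a generic complex measurement matrix; everything here is an illustrative assumption, not the method discussed in the post:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the L1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista_image(A, data, lam=0.1, n_iter=500):
    """Minimize 0.5 * ||A x - data||^2 + lam * ||x||_1 by ISTA.

    A: complex measurement matrix (M x N), e.g. a Fourier sampling operator.
    data: complex visibilities (M,).
    Returns a real, sparsity-promoted image vector x (N,).
    """
    # Step size 1/L, with L the largest eigenvalue of the Hermitian A^H A.
    step = 1.0 / np.linalg.eigvalsh(A.conj().T @ A).max()
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = np.real(A.conj().T @ (A @ x - data))
        x = soft_threshold(x - step * grad, step * lam)
    return x
```

The output is exactly the kind of nicely regularized point estimate referred to above: great for imaging, but silent about significance and uncertainty, which is where the likelihood-based approach is meant to go further.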