At the reading group #NYCastroML, I said a few words about chapter 5 of the book, in which Bayesian approaches are introduced. I emphasized the dimensional way of looking at probability densities, which is the approach to debugging probability expressions that I advocate here. I also said that the main reason to be a Bayesian is that it gives you the ability to marginalize away nuisance parameters. In most other respects, being Bayesian doesn't buy you much additional capability, and many people who think "Bayes!" just want an estimator in the end anyway. As I occasionally say here, my view is that when you present your observational results, they should be not in the form of posterior pdfs but in the form of likelihood functions, possibly with the nuisance parameters marginalized out.
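To make the marginalization point concrete (in generic notation of my own, not anything from the reading group): if \(\theta\) are the parameters of interest and \(\alpha\) the nuisance parameters, marginalization replaces the full likelihood \(p(D \mid \theta, \alpha)\) with

\[ \mathcal{L}(\theta) \equiv \int p(D \mid \theta, \alpha)\, p(\alpha \mid \theta)\, \mathrm{d}\alpha , \]

where \(p(\alpha \mid \theta)\) is a prior pdf over the nuisances. The dimensional view keeps this honest: \(p(\alpha \mid \theta)\) carries units of \(\alpha^{-1}\), the \(\mathrm{d}\alpha\) cancels them, and \(\mathcal{L}(\theta)\) comes out with the same units as \(p(D \mid \theta, \alpha)\), as any likelihood must.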
After the reading group I discussed the scope and content of the nearly finished stream-fitting paper with Price-Whelan and Johnston. Very late in the day I spoke with Alekh Agarwal (Microsoft Research) about generalizing distributed computing methods beyond map-reduce (and the like). He was pessimistic that anything structurally different would gain more than it would cost.