Bayesian spectrum analysis

I spent my research time today reading Bretthorst's book on Bayesian spectrum analysis [one big PDF file]. It is a beautiful and useful document; I think many of the ideas in this book will be useful for the exoplanet problem. Roweis pointed me to this book; Yavin made me read it.

One small comment on this excellent book, which I am compelled by God and Man to make: Bretthorst, like Jaynes, is a believer in assuming that errors are Gaussian, because that is the most conservative thing you can do if you have a noise variance estimate and nothing else. This is technically correct, and beautiful to see demonstrated. However, it is a very dangerous argument, because it only applies when you somehow, magically, know your noise variance. You never do; at best you know the curvature at the mode of the noise distribution (if you are lucky). In most real systems the variance is dominated by rare outliers, so no finite experiment is likely to provide a good estimate of it. Furthermore, even if you do know the variance, how would you know that you know? You would have to take fourth moments to confirm it, and I have never seen an experiment ever in the history of science in which fourth moments are accurately measured. Finally, the conservativeness-of-Gaussian argument is a maximum-entropy argument subject to a strict variance constraint. Jaynes and Bretthorst should know better: you never have absolutely perfect knowledge of anything; the noise should be found through an inferential process, not treated as a constrained, exact math problem!
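The outlier point is easy to demonstrate numerically. Here is a minimal sketch, assuming Student-t noise with 3 degrees of freedom as a stand-in for outlier-prone noise (it has a finite true variance but an infinite fourth moment), and using the median absolute deviation as a crude proxy for the curvature-at-the-mode scale; the experiment sizes are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noise with rare outliers: Student-t with 3 degrees of freedom.
# Its true variance is df / (df - 2) = 3, but its fourth moment is
# infinite, so the sample variance never settles down.
df = 3
n_points = 1000      # size of one hypothetical "experiment"
n_experiments = 500  # how many times we repeat it

# Sample variance from each experiment: dominated by the rare outliers.
sample_vars = np.array([
    rng.standard_t(df, size=n_points).var()
    for _ in range(n_experiments)
])

# A curvature-at-the-mode style scale (median absolute deviation):
# insensitive to the tails, hence far more repeatable.
sample_mads = np.array([
    np.median(np.abs(rng.standard_t(df, size=n_points)))
    for _ in range(n_experiments)
])

print("variance estimates:     spread / mean =",
      sample_vars.std() / sample_vars.mean())
print("robust scale estimates: spread / mean =",
      sample_mads.std() / sample_mads.mean())
```

The relative scatter of the variance estimates across repeated experiments is many times larger than that of the robust scale estimates, which is the sense in which "no finite experiment is likely to provide you a good estimate" of the variance.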

Whew! I had to get that off my chest.

1 comment:

  1. Hi David

    You should have a look at the Wallis derivation of maximum-entropy distributions in Jaynes's book "Probability Theory: The Logic of Science". It gives a combinatorial argument as to why you should maximise entropy given the constraints you actually know.

    Furthermore, that sneaky trick of marginalising over all possible variances for the Gaussian (resulting in the Student t-distribution) seems defensible.

    This person has a great blog on this maximum entropy trick.

    I unfortunately don't know their credentials and their derivations are a bit sloppy, but I really like the trick :) Use max-ent to get the form and marginalise over the unknown parameters.

    Here is some more of their stuff on oil discovery, people dispersion, species abundance, etc.


    The marginalised max-ent trick seems to work all over the show.

    Lastly, Ariel Caticha has some awesome stuff on entropy and has a really great derivation from some common-sense axioms which generalises entropy to the continuous domain.


    Check out the paper "Lectures on Probability, Entropy, and Statistical Physics", it's very good :)

    I've become enamoured with probability and entropy in my work. I always enjoy hearing and reading other people's perspectives on the matter.

    Have fun
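The marginalisation trick the commenter mentions can be checked numerically. Here is a minimal sketch, assuming a zero-mean Gaussian likelihood with an inverse-gamma prior on the variance (the shape and scale values below are arbitrary illustrations); the standard conjugacy result says the marginal should be a Student t with 2a degrees of freedom and scale sqrt(b/a):

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Hypothetical prior choice: variance v ~ InvGamma(shape=a, scale=b).
a, b = 2.0, 2.0

def marginal_pdf(x):
    """Integrate N(x | 0, v) * InvGamma(v | a, b) over the variance v."""
    integrand = lambda v: (stats.norm.pdf(x, scale=np.sqrt(v))
                           * stats.invgamma.pdf(v, a, scale=b))
    val, _ = quad(integrand, 0.0, np.inf)
    return val

xs = np.array([0.0, 0.5, 1.0, 2.0, 5.0])
numeric = np.array([marginal_pdf(x) for x in xs])

# Analytic marginal: Student t with 2a dof and scale sqrt(b/a).
analytic = stats.t.pdf(xs, df=2 * a, scale=np.sqrt(b / a))

err = np.max(np.abs(numeric - analytic))
print("max |numeric - analytic| =", err)
```

The numerical integral and the analytic Student-t density agree to quadrature precision, which is the sense in which a Gaussian with an honestly uncertain variance is really a heavy-tailed distribution.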