2011-11-10

pair-coding

While Richards worked on faintifying bright Mira-variable light curves and censoring them in the manner of an insane robot (or astronomical imaging pipeline), Long and I worked on Python-ifying and numpy-ifying some slow marginalized-likelihood code. The issue is that our likelihood model has two nuisance parameters per data point (the true uncertainty variance and the true censoring flux value, both considered poorly known and different for every datum), which we want to marginalize out inside the repeatedly called likelihood function. There are lots of ways to do this slowly and few ways to do it fast. The goal is to have the skeleton of a paper by tomorrow afternoon!
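What follows is not the code we wrote, just a minimal numpy sketch of the general trick for one of the two nuisance parameters (a per-datum true variance) marginalized on a grid; every name, the constant-mean model, and the flat grid prior are invented for illustration. The slow way loops over data points and integrates each one separately; the fast way evaluates the whole (N, M) data-by-grid array at once and does a log-sum-exp along the grid axis.

import numpy as np

def marginalized_lnlike(mu, y, V_grid, lnprior_V):
    # Hedged sketch, not the paper's model: log-likelihood of a constant
    # mean flux mu for N data points y, marginalizing a per-datum true
    # variance V over an M-point grid V_grid.  lnprior_V is an (N, M)
    # array of log prior weights, assumed normalized along axis 1 so the
    # grid spacing is already folded in.
    resid2 = (y[:, None] - mu) ** 2                          # (N, 1)
    lnlike = (-0.5 * resid2 / V_grid[None, :]
              - 0.5 * np.log(2.0 * np.pi * V_grid[None, :]))  # (N, M)
    lnpost = lnlike + lnprior_V
    # marginalize V out per datum with a stable log-sum-exp over the grid
    a = lnpost.max(axis=1, keepdims=True)
    ln_marg = a[:, 0] + np.log(np.exp(lnpost - a).sum(axis=1))  # (N,)
    return ln_marg.sum()

# toy usage: flat, normalized prior on the variance grid for every datum
rng = np.random.default_rng(17)
N, M = 1000, 64
y = rng.normal(10.0, 1.0, size=N)
V_grid = np.linspace(0.25, 4.0, M)
lnprior_V = np.full((N, M), -np.log(M))
print(marginalized_lnlike(10.0, y, V_grid, lnprior_V))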