unconverged chains

I spent the day talking to anyone who would listen about unconverged MCMC (or equivalent) chains. The issue is a big one, and I have many thoughts, all unorganized. But basically, the point is that when likelihood calls take a long time (think weeks), there is no way in hell we will ever have a converged, dense sampling of the posterior probability distribution for any parameter space, let alone a large one. At the IPMU meeting last week, most practitioners thought it was impossible to work without a converged chain, but I noted that we never have a converged chain in the larger space of all possible models; whenever we have a converged chain, it is only over some extremely limited and constrained subspace (for example, the 11-dimensional space of CDM or the like, which is a tiny subspace of all possible cosmological model spaces). The fact that we don't have a converged chain over all conceivable models and parameters does not prevent us from doing science. This has connections to the multi-armed bandit problem.

I have also been thinking about Rob Fergus's 80 million tiny images project, which treats the results of a huge set of Google (tm) searches as a sampling of the space of all natural images. Of course this is not a converged or dense sampling! But nonetheless science (and engineering) can be done, very successfully.
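As a concrete illustration of what "converged" means in practice (this sketch is my addition, not from the post): the standard Gelman–Rubin R-hat diagnostic compares between-chain and within-chain variance, and it is exactly the kind of check that passes easily on a small, well-behaved parameter subspace while being hopeless to satisfy over any large model space with week-long likelihood calls.

```python
import numpy as np

def gelman_rubin(chains):
    """R-hat for a set of equal-length 1-D chains, shape (m, n)."""
    chains = np.asarray(chains)
    m, n = chains.shape
    means = chains.mean(axis=1)
    B = n * means.var(ddof=1)                # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()    # within-chain variance
    var_hat = (n - 1) / n * W + B / n        # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
# Four well-mixed chains sampling the same Gaussian: R-hat near 1.
good = rng.normal(size=(4, 1000))
# Four chains stuck near four different modes: R-hat far above 1.
stuck = rng.normal(loc=[[0.0], [5.0], [10.0], [15.0]],
                   scale=0.1, size=(4, 1000))
print(gelman_rubin(good))   # close to 1: "converged"
print(gelman_rubin(stuck))  # large: unconverged
```

A value near 1 is the usual pass criterion; the point above is that such a pass is only ever achieved inside a constrained subspace of models, never over the space of all models one might conceivably fit.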
