I had a great conversation with Markus Pössel (MPIA) and Johannes Fröschle (MPIA) about Fröschle's work reanalyzing data on the expansion of the Universe. He is looking at when the expansion of the Universe was clearly discovered, and (subsequently) when the acceleration was clearly discovered. His approach is to reanalyze historical data sets under clear, simple hypotheses and perform Bayesian evidence comparisons. He finds that even in 1924 the data clearly and firmly established the expansion of the Universe, and by such a large factor in the evidence that it could probably have been established even earlier.
This conversation got me thinking about a more general question, which is simple: Imagine you have measured a set of galaxy redshifts but know nothing about distances. How much data do you need to infer that the Universe is expanding? The two hypotheses are: (1) galaxies have random velocities around a well-defined rest frame, with respect to which we are moving; and (2) the same, but with expansion on top. You know neither any distances nor any expansion parameter. Go! I bet that once you have good sky coverage, you are done, even without any distance information at all.
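Something like the following toy sketch captures why I think so; the numbers are made up, and a BIC stands in for a proper evidence integral. The point is that our motion contributes only a dipole to the observed redshifts, while expansion adds a positive monopole that no dipole can absorb, so with good sky coverage the two hypotheses separate even with no distances.

```python
import numpy as np

# Toy sketch with made-up numbers: our motion through the galaxy rest frame
# produces only a dipole in cz over the sky, while expansion adds a positive
# monopole. A crude model comparison (BIC as an evidence proxy) then prefers
# the expanding model once the sky coverage is good.
rng = np.random.default_rng(17)
N = 50                                               # number of galaxies (made up)
nhat = rng.normal(size=(N, 3))
nhat /= np.linalg.norm(nhat, axis=1, keepdims=True)  # random sky positions
v_sun = np.array([300.0, 0.0, 0.0])                  # our velocity in km/s (assumed)
monopole = 600.0                                     # mean expansion redshift in km/s (assumed)
sigma = 250.0                                        # peculiar-velocity scatter in km/s (assumed)
cz = monopole + nhat @ v_sun + rng.normal(0.0, sigma, size=N)

def bic(design):
    """Least-squares fit of cz on the design matrix; return the BIC."""
    coeffs, *_ = np.linalg.lstsq(design, cz, rcond=None)
    rss = np.sum((cz - design @ coeffs) ** 2)
    return N * np.log(rss / N) + design.shape[1] * np.log(N)

no_expansion = nhat                                  # dipole only
expansion = np.hstack([np.ones((N, 1)), nhat])       # monopole plus dipole
print("delta BIC (no expansion minus expansion):",
      bic(no_expansion) - bic(expansion))            # large and positive: expansion wins
```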
In the afternoon, Melissa Ness and I worked on fiber-number and LSF (line-spread-function) issues in the APOGEE data. There are clear trends of abundance measurements with fiber number, presumably mainly because of the variation in spectrograph resolution across fibers. We worked on testing methods to remove them, which involve correcting the training set going into The Cannon and also giving The Cannon the information it needs to simultaneously fit the relevant (nuisance) trends.
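A minimal sketch of the first of those ideas, with fake labels and an assumed low-order polynomial standing in for whatever the real fiber-number dependence is (this is not the actual APOGEE or Cannon code), looks something like this:

```python
import numpy as np

# Fake training-set labels with an assumed fiber-number trend; the real
# dependence and its amplitude in APOGEE are things we are still measuring.
rng = np.random.default_rng(0)
nstars = 1000
fiber = rng.integers(1, 301, size=nstars)            # APOGEE has 300 fibers
feh = -0.1 + 2.0e-4 * (fiber - 150) + rng.normal(0.0, 0.05, size=nstars)

# Fit the abundance-vs-fiber-number trend with a low-order polynomial ...
coeffs = np.polyfit(fiber, feh, deg=2)
trend = np.polyval(coeffs, fiber)

# ... and remove it, anchored at a reference fiber, before the labels go into
# The Cannon. (The alternative is to hand fiber number to The Cannon as an
# extra, nuisance label and let it fit the trend simultaneously.)
feh_corrected = feh - (trend - np.polyval(coeffs, 150))
print("peak-to-peak trend removed (dex):", np.ptp(trend))
```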
At the end of the day, I gave my talk at MPIA on probabilistic graphical models.
Those interested in the history of the expanding universe should check out this book.
References please! I do have Lemaître's 1927 article in French.