At lunch-time today, Megan Bedell (Flatiron) and Ray Pierrehumbert (Oxford) gave talks at Flatiron. In her talk, Bedell nicely laid out the large number of results we have on extreme-precision radial-velocity measurement; we need to start writing papers asap! She even gave a very simple, new description of what we found a couple of years ago about the fidelity of the HARPS wavelength calibration. So we need to write that up too.
Pierrehumbert showed fluid-dynamics results on the atmospheres of tidally locked planets (which are interesting because they sustain huge temperature gradients around their surfaces). He has some cases in which he can't find any steady-state solution for the atmosphere; the resulting time dependence might have observable consequences.
Late in the day, I gave a presentation to the AAAC, which oversees astrophysics and inter-agency cooperation in astronomy across NSF, NASA, and DOE. I was asked to speak about the future of data sharing, data re-use, and joint analyses. I drew inspiration from cosmology and went into two of my standard sets of talking points. The first is that we need to be thinking about likelihood functions and how to share them: data sets are combined through their (possibly partially marginalized) likelihood functions. The second is that when data get sophisticated or complex, there is no point in releasing them without also releasing the code that made sense of them in real scientific projects. That is, code and data releases can't really be seen as separate things, and we might not be able to have a data release without also having a code release (with appropriate licensing for repurposing and re-use). My slides were incomplete, but I put them up here anyway.
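To make that first talking point concrete, here is a minimal sketch in symbols of my own choosing (not from the slides): if two surveys deliver data $D_1$ and $D_2$ that are conditionally independent given shared parameters $\theta$, then

$$
p(\theta \mid D_1, D_2) \propto p(\theta)\,\mathcal{L}_1(\theta)\,\mathcal{L}_2(\theta),
\qquad
\mathcal{L}_i(\theta) \equiv \int p(D_i \mid \theta, \alpha_i)\, p(\alpha_i)\,\mathrm{d}\alpha_i ,
$$

where each $\alpha_i$ is a set of survey-specific nuisance parameters that has been marginalized out. The point is that a downstream user who wants to do a joint analysis only needs the (partially marginalized) likelihood functions $\mathcal{L}_i(\theta)$, not the raw pixels.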