2018-06-19

#wetton18, day 1

Today was the first day of the Wetton Workshop at Oxford. There were many interesting talks from all over the map, but with a shared goal of understanding how we make sure that we stay open to unexpected discoveries, even as we build more and more targeted data sets and experiments. One theme that emerged is that of systematics: As you push data harder and harder, in cosmology or exoplanet search or anything else, you become more and more sensitive to the details of your hardware, electronics, selection function, and so on. This led to a discussion of end-to-end simulation of data sets, both to understand how hardware issues enter the data and to test whether we truly understand the hardware.

That's important! But I think there is an equally important aspect to this: If we don't take our data with sufficient heterogeneity, we can't learn certain things. For example, if you take all LSST exposures at 15 seconds, you never test the shutter, never test the linearity of the detector, never find out on what time scales the point-spread function (PSF) is changing, and so on. For another example, if you take all the Euclid imaging survey on a regular grid, you never get cross-calibration information from one part of the detector to another, nor can you find certain kinds of anisotropies in the detector or the PSF. If we are going to saturate the information-theoretic bounds on what our surveys can measure, we are going to need to take science data in many, many configurations.
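To make that identifiability argument concrete, here is a toy sketch (mine, not from any talk at the workshop; the quadratic-nonlinearity model and the names F and beta are made up for illustration). It compares the Fisher information for jointly fitting a source flux and a detector nonlinearity coefficient under a homogeneous exposure-time strategy versus a heterogeneous one:

```python
import numpy as np

# Toy model: a quadratically nonlinear detector observing a constant source.
#   counts(t) = F * t + beta * (F * t)**2
# F    = source flux (counts per second), unknown
# beta = detector nonlinearity coefficient, unknown
# These names and this model are illustrative, not from the post.

def jacobian(ts, F, beta):
    """Derivatives of the model counts with respect to (F, beta)."""
    dF = ts + 2.0 * beta * F * ts**2
    dbeta = (F * ts)**2
    return np.stack([dF, dbeta], axis=1)

F_true, beta_true = 100.0, 1e-6

# Homogeneous strategy: every exposure is 15 seconds.
t_homog = np.full(32, 15.0)
# Heterogeneous strategy: exposure times spread over a factor of 30.
t_heter = np.geomspace(1.0, 30.0, 32)

for name, ts in [("homogeneous", t_homog), ("heterogeneous", t_heter)]:
    J = jacobian(ts, F_true, beta_true)
    fisher = J.T @ J  # (unit-noise) Fisher information matrix
    # A huge condition number means (F, beta) are degenerate:
    # the data cannot separate flux from nonlinearity.
    print(f"{name}: Fisher-matrix condition number = {np.linalg.cond(fisher):.3e}")
```

With every exposure at the same time, the rows of the Jacobian are all identical, so the Fisher matrix is singular and the condition number blows up: the data carry no information that separates flux from nonlinearity. Spreading the exposure times breaks the degeneracy.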

Here are the slides from the public talk I gave at the end of the day. Note my digs at press-release artists' conceptions. I think we should be honest about what we do and don't know!
