2006-08-03

weak lensing

On the third day of the Oort meeting here in Leiden, Gary Bernstein (Penn) told us how to analyze future data sets for weak lensing. It was nice to see some of the nitty-gritty, and in fact Bernstein and collaborators are working on some ideas that I love: for very high-accuracy data analysis, especially when instrumental effects are large and signal-to-noise is low, the best approach is to explicitly model the data (rather than measuring things in it); in this approach your measurements are just parameters of a model that fits the data with reduced chi-squared near unity.
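
To make that philosophy concrete, here is a minimal sketch (not Bernstein's actual pipeline, and with every name, functional form, and parameter value being an illustrative assumption): the galaxy ellipticity is obtained not by measuring moments of the image but by forward-modelling the pixels with a PSF-convolved galaxy model, so the "measurement" is literally one of the fit parameters, and the reduced chi-squared tells you whether the model is adequate.

```python
# Illustrative sketch of measurement-as-model-parameter for a weak-lensing
# postage stamp.  Gaussian galaxy and Gaussian PSF are assumed purely for
# convenience (their convolution is analytic); nothing here is the method
# actually described in the talk.
import numpy as np
from scipy.optimize import least_squares

def gaussian_image(xx, yy, flux, x0, y0, sigma, e1, e2, psf_sigma):
    """PSF-convolved elliptical Gaussian evaluated on a pixel grid."""
    cov = sigma**2 * np.array([[1.0 + e1, e2], [e2, 1.0 - e1]])
    cov += psf_sigma**2 * np.eye(2)      # convolution with a circular Gaussian PSF
    icov = np.linalg.inv(cov)
    dx, dy = xx - x0, yy - y0
    r2 = icov[0, 0] * dx**2 + 2 * icov[0, 1] * dx * dy + icov[1, 1] * dy**2
    return flux * np.exp(-0.5 * r2) / (2 * np.pi * np.sqrt(np.linalg.det(cov)))

# Fake noisy postage stamp standing in for the data.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:32, 0:32].astype(float)
truth = dict(flux=1000.0, x0=15.7, y0=16.3, sigma=2.5, e1=0.05, e2=-0.02)
noise_sigma = 0.5
data = gaussian_image(xx, yy, psf_sigma=1.8, **truth)
data += rng.normal(0.0, noise_sigma, size=data.shape)

# Fit the same forward model; the ellipticity (e1, e2) is just another parameter.
def residuals(p):
    flux, x0, y0, sigma, e1, e2 = p
    model = gaussian_image(xx, yy, flux, x0, y0, sigma, e1, e2, psf_sigma=1.8)
    return ((data - model) / noise_sigma).ravel()

fit = least_squares(residuals, x0=[data.sum(), 16.0, 16.0, 2.0, 0.0, 0.0])
dof = data.size - 6
print("best-fit (e1, e2):", fit.x[4], fit.x[5])
print("reduced chi^2:", np.sum(fit.fun**2) / dof)  # near unity if the model is adequate
```

The point of the exercise is that the PSF and the noise enter the model, not the data: nothing is "corrected" or "deconvolved" before the measurement is made.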

Bhuvnesh Jain (Penn) followed Bernstein with a discussion of the limiting systematics for future weak-lensing surveys. He identified knowledge of the source redshift distribution as a significant issue; this might be an independent motivation for the PRIMUS project. Interestingly, Jain was very confident that atmospheric distortions of the point-spread function would not be a problem (I didn't agree).
