Today I gave a colloquium at the University of Cambridge. My slides are here. I spoke about how to make precise measurements, how to design surveys, and how to exploit structure in noise. It's a rich set of things, and most of the writing about information theory in astronomy lives in the cosmology domain. Time to change that, maybe? It is also the case that the best book ever written about information and inference came out of Cambridge! So I was bringing coals to Newcastle, ish!
I'm giving two talks this week, one at #AAS238 and one at the University of Cambridge. Because I am a masochist (?) I put in titles and abstracts for both talks that are totally unlike those for any talks I have given previously. So I have to make slides entirely from scratch! I spent every bit of today that I wasn't in meetings working on slides. I'm not at all ready!
One of my PhD advisors—my official advisor—was Roger Blandford (now at Stanford). Blandford, being old-school, responded to a tweet thread I started by sending me email. I am trying to move over to always describing tensors and rotation operators and Lorentz transformations and the like in terms of unit vectors, and I realized that the most enlightened community along these lines is the quantum-mechanics community. Probably because they often work in infinite-dimensional spaces! Anyway, there are deep connections between vectors in a space and functions in a Hilbert space. I'm still learning; I think I will never fully get it.
Adrian Price-Whelan and I discussed today some oddities that Matt Daunt (NYU) is finding while trying to measure radial velocities in extremely noisy, fast APOGEE sub-exposures. He finds that the objective function we are using is not obviously smooth on 10-ish km/s velocity scales. Why not? We don't know. But what we do know is that a spectrograph with resolution R = 22,500 cannot put sharp structures into a likelihood function on velocity scales smaller than about 13 km/s, which is the speed of light divided by the resolution, c/R.
There's a nice paradox here, in fact: The spectrograph can't see features on scales smaller than 13 km/s, and yet we can reliably measure radial velocities much better than this! How? The informal answer is that the radial-velocity precision is 13 km/s divided by the relevant signal-to-noise ratio. The formal answer involves information theory—the Fisher information, to be precise.
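The Fisher-information answer can be sketched numerically: build a toy continuum-normalized spectrum with a single Gaussian absorption line at the instrumental resolution, compute the Fisher information for the velocity parameter, and take the Cramér–Rao bound. All the specifics here (line depth, per-pixel signal-to-noise, pixel grid) are illustrative assumptions of mine, not APOGEE numbers:

```python
import numpy as np

c = 299_792.458  # speed of light, km/s
R = 22_500       # APOGEE-like spectral resolution; c / R ~ 13.3 km/s

# Toy setup (assumed, illustrative): one Gaussian line at the
# instrumental resolution, on a fine pixel grid in velocity units.
v_grid = np.linspace(-100.0, 100.0, 2001)  # km/s; 0.1 km/s pixels
sigma_line = (c / R) / 2.355               # FWHM of c/R -> Gaussian sigma
depth = 0.3                                # fractional line depth (assumed)
snr = 20.0                                 # per-pixel signal-to-noise (assumed)
noise = 1.0 / snr                          # noise on a unit continuum

def model(v):
    """Continuum-normalized spectrum, line shifted to velocity v."""
    return 1.0 - depth * np.exp(-0.5 * ((v_grid - v) / sigma_line) ** 2)

# Fisher information for v: sum over pixels of (dm/dv)^2 / noise^2,
# with the derivative taken by central differences.
dv = 1e-3
dm_dv = (model(dv) - model(-dv)) / (2 * dv)
fisher = np.sum(dm_dv ** 2) / noise ** 2

# Cramér-Rao bound: best achievable velocity precision.
sigma_v = 1.0 / np.sqrt(fisher)
print(f"resolution element: {c / R:.1f} km/s")
print(f"Cramér-Rao bound:   {sigma_v:.2f} km/s")  # far below c / R
```

The point of the sketch is that the bound comes out far smaller than the 13 km/s resolution element: the precision scales like the resolution element divided by (roughly) the signal-to-noise accumulated across the line.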
I had the great honor to be on the PhD committee of Lily Zhao (Yale), who defended her dissertation today. It was great and remarkable. She has worked on hardware, calibration, software, stellar astrophysics, and planets. Her seminar was wide-ranging, and the questions she fielded were legion in number and scope. She has already had a big impact on extreme precision radial-velocity projects, and she is poised to have even more impact in the future. One of the underlying ideas of her work is that EPRV projects are integrated hardware–software systems. This idea should inform everything we do, going forward. I asked a million technical questions, but I also asked questions about the search for life, and about the astronomical community's management and interoperation of its large supply of diverse spectrographs. In typical Zhao fashion, she had interesting things to say about all of these.
Soledad Villar (JHU) and I further discussed the problem of orthogonalizing vectors—or finding orthonormal basis vectors that span a subspace—in special (and general) relativity. She proposed a set of hacks that fix the generalization of Gram–Schmidt orthogonalization I proposed a week or so ago. It's complicated: although the straightforward generalization of GS works with probability one, you can construct cases in which it borks completely. The problem is that the method involves division by an inner product, and when a vector becomes light-like, that inner product vanishes.
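The failure mode is easy to demonstrate. Here is a minimal sketch (my own naive implementation, not Villar's fix), assuming the (−,+,+,+) Minkowski metric and normalizing each vector by the square root of the absolute value of its self-inner-product:

```python
import numpy as np

# Minkowski metric with signature (-,+,+,+) (sign convention assumed)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

def minkowski_dot(u, v):
    """Minkowski inner product <u, v> = u^T eta v."""
    return u @ eta @ v

def gram_schmidt(vectors, eps=1e-12):
    """Naive Gram-Schmidt with the Minkowski inner product.

    Subtracts projections onto earlier basis vectors, then normalizes
    by sqrt(|<w, w>|). Raises when a vector is (numerically) light-like,
    because the projection step divides by <b, b>.
    """
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        for b in basis:
            # Projection coefficient <w, b> / <b, b>; the sign of
            # <b, b> (timelike vs spacelike) is handled by the division.
            w = w - (minkowski_dot(w, b) / minkowski_dot(b, b)) * b
        norm2 = minkowski_dot(w, w)
        if abs(norm2) < eps:
            raise ZeroDivisionError("light-like vector: <w, w> = 0")
        basis.append(w / np.sqrt(abs(norm2)))
    return basis

# Generic input (one timelike, one spacelike vector): works fine.
ok = gram_schmidt([np.array([2.0, 0.5, 0.0, 0.0]),
                   np.array([0.3, 1.0, 0.2, 0.0])])

# A light-like input borks the method: <v, v> = -1 + 1 = 0.
try:
    gram_schmidt([np.array([1.0, 1.0, 0.0, 0.0])])
except ZeroDivisionError as e:
    print("failed as expected:", e)
```

Generic inputs go through because a randomly chosen vector is light-like with probability zero; the constructed null vector hits the vanishing inner product exactly, which is the measure-zero case the hacks have to handle.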