simple, explainable algorithms

I spent the day in Princeton, visiting both the Department of Astrophysical Sciences and the Center for Statistics and Machine Learning. I had many great conversations, and I gave a talk about data-driven methods and their relationship to what we think of as machine learning.

One highlight was a conversation with Ryan Adams (Princeton), who has brought some serious probabilistic methods to astronomy but is himself a computer scientist and statistician. He and I discussed the issues of algorithmic real-time, adaptive target selection for astronomical projects (especially EPRV-like projects). There is the full Bayesian decision thing, which I know how to do but which is expensive. But there is the idea that the decisions should be simple and explainable. He pointed out that this is a huge area of research right now, and it connects to many things, especially in ethical situations: We want simple, explainable decisions! That's an interesting idea to bring into astrophysics.

There were many other great conversations, ranging across polarimetry of exoplanets, starshades and coronagraphs, neuroscience, stellar surface mapping, photometric redshifts, and astronomical catalog making.


phase-space volume; Oort dynamics

The research highlights of the day were a call with Matt Buckley (Rutgers) and a Physics Colloquium by Scott Tremaine (IAS). In the former, we discussed the design of a first paper about Buckley's work on measuring phase-space volumes of bound and disrupting dynamical objects in the Milky Way halo. He has some great results! But we don't understand the sensitivities to noise yet, or the in-practice issues of making robust measurements. And I mean “robust” here in the statistical inference sense.

In the latter, Tremaine answered most of the questions we formulated a few weeks ago about the origin and properties of the Oort cloud. My loyal reader may know that I am suspicious about many of the things that are said about the Oort cloud, but Tremaine showed numerical results that seem to back up most of the lore. He then switched to talking about the interstellar asteroid 'Oumuamua. Aside from the usual loose talk of aliens, Tremaine said something remarkable: The pre-Solar-System velocity vector of the object is very close to the current consensus on the Local Standard of Rest (something else of which I doubt the existence). Tremaine noted that it might conceivably represent an amazingly accurate measurement of the LSR! Too early to tell yet.


actions or orbit labels?

In our semi-regular Dynamics Meeting at Flatiron, Gus Beane (Flatiron) showed some simulations in a realistic potential that suggest that the actions we compute for the Milky Way might not be even close to being invariant. That caused a big fight to break out! Some were arguing that the actions shouldn't be called actions but rather “orbit labels”. Some were arguing that the effects he is seeing are exactly what you expect. Some argued that the variations he was seeing are way too large and there must be a bug! And some were arguing that it might be problems as fundamental as the reference frame: If you have the axes or origin slightly wrong, you compute everything wrong! But all these things are possibly happening in our analyses of the Milky Way, so Beane put his finger on a very important issue for everything that is going on now in Milky Way dynamics with Gaia. As my loyal reader knows, I don't like action space: There aren't necessarily actions at all, and anything you compute using them might be extremely misleading, especially if you implicitly assume that they are invariant, or close.

In another conversation, Katie Breivik (CITA), Adrian Price-Whelan (Princeton), and I discussed a possible Decadal Survey science white paper about binary stars, population synthesis, and interdisciplinarity within astrophysics. That's a good project, but it requires a group effort from a large community; how to organize that?


writing and listening

As per usual, Tuesdays are low-research days. But I did get in some time with Bonaca (Harvard) about her next projects on the Milky Way and tidal streams. And some time with Leistedt (NYU) about our possible pedagogical paper on hierarchical inference and models. We got through half of a paper outline.

The Astro Seminar was Jo Dunkley (Princeton) talking about the Simons Observatory. It is a beautiful project and many things will come of it. But it is not obvious to me that it will be possible to see the gravitational radiation from inflation. Obviously we should look, though!


no short-cuts for planet searching?

I came in early to do some writing with Megan Bedell (Flatiron). We are so close to having her paper on the wobble software for data-driven modeling of HARPS spectra done! The results are just incredible and amazing. Can't wait. I still have to-do items there.

Late in the day I bounced off of Dan Foreman-Mackey (Flatiron) the idea that Price-Whelan (Princeton) and I had about making an effective noise model that would permit us to search for medium-period planets in radial-velocity data without explicitly modeling and marginalizing out the short-period planets. He simultaneously thought the idea was wrong (duh) and that it is an important thing to be thinking about: How do we make decisions about promising stellar targets for RV observations without keeping fully updated posterior or likelihood information about their full multi-planet planetary systems?


bits of non-traditional scientific writing

I spent my research time today working on various long-term writing projects. I wrote a summary of all the things we could or should be doing to improve extreme-precision radial-velocity measurements from the software side. It is a long list! I sent it to various friendlies for comment; I am trying to prioritize my work in this area.

I also edited some documents in which I am brain-storming ideas about 2020 Decadal-Survey white papers. There are science white papers due in January and more project-specific ones due later in the year. On science, I am kicking the tires on something about binary star populations, since they cut across almost all areas of astrophysics. On projects, I am thinking about things involving the (otherwise moth-balled) LSST hardware and also EPRV spectrographs.

And I worked out a strategy for making very challenging (adversarial, almost) time-variable spectroscopy data, and a strategy for beating it in a data analysis. This is a great unsolved problem in astronomical data analysis: Precision radial-velocity measurement in the face of spectral variability!


information, interpretability, emulation, isocurvature

I started my day with a call to Adrian Price-Whelan (Princeton) to discuss my ideas around making decisions for observing using the EFDIG, which is my new acronym for expected future-discounted information gain. I want to make a real-time decision-making system based on this, but I don't want to spend tons of compute, for pragmatic reasons about comprehensibility and model-ability. I might be in trouble. In the middle of the call we had a funny idea about effectively marginalizing out short-period planets in a search for year-ish-period planets.
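For concreteness, here is what an EFDIG score might look like as code. Everything in this sketch (the function names, the discount factor, the forecast numbers, the greedy rule) is hypothetical illustration on my part, not any real scheduler:

```python
import numpy as np

# Hypothetical sketch of an expected future-discounted information gain
# (EFDIG) scorer. Each target has a forecast of per-visit information
# gains; we discount future gains by gamma and greedily pick the best.

def efdig(info_gains, gamma=0.9):
    """Discounted sum of forecast per-visit information gains."""
    info_gains = np.asarray(info_gains, dtype=float)
    discounts = gamma ** np.arange(len(info_gains))
    return np.sum(discounts * info_gains)

def choose_target(forecasts, gamma=0.9):
    """Greedily pick the target with the largest EFDIG."""
    scores = {name: efdig(g, gamma) for name, g in forecasts.items()}
    return max(scores, key=scores.get), scores

# Made-up forecasts for two made-up targets:
forecasts = {
    "target A": [1.0, 0.8, 0.6],   # large but diminishing returns
    "target B": [0.5, 0.5, 0.5],   # steady gains
}
best, scores = choose_target(forecasts)
```

The comprehensibility win is that the whole decision rule is a dozen lines; the hard (and expensive) part, hidden here inside the made-up forecast numbers, is computing the expected information gains themselves.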

At lunch I discussed machine learning with Gabriella Contardo (Flatiron) and had a couple of duh moments: She pointed out that if you have a function or computation or simulation that can quickly go one way (from input to output) but cannot, or cannot quickly, go the other way (from output to input), then you have an ideal case for machine learning. Just generate data and train a model to go the other way. Duh! Machine learning to invert functions!
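Contardo's point, as a toy of my own making: the forward map below is cheap to evaluate but annoying to invert analytically, so we generate training pairs with it and learn the inverse. The "learner" here is just 1-nearest-neighbor, the simplest possible stand-in for a trained model:

```python
import numpy as np

# "Machine learning to invert functions": the forward map is cheap, so
# generate (input, output) pairs and fit a regressor going the other way.

def forward(x):
    return x**3 + x          # cheap forward; no simple closed-form inverse

rng = np.random.default_rng(42)
x_train = rng.uniform(-2.0, 2.0, size=5000)
y_train = forward(x_train)   # training data comes free from the forward map

def inverse_model(y):
    """Predict the input that produced y (1-nearest-neighbor lookup)."""
    return x_train[np.argmin(np.abs(y_train - y))]

x_hat = inverse_model(5.0)   # true answer solves x^3 + x = 5, x ~ 1.516
```

Swap the nearest-neighbor lookup for any regression model you like; the scheme is the same.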

She pointed out that if you are trying to model a function that isn't one-to-one or many-to-one but rather many-to-many or one-to-many, in some sense, then vanilla machine-learning approaches won't be good: Once trained, they are deterministic and single-valued. That was another duh for me. And yet I had never had these points put to me so clearly and sensibly.
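Her second point in a toy (again mine, for illustration): try to regress x on y where y = x squared, which is two-to-one. A deterministic, single-valued model, here plain kernel averaging, splits the difference between the two valid answers and confidently predicts roughly zero:

```python
import numpy as np

# The one-to-many failure mode: learn x from y = x^2. For any positive y
# there are two answers (+sqrt(y) and -sqrt(y)); a single-valued
# regressor averages them and predicts something near 0 instead.

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=20000)
y = x**2

def predict_x(y_query, width=0.01):
    """Vanilla kernel regression: mean of training x with y near y_query."""
    mask = np.abs(y - y_query) < width
    return np.mean(x[mask])

x_hat = predict_x(0.25)   # true answers are +0.5 and -0.5; model says ~0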

Our conversation ventured into interpretability-land. I am all for generalizability—that's my jam—but recently I have been giving up on interpretability. Contardo isn't: Her feeling is that if she does her current project right (her project is to determine which light curves of stars in Kepler are in fact light curves of unresolved binaries), the features she obtains for light curves will be interpretable. Interesting!

Somewhere in the day I also had a nice chat with Stephen Feeney (Flatiron) about isocurvature perturbations in the initial conditions of the Universe and what they might do to Hubble Constant estimation. It looks like they might cause trouble! I wondered aloud about maximally adversarial isocurvature contributions.


hierarchical models

Tuesdays are low-research days, because teaching. But Boris Leistedt (NYU) made a proposal a few weeks ago that we write a pedagogical document on hierarchical modeling. He is thinking: in the style of our Data Analysis Recipes contributions. So we spent our meeting today brainstorming the content for the note. There definitely is a contribution for us to make. The key is to have some toy problems that map onto things astronomers care about, but that also illustrate the relevant methods and issues, and are comprehensible to others in the natural sciences. We discussed fitting a line, fitting a mixture of lines, mixture-of-Gaussian models, and foreground-background models. I also like calibration models that have good causal structure.
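One candidate toy (stated here as a sketch with made-up numbers, not decided content for the note) is Gaussian shrinkage, which is the smallest possible hierarchical model: each object has a true value drawn from a population, we see only a noisy measurement, and conditional on the population parameters the posterior mean interpolates between measurement and population mean:

```python
import numpy as np

# Minimal hierarchical-inference toy: true values drawn from a population
# N(mu, tau^2), observed with Gaussian noise sigma. Conditional on
# (mu, tau), the posterior mean for each object shrinks the measurement
# toward the population mean, with weights set by the variance ratio.

def shrink(y, sigma, mu, tau):
    """Posterior mean of the true value given measurement y (Gaussian case)."""
    w = tau**2 / (tau**2 + sigma**2)   # weight on the data
    return w * y + (1.0 - w) * mu      # remaining weight on the population

# A noisy measurement (sigma=2) of an object from a tight population
# (mu=0, tau=1) gets pulled strongly toward the population mean:
post = shrink(5.0, sigma=2.0, mu=0.0, tau=1.0)   # -> 1.0
```

The pedagogical payoff is that every piece (the weight, the two limits of very good and very bad data) has a plain-language interpretation.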

I’m not sure if it counts as research, but I was involved today also in contract discussions for the Terra Hunting Experiment with HARPS3. We are trying to structure the Flatiron and Princeton buy-ins so that they help the project but also are executable by our institutions. That’s way above my pay grade but somehow I have to partially navigate it.


the measure problem

Not a high-research day. Conversations with Bedell (Flatiron) about papers we need to finish, and a plan to sprint in one week's time. Also a great Brown-Bag talk by Matt Kleban (NYU) about the many vacuum states available in a string-theory-like model with lots of axions. There will always be lots of volume that has a cosmological constant near our observed value, no matter that it is so damned low. But figuring out what this means for generic observers is hard, both because of the anthropic issues, but also because there is no measure for the spacetime. That's crazy and led to yet another discussion of the measure problem. It always surprises me that this is an unsolved problem.


Gaia and the halo

Today was the Big Apple Colloquium, with Amina Helmi (Groningen) presenting on Gaia results relating to Milky Way dynamics. I don't love the chemical arguments she gives that the observed halo stars must have come from a single progenitor; since we know that nucleosynthesis is pretty low-dimensional, all merging progenitors might lie on very similar chemical tracks. But the data do show great evidence of merging and non-equilibrium dynamics in the disk and halo!



variational inference for dust

As always, Wednesdays are research-filled days. At the beginning of the day, Bedell (Flatiron) reminded me that I have a boat-load of writing to do on our joint projects, and the urgency is high: We will have a submittable paper by next week if all goes well. So I printed out some old, old text to revise.

One of the research highlights of the day was a joint conversation with Anderson (Flatiron), Leistedt (NYU), and David Blei (Columbia), about building a big, self-consistent, probabilistically justifiable, statistically isotropic model of the Milky Way dust. This is a project we have been kicking around for years now, but never really got serious about. My thoughts were sharpened in the last two summers at MPIA by Sara Rezaei Kh (MPIA), with whom I did a bit of Gaussian process work. But that isn't really tractable for large data, and can't really deal with non-trivial likelihoods (like distance uncertainties, and covariant distance and extinction uncertainties). It looks like we might try black-box variational inference on the problem. This won't give exact inferences but it might be able to handle the size and complexity of the data we have.
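As a stand-in for what variational inference buys us, here is a minimal sketch on a toy problem of my own (one latent variable, one Gaussian observation), using reparameterization gradients, one flavor of black-box VI. None of this is the actual dust model, which has millions of stars and a three-dimensional field:

```python
import numpy as np

# Toy variational inference: latent z with prior N(0,1), one observation
# y = 3 with unit noise, so the exact posterior is N(1.5, 0.5). We fit
# q(z) = N(m, s^2) with s held fixed and climb the ELBO with stochastic
# (reparameterization) gradients.

rng = np.random.default_rng(1)
y, s = 3.0, 0.7
m = 0.0
for _ in range(500):
    eps = rng.standard_normal(32)
    z = m + s * eps                # reparameterized samples from q
    grad = np.mean(-z + (y - z))   # d/dm of E_q[log p(y, z)]; entropy is constant in m
    m += 0.05 * grad               # stochastic gradient ascent on the ELBO

# m should now be close to the exact posterior mean, 1.5
```

The inference is approximate and noisy, but nothing in the loop requires conjugacy or a tractable posterior; that is the property that might make the dust problem feasible.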


initial conditions

In the astrophysics seminar today Tristan Smith (Swarthmore) convinced us, perhaps incidentally to his main point, that the Hubble Constant controversy could be resolved if the baryon acoustic feature (or peaks in the CMB) is moved relative to the vanilla CDM prediction. And it doesn't have to be moved far! So maybe there is just a bit of initial-conditions manipulation to make that happen, and then everything is in agreement! Interesting take on things.

My only other significant research for the day was a discussion with Kate Storey-Fisher (NYU) about the outline of our paper about correlation-function estimation, and a chat with Boris Leistedt (NYU) about a possible pedagogical piece on hierarchical modeling and graphical models.


R-M effect; failure

The day started with a great discussion with Luger (Flatiron) and Bedell (Flatiron) about the Rossiter–McLaughlin effect, which is the apparent velocity shift as a planet transits a rotating star. We discussed how this effect really is different from a radial-velocity shift (it is a line-shape change), and how we might model that within an extension to the wobble framework. That's a great idea and possibly an important contribution. The R–M effect has been important in exoplanets.

Late in the day, I experienced complete failure to produce a grant proposal. It was effectively due late last week, so I really had to produce today, but under the gun I failed. That was a hard blow! I love my job, but sometimes I find it to be difficult.



gravitational waves with Gaia; correlation functions

I spent the day today at CITA, which is my childhood home: My first-ever scientific paper was written here (when I was an undergraduate researcher) with Scott Tremaine (now IAS) and Gerry Quinlan. At the CITA weekly grass-roots discussion of matters cosmological, Deyan Mihaylov (Cambridge) spoke about gravitational-wave detection with Gaia. He made an amazing point (which, like most amazing points, is obvious in retrospect): The GW signature in Gaia has an earth term but no “pulsar term”, in the language of pulsar timing. That is, it only depends on local metric perturbations! That is extremely good for scaling and precision.

In that same forum, I spoke for the first time ever about the correlation function estimators I have been developing with Kate Storey-Fisher (NYU). I spoke extemporaneously—it's a discussion forum—but I realized that we do have a great story to tell. It includes context from the Landy-Szalay estimator world and context from the linear-fitting world. Plus some information theory for spice! It is a great audience at CITA and they helped me sharpen my case well.
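For context, the Landy-Szalay estimator we keep referring to is xi = (DD - 2 DR + RR) / RR, built from normalized pair counts of data-data, data-random, and random-random separations in bins. Here is a one-dimensional toy implementation (mine, for illustration, not our new estimator) that correctly returns xi near zero for unclustered data:

```python
import numpy as np

# Landy-Szalay in 1-d: xi = (DD - 2 DR + RR) / RR with normalized
# pair counts in bins of separation.

def cross_pairs(a, b, bins):
    """Histogram of separations over all cross pairs of a and b."""
    seps = np.abs(a[:, None] - b[None, :]).ravel()
    return np.histogram(seps, bins=bins)[0].astype(float)

def auto_pairs(a, bins):
    """Histogram of separations over unique pairs (i < j) of a."""
    i, j = np.triu_indices(len(a), k=1)
    return np.histogram(np.abs(a[i] - a[j]), bins=bins)[0].astype(float)

def landy_szalay(data, rand, bins):
    nd, nr = len(data), len(rand)
    dd = auto_pairs(data, bins) / (nd * (nd - 1) / 2.0)
    rr = auto_pairs(rand, bins) / (nr * (nr - 1) / 2.0)
    dr = cross_pairs(data, rand, bins) / (nd * nr)
    return (dd - 2.0 * dr + rr) / rr

rng = np.random.default_rng(3)
bins = np.linspace(0.0, 0.5, 6)
rand = rng.uniform(0.0, 1.0, size=2000)
data = rng.uniform(0.0, 1.0, size=500)   # unclustered, so xi should be ~0
xi = landy_szalay(data, rand, bins)
```

The linear-fitting and information-theory context in our story is about what this construction is implicitly estimating and what better estimators might exist.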

A highlight of a long day of conversations was a chat with Katie Breivik (CITA) about binary population synthesis. She is interested in predicting gravitational-wave sources. But the issues are general. We discussed which aspects of the theory are weakest, and where we might be able to patch in a data-driven replacement. That conversation has only just started, but it's something I want to bring home to NYC and think more about.


listening at Toronto Physics

I spent the day today at CITA and UofT Physics in Toronto. The CITA Seminar was given by Alexander van Engelen (CITA), who spoke about the things we can learn from the CMB in the near future. He emphasized that there are still interesting things to learn about the primary CMB, which violates some beliefs I held prior to the talk! But he also put a lot of emphasis on the lensing or convergence map, which can be combined with other tracers to do a lot of science.

I had so many great conversations and discussions, too many to describe! But some highlights included the following: I chatted with Patrick Breysse (CITA) about testing cross-correlations and self-calibration for line-intensity mapping experiments with toy models. He has some nice ideas there. I chatted with visitor Deyan Mihaylov (Cambridge) about the possibility that Gaia might detect gravitational radiation! Bart Netterfield (Toronto) talked about very precisely pointed balloon-borne optical telescope experiments. And Chris Thompson (CITA) had all sorts of crazy ideas about what might cause the fast radio bursts. His principal ideas involve cosmic strings and black holes!

I gave the UofT Physics Colloquium. I spoke about how Gaia and other kinematic surveys can measure the dark matter. I talked about the results that Ana Bonaca (Harvard) and Adrian Price-Whelan (Princeton) and Charlie Conroy (Harvard) and I will have on the arXiv on Monday!



a brand-new talk; galaxy granularity

I spent the day today at University of Waterloo, where I gave the Astro seminar. It was a great day! I prepared my talk on the bus from Toronto, which wasn't good from a nausea perspective! But I really find I give a better talk if I remake it from scratch before I give it. That is, old talks get stale, at least for me. So I have a brand-new talk about machine learning and data-driven models and criticisms thereof.

Before my talk, in the astro-ph discussion, and after my talk, with James Taylor (Waterloo) and with Mike Hudson (Waterloo), there were good ideas flowing about how to use galaxy morphologies and in particular galaxy granularity to determine galaxy distances and maybe also gravitational-lensing shear. This relates to photometric redshifts and also my ideas about making adversarial galaxies that don't reveal their shear via their ellipticities (or not strongly). Many other great conversations; too many to mention!

My visit ended with quality time with Dustin Lang (Perimeter), who always makes my day.


finding moons indirectly

The only research I personally did today was stressing out about the talks at Waterloo and Toronto that I haven't even started to prepare! And that isn't research either. However, Apurva Oza (Bern) gave a nice talk about sodium and potassium in the Solar System and in extra-solar systems. He pointed out that the outgassing / volcanism of Io means that there is a gas ring around Jupiter that might be visible in transit spectroscopy, and might permit the detection of moons even when there aren't visible moon transits. Or might confuse transit spectroscopy. In some cases the ring is partial and follows the moon, so it would lead to a predictable time-domain spectroscopic signal, in principle. Worth a search!


finding planets near resonances

At breakfast, I had a long discussion with Megan Bedell (Flatiron) about what things should go into the discussion part of our wobble paper, in terms of the limitations and extensions of the model. We came up with quite a list! But I love any project that opens new paths.

I also had a long discussion with Rodrigo Luger (Flatiron) about searching for planets in Kepler data that are in 1:1 resonances. He is focused on the point that they will (in general) have large transit-timing variations. I would call these librations around their exact resonances. If we model these librations as approximately sinusoidal, the search space is tractable: A fixed period plus a TTV with some amplitude and period. That's a good idea! And Luger points out that there are strong priors on the amplitudes and periods of the librations. Of course there will be systems that even this setup will miss; there was a dispute between us and Foreman-Mackey (Flatiron) about what fraction. He argued for using a completely stochastic model for the librations. He might be right; but baby steps!
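A sketch of that search parameterization (the function and parameter names are mine, not Luger's actual code): transit times follow a linear ephemeris plus a sinusoidal TTV, so each candidate is described by just five numbers, (t0, P, A, P_lib, phi):

```python
import numpy as np

# Sinusoidal-libration model for a 1:1-resonance search: transit times
# are a linear ephemeris plus a sinusoidal transit-timing variation with
# libration amplitude amp and libration period p_lib.

def transit_times(n, t0, period, amp, p_lib, phase):
    """Times of the n-th transits: linear ephemeris plus sinusoidal TTV."""
    n = np.asarray(n, dtype=float)
    return t0 + n * period + amp * np.sin(2.0 * np.pi * n * period / p_lib + phase)

# A planet with a 10-day period librating with 0.1-day amplitude on a
# 300-day libration period (numbers invented for illustration):
n = np.arange(20)
t = transit_times(n, t0=0.0, period=10.0, amp=0.1, p_lib=300.0, phase=0.0)
```

Foreman-Mackey's stochastic alternative would replace the sinusoid with a flexible (say, Gaussian-process) TTV model; the sinusoid is the baby step.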

All this was motivated by the possible discovery of a 1:1 by Mitchell Karmen (NYU). Of course the actual system he found almost certainly isn't a 1:1, we now think: It has many signs of “just” being an incredibly eccentric eclipsing binary with dilution from a third star.



Fourier transforms; R-M fits

I spent a good part of the day working through Fourier transform issues with Kate Storey-Fisher (NYU). We started out confused both about what the transform should be giving us and how to run the code correctly. So we switched to Gaussian functions, for which we know the correct answer, and at least understood the interface. Now to understand the correlation function!
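The sanity check, roughly as one would run it (this is my reconstruction of the exercise, not our actual code): with the convention F(k) = integral of f(x) exp(-i k x) dx, a unit-variance Gaussian transforms to sqrt(2 pi) exp(-k^2 / 2), and the discrete FFT, with the usual fftshift gymnastics, should reproduce that to near machine precision:

```python
import numpy as np

# Fourier transform of a Gaussian is a Gaussian: test FFT conventions
# against the known answer. Convention: F(k) = int f(x) exp(-i k x) dx.

n, dx = 1024, 0.05
x = (np.arange(n) - n // 2) * dx
f = np.exp(-0.5 * x**2)                      # sigma = 1 Gaussian, centered

# ifftshift moves x = 0 to index 0; fftshift reorders k to ascending;
# dx converts the discrete sum into an approximation of the integral.
k = 2.0 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, dx))
F = dx * np.fft.fftshift(np.fft.fft(np.fft.ifftshift(f))).real

F_analytic = np.sqrt(2.0 * np.pi) * np.exp(-0.5 * k**2)
err = np.max(np.abs(F - F_analytic))         # should be tiny
```

Once the conventions check out on the Gaussian, the same scaffolding can be trusted on functions where we don't know the answer.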

In group meeting, two threads came together today. Bedell (Flatiron) asked for feedback about how to present wobble results for maximum impact. And Luger (Flatiron) took some of those wobble radial velocities and fit them with a model for the Rossiter-McLaughlin effect made by punking his own STARRY code for modeling photometric transits.


not much!

In a day obliterated by letters of recommendation, Humzah Kiani and I discussed extinctions in the Gaia footprint, and Kate Storey-Fisher and I discussed the Gaussian random fields we have been trying to simulate.