2019-05-30

multi-messenger events and software

Today was a one-day workshop on multi-messenger astrophysics to follow yesterday’s one-day workshop on physics and machine learning. There were interesting talks and discussions all day, and I learned a lot about facilities, operations, and plans. Two little things that stood out for me were the following:

Collin Capano (Hannover) spoke about his work on detecting sub-threshold events in LIGO using coincidences with other facilities, but especially NASA Fermi. He made some nice Bayesian points about how, at fixed signal-to-noise, the maximum possible confidence in such coincidences grows with the specificity and detail (predictive power) of the event models. This has important consequences for things we have been discussing at NYU in our time-domain meeting. But Capano also implicitly made a strong argument that projects cannot simply release catalogs or event streams: By definition, the sub-threshold events require the combination of probabilistic information from multiple projects. For example, in his own projects, he had to re-process the Fermi GBM photon stream. Those considerations, about needing access to full likelihood information, have very important implications for all new projects and especially LSST.
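Here is a tiny numerical cartoon of that point (the two-detector setup and all the numbers are invented by me; this is not Capano's pipeline): events that fall below any single project's catalog threshold can still be pulled out once you are allowed to multiply the projects' likelihood ratios together.

```python
# Toy: combining per-project likelihood ratios recovers "sub-threshold"
# coincidences that a hard single-project catalog cut would throw away.
import numpy as np

rng = np.random.default_rng(42)
n_noise, n_signal = 100_000, 50
mu = 2.5  # per-detector mean shift for true events: well below a 5-sigma cut

# detection statistics in two independent projects (think GW strain and GRB photons)
x_gw = np.concatenate([rng.normal(0, 1, n_noise), rng.normal(mu, 1, n_signal)])
x_grb = np.concatenate([rng.normal(0, 1, n_noise), rng.normal(mu, 1, n_signal)])
is_signal = np.concatenate([np.zeros(n_noise, bool), np.ones(n_signal, bool)])

def log_lr(x):
    # log likelihood ratio, signal N(mu, 1) versus noise N(0, 1)
    return mu * x - 0.5 * mu ** 2

combined = log_lr(x_gw) + log_lr(x_grb)   # requires both projects' likelihoods
catalog_cut = x_gw > 5.0                  # what a thresholded event catalog keeps

top = np.argsort(combined)[::-1][:n_signal]
print("true events among the top-ranked combined candidates:", is_signal[top].sum())
print("true events surviving the single-project catalog cut:", (is_signal & catalog_cut).sum())
```

The combined ranking finds a decent fraction of the injected events; the thresholded catalog finds essentially none. That is the whole argument for releasing likelihood information rather than cuts.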

Daniela Huppenkothen (UW) put into her talk on software systems some comments about ethics: Is there a role for teaching and training astronomers in the ethical aspects of software or machine learning? She gave the answer “yes”, focusing on the educational point that we are launching our people into diverse roles in science and technology. I spoke to her briefly after her talk about telescope scheduling and operations: Since telescope time is a public good and zero-sum, we are compelled to use it wisely, and efficiently, and transparently, and (sometimes) even explainably. And with good legacy value. That’s a place where we need to develop ethical systems, at least sometimes. All that said, I don’t think much thought has gone into the ethical aspects of experimental design in astronomy.

2019-05-29

#PhysML19 and likelihoods, dammit

Today was a one-day workshop on Physics in Machine Learning. Yes, you read that right! The idea was in part to get people to draw out how physics and physical applications have been changing or influencing machine-learning methods. And it is the first in a pair of one-day workshops today and tomorrow run by Josh Bloom (Berkeley). There were many great talks and I learned a lot. Here are two solipsistically chosen highlights:

Josh Batson (Chan Zuckerberg Biohub) gave an absolutely great talk. It gave me an epiphany! He started by pointing out that machine-learning methods often fit the mean behavior of the data but not the noise. That's magic, since we haven't said what part is the noise! He then went on to talk about projects noise2noise and noise2self in which the training labels are noisy: In the first case the labels are other noisy instances of the same data, and in the second the labels are the same data! That's a bit crazy. But he showed (and it's obvious when you think about it) that de-noising can work without any external information about what a non-noisy datum would look like. We do that all the time with median filters and the like! But he gave a beautiful mathematical description of the conditions under which this is possible (they are relatively easy to meet; it involves the noise being conditionally independent of the signal, as in causal-inference contexts). The stuff he talked about is potentially very relevant to The Cannon (it probably explains why we are able to de-noise the training labels) and to EPRV (if we think of the stellar variability as a kind of noise).
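Because I keep wanting to explain this to myself, here is a minimal noise2self-flavored sketch (a toy of my own, not Batson's code): if the noise is independent across pixels given the signal, then a denoiser that predicts each pixel without looking at it can be tuned using only the noisy data as its own target.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 500)
truth = np.sin(x)
noisy = truth + rng.normal(0, 0.5, x.size)

def j_invariant_smooth(y, width):
    # predict each sample from a window of neighbors, excluding the sample itself
    out = np.empty_like(y)
    for i in range(y.size):
        lo, hi = max(0, i - width), min(y.size, i + width + 1)
        idx = np.r_[lo:i, i + 1:hi]
        out[i] = y[idx].mean()
    return out

for width in (1, 5, 20, 80):
    pred = j_invariant_smooth(noisy, width)
    self_loss = np.mean((pred - noisy) ** 2)   # computable with no ground truth at all
    true_loss = np.mean((pred - truth) ** 2)   # only knowable here because it's a simulation
    print(f"width={width:3d}  self-supervised loss={self_loss:.3f}  true loss={true_loss:.3f}")
# up to the (constant) noise variance, the self-supervised loss tracks the true
# loss, so the smoothing scale can be chosen without ever seeing a clean datum
```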

Francois Lanusse (Berkeley) and Soledad Villar (NYU) gave talks about using deep generative models to perform inferences. In the Lanusse talk, he discussed the problem that we astronomers call deblending of overlapping galaxy images in cosmology survey data (like LSST). Here the generative model is constructed because (despite 150 years of trying) we don't have good models for what galaxies actually look like, or certainly not any likelihood function in image space! He showed a great generative model and some nice results; it is very promising. I was pleased that he strongly made the point that GANs and VAEs don't use proper likelihood functions when they are trained, and that shortcoming can lead to serious problems when you use them for inference. In particular, GANs are very dangerous because of what is called “mode collapse”: the problem that you can generate only part of the data space and still do well under the GAN objective. That could strongly bias deblending, because it would put vanishing probability in real parts of the space. So he deprecated those methods and recommended methods (like normalizing flows) that have proper likelihood formulations. That's an important, subtle, and deep point.
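To make the likelihood point concrete, here is the change-of-variables bookkeeping in one dimension (a sketch under my own toy assumptions; real flows stack many learned, invertible layers, but the likelihood arithmetic is identical):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(3.0, 0.5, 1000)          # stand-in for the quantities we want to model

def log_likelihood(params, x):
    mu, log_s = params
    z = (x - mu) * np.exp(-log_s)          # invertible map from data space to base space
    log_det = -log_s                       # log |dz/dx|, the Jacobian term
    log_pz = -0.5 * z ** 2 - 0.5 * np.log(2 * np.pi)   # standard-normal base density
    return np.sum(log_pz + log_det)        # exact log p(x): no adversary, no ELBO gap

# crude grid-search "training" by maximizing the exact likelihood
grid_mu = np.linspace(2.0, 4.0, 81)
grid_ls = np.linspace(-2.0, 1.0, 81)
best = max(((m, s) for m in grid_mu for s in grid_ls),
           key=lambda p: log_likelihood(p, data))
print("fit mean and sigma:", best[0], np.exp(best[1]))
```

A GAN trained on the same data hands you samples but no number like log_likelihood to evaluate, which is exactly why it can drop modes without being penalized.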

After Lanusse's talk, I came to see him about the point that if LSST implements any deblender of this type (fit a model, deliver galaxies as posterior results from that model), the LSST Catalog will be unusable for precise measurements! The reason is technical: A catalog must deliver likelihood information, not posterior information, if the catalog is to be used in down-stream analyses. This is related to a million things that have appeared on this blog (okay not a million) and in particular to the work of Alex Malz (NYU): Projects must output likelihood-based measurements, likelihoods, and likelihood functions to be useful. I can't say this strongly enough. And I am infinitely pleased that ESA Gaia has done the right thing.

2019-05-28

snail models; platykurtic galaxies

Suroor Gandhi (NYU) made some really nice plots for me in real time today of what a swarm of stars in phase space does over time. Her plots are for the vertical dynamics of the disk. The very exciting thing is that she can reproduce the qualitative properties of The Snail (the phase-space spiral in the local Milky Way disk). Now we have to look at dependence on initial conditions, time of evolution, and potential parameters.

Dustin Lang (Perimeter) and I spoke for a bit about our old project modeling simple galaxy profiles with mixtures of concentric Gaussians. He is trying to build a truly continuous, interpolate-able model for all Sersic indices, and of course (well it wasn't obvious to me before today) at a Sersic index of 0.5, the galaxy profile is exactly a Gaussian, so a mixture of Gaussians makes no sense. And then at indices less than 0.5, the distribution becomes platykurtic, so it can't be fit with a concentric mixture of positive Gaussians. What to do there? Lang wants to add in negative Gaussians! I want to say “don't go there”.
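A quick numerical check of why 0.5 is the wall (schematic b_n, so only the shape matters): the log of a Sersic profile is exactly quadratic in r at n = 0.5, and for n < 0.5 its curvature with respect to r^2 goes negative, which no positive mixture of concentric Gaussians (all of which are log-convex in r^2) can reproduce.

```python
import numpy as np

def log_sersic(r, n, b=2.0):   # b_n is schematic here; only the r-dependence matters
    return -b * r ** (1.0 / n)

r = np.linspace(0.1, 2.0, 200)
u = r ** 2
for n in (1.0, 0.5, 0.4):
    curv = np.gradient(np.gradient(log_sersic(r, n), u), u)   # d^2(log I)/d(r^2)^2
    print(f"n={n}: curvature in r^2 spans [{curv.min():+.3f}, {curv.max():+.3f}]")
# n > 0.5: positive curvature (peakier than a Gaussian, at least consistent with
#          positive concentric Gaussians);
# n = 0.5: zero curvature, i.e. exactly a Gaussian;
# n < 0.5: negative curvature, the platykurtic case that breaks the method
```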

2019-05-27

proposal writing

I spent the long weekend, including today, working on my proposal (with Bedell) for the NASA XRP (exoplanets) competition called something like “Extreme-precision radial-velocity in the presence of intrinsic stellar variability”. We propose to do things around modeling the spectra in the joint domain of time and wavelength to improve radial-velocity precision. But we also have an information-theoretic or conceptual part of the proposal. Does NASA support conceptual work? We hope so!

2019-05-24

LIGO housekeeping data, MCMC

Yesterday I gave a talk about data science at Oregon Physics. Today I talked about dark matter—on the chalk board. I talked about various bits of vapor-ware that we are doing with ESA Gaia and streams and The Snail. I also talked about the GD-1 perturbation found by Bonaca and Price-Whelan. That was followed by an excellent and fun lunch with the graduate students, in which they interviewed me about my career and science. Oregon Physics has a great PhD cohort.

In the afternoon, Ben Farr (Oregon) and I hived off to a rural brewery to discuss LIGO systematics and MCMC sampling. I have fantasies about calibrating the LIGO strain data using the enormous numbers of housekeeping channels that are recorded simultaneously with the strain. Farr was encouraging, in that he does not believe that big models of this sort have been seriously done inside the Consortium. That means there might be a role for me or for a collaboration that includes me.
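Just so I remember what I mean by that fantasy, here is the crudest possible cartoon (invented channels and couplings; this is not an existing LIGO pipeline): regress the strain onto the simultaneously recorded housekeeping (witness) channels and subtract the predictable part.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_channels = 5000, 50
witness = rng.normal(size=(n_samples, n_channels))         # housekeeping channels
true_coupling = rng.normal(scale=0.1, size=n_channels)     # unknown couplings into the strain
signal = 1e-2 * np.sin(2 * np.pi * 0.01 * np.arange(n_samples))   # the astrophysics
strain = signal + witness @ true_coupling + rng.normal(scale=0.05, size=n_samples)

# ordinary least squares for the couplings; a serious version would be
# regularized, frequency-dependent, and marginalized, not point-estimated
coeff, *_ = np.linalg.lstsq(witness, strain, rcond=None)
cleaned = strain - witness @ coeff

print("rms before cleaning:", strain.std())
print("rms after cleaning: ", cleaned.std())
```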

On the MCMC front, we discussed a few different sampling ideas. One is a project by Farr called kombine, which is an ensemble sampler that uses the ensemble to inform an approximation to the posterior, which in turn informs the sampling. Another is vapor-ware by me called bento box, which hierarchically splits your problem into a tree of disjoint problems until you get to a set of unimodal problems that are individually trivial. I realized while I was talking that I could even use HMC with reflection moves to simplify the problem at the hard boundaries of the boxes in the bento box.
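The reflection move is simple enough to write down directly; here is a one-dimensional sketch (my toy target and box, not kombine and not any real bento-box code): run leapfrog as usual, and whenever a step crosses a wall, fold the position back inside and flip the momentum.

```python
import numpy as np

rng = np.random.default_rng(4)
lo, hi = 0.0, 1.0                           # hard walls of one box

def neg_logp(q):
    return 0.5 * ((q - 0.8) / 0.1) ** 2     # a N(0.8, 0.1) target, truncated to the box

def grad_neg_logp(q):
    return (q - 0.8) / 0.1 ** 2

def reflect(q, p):
    # elastic reflection off a hard wall preserves the Hamiltonian
    while q < lo or q > hi:
        if q < lo:
            q, p = 2 * lo - q, -p
        else:
            q, p = 2 * hi - q, -p
    return q, p

def leapfrog(q, p, eps=0.02, steps=20):
    p = p - 0.5 * eps * grad_neg_logp(q)
    for i in range(steps):
        q = q + eps * p
        q, p = reflect(q, p)                # the only change relative to plain HMC
        if i < steps - 1:
            p = p - eps * grad_neg_logp(q)
    p = p - 0.5 * eps * grad_neg_logp(q)
    return q, p

q, samples = 0.5, []
for _ in range(2000):
    p = rng.normal()
    q_new, p_new = leapfrog(q, p)
    dH = (0.5 * p_new ** 2 + neg_logp(q_new)) - (0.5 * p ** 2 + neg_logp(q))
    if np.log(rng.uniform()) < -dH:         # Metropolis correction keeps the sampler exact
        q = q_new
    samples.append(q)
print("sample mean and sd:", np.mean(samples), np.std(samples))
```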

On the drive back to the airport, we found that we agreed on the point that no-one should ever compute a fully marginalized likelihood. That was refreshing; Farr is one of the very few Bayesians I know who get this point. It inspired me to spend my time at the airport tinkering with the paper I want to write on this subject.

2019-05-23

Oregon, falsifiability, and the LIGO project

Today was my first day of a two-day visit to Ben Farr (Oregon) and the University of Oregon. I got lots of work done during the travel phases of the day, because I have a NASA proposal due while I'm here in Oregon! Nothing like a deadline.

I had a great day. Highlights included a discussion with James Schombert (Oregon) about various philosophical matters related to falsification. He explicitly brought up my paper about plausibility and science, which I had nearly forgotten! It's nice to know that people are finding it useful still. I really wrote it to get some things off my chest, things that had been troubling me since graduate school in the 1990s. In that paper I argue that we prefer theories that are both observationally reasonable and also theoretically reasonable; there isn't really such a thing as purely empirical falsification. At least not in the observational sciences.

But of course the main theme of my visit was LIGO. The lure of discussing this project with Farr is what brought me here. We postponed our ideas for new projects until tomorrow and, maybe surprisingly, spent our time talking about university-based project management! Because although LIGO is well funded to build hardware and deliver strain measurements, what is done with those to detect and characterize systems and populations is left to the science community, which is a looser collaboration, and which must raise most of its money externally. And, like the SDSS family of projects, it relies on essentially volunteer efforts from many ornery faculty. That's an interesting set of problems in organizational management, psychology, and political science!

2019-05-22

radial velocities from slit spectra

Marla Geha (Yale) made a surprise visit to Flatiron today, and bombed the weekly Stars and Exoplanets Meeting with a discussion of the challenges of measuring velocity dispersions (and hence masses, and hence dark-matter-annihilation limits) in ultra-faint dwarf galaxies in the halo of the Milky Way. As my loyal reader knows, this problem is very similar to problems we are working on at Flatiron around extreme-precision radial-velocity (EPRV) spectroscopy. Geha's problem is both harder and easier. It is easier because she only needs km/s (not cm/s) precision. It is harder because she has to use a slit spectrograph and point it at very faint stars! It is easier because she has both sky emission lines and telluric absorption lines to help calibrate. It is harder because differences in slit illumination mean that the sky lines and the telluric absorption don't agree for the wavelength calibration!

After stars meeting, the conversation continued among Geha, Bedell (Flatiron), and me. We discussed many things, including the point that the offset between tellurics and sky lines is a wavelength offset, not a radial-velocity offset. Or it is even something more sophisticated, related to the spectrograph optics. We discussed the point that her problems are fundamentally hierarchical, because some parameters are associated with a star, some with an exposure, some with a slit-mask, some with a time, and so on. We also discussed how the wobble framework that Bedell and I have built could be extended to capture these effects. It's certainly possible. Oh I nearly forgot: We also discussed masking and apodization of sky lines and telluric lines in the science spectra, and how to do that without biasing down-stream measurements. Spergel (Flatiron) pointed us to some literature that he was pleased to say is older than any of us (Spergel included).
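The wavelength-offset versus velocity-offset distinction is worth a number or two (the offset here is made up, but the arithmetic is just c Δλ / λ): a fixed wavelength shift masquerades as a velocity error that changes across the spectrum, whereas a true Doppler shift is a single multiplicative stretch.

```python
import numpy as np

c = 299792.458        # speed of light, km/s
dlam = 0.01           # a constant 0.01-Angstrom calibration offset (invented)
for lam in (4000.0, 6000.0, 9000.0):      # Angstroms
    v_apparent = c * dlam / lam * 1e3     # m/s
    print(f"lambda = {lam:6.0f} A  ->  apparent velocity error = {v_apparent:6.1f} m/s")

# a genuine 30 km/s radial velocity, by contrast, displaces lines by lambda * v / c:
for lam in (4000.0, 9000.0):
    print(f"lambda = {lam:6.0f} A  ->  Doppler displacement = {lam * 30.0 / c:.2f} A")
```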

I should say that Geha's admirable goal is to re-reduce all of the nearly 10^5 stellar spectra in the DEIMOS archive! Now that's my kind of project.

2019-05-21

imaging asteroseismic modes on the stellar surface

Many threads of conversation over the past weeks came together today in a set of coincidences. Conversations with Bedell (Flatiron), Pope (NYU), Luger (Flatiron), and Farr (Flatiron) ranging around stochastic processes and inferring stellar surface features from Doppler imaging all overlap at stellar asteroseismic p modes: In principle, with high-resolution, high-signal-to-noise stellar spectral time series (and we have these, in hand!), we should be able not only to see p modes but also see their footprint on the stellar surface. That is, directly read ell and em off the spectral data. In addition, we ought to be able to see the associated temperature variations. This is all possible because the stars are slowly rotating, and each mode projects onto the rotating surface differently. Even cooler than all this: Because the modes are coherent for days in the stars we care about, we can build very precise matched filters to combine the data coherently from many exposures. There are many things to do here.
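The matched-filter part is easy to sketch (mode frequency, amplitude, and cadence all invented here): because the mode stays phase-coherent across exposures, demodulating the RV time series at the mode frequency averages down the per-exposure noise coherently.

```python
import numpy as np

rng = np.random.default_rng(5)
nu = 3.1e-3                        # mode frequency in Hz, roughly solar-like (assumed)
amp, sigma = 0.3, 1.0              # 0.3 m/s mode buried under 1 m/s per-exposure noise
t = np.sort(rng.uniform(0, 3 * 86400, 400))     # 400 exposures over three days
rv = amp * np.cos(2 * np.pi * nu * t) + rng.normal(0, sigma, t.size)

# coherent demodulation at the known frequency: the matched filter for a sinusoid
z = np.mean(rv * np.exp(-2j * np.pi * nu * t))
print("recovered amplitude:", 2 * np.abs(z), "m/s  (injected:", amp, ")")
print("expected noise on that estimate:", sigma * np.sqrt(2.0 / t.size), "m/s")
```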

2019-05-20

predicting one population of transients from another

Tyler Pritchard (NYU) convenes a meeting on Mondays at NYU to discuss time-domain astrophysics. Today we had a discussion of a very simple idea: Use the rates of short GRBs that are observed, and that are measured (using physical models from the MacFadyen group at NYU) to have certain jet–observer offset angles, to infer rates for all the way-off-axis events that won't be GRB triggers but might be seen in LSST or other ground-based optical or radio surveys. Seems easy, right? It turns out it isn't trivial at all, because the extrapolation of a few well-understood events in gamma-rays, subject to gamma-ray selection effects, to a full population of optical and radio sources (and then assessing those selection effects) requires quite a few additional or auxiliary assumptions. This is even more true for the bursts where we don't know redshifts. I was surprised to hear myself use the astronomy word "V-max"! But we still (as a group) feel like there must be low-hanging fruit. And this is a great application for the MacFadyen-group models, which predict brightness as a function of wavelength, time, and jet–observer angle.
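Since I said the word, here is the V-max reflex in its most naive form (Euclidean geometry, invented flux limit and luminosities; a real version needs cosmology, jet angles, and the gamma-ray selection function): each detected event contributes one over the volume within which it could have been detected.

```python
import numpy as np

rng = np.random.default_rng(6)
f_lim = 1e-7                           # survey flux limit, erg/s/cm^2 (invented)
L = 10 ** rng.uniform(49, 51, 30)      # luminosities of the detected events, erg/s (invented)
T = 10.0                               # years of surveyed time (invented)
cm_per_mpc = 3.086e24

# maximum (Euclidean, no-cosmology) distance and volume for each event,
# capped at a rough horizon so the toy stays vaguely sane
d_max = np.sqrt(L / (4 * np.pi * f_lim)) / cm_per_mpc
d_max = np.minimum(d_max, 4000.0)      # Mpc
v_max = (4.0 / 3.0) * np.pi * d_max ** 3

rate_density = np.sum(1.0 / v_max) / T
print(f"inferred rate density ~ {rate_density:.2e} events per Mpc^3 per yr")
```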

2019-05-17

hierarchical probabilistic calibration

Today Lily Zhao (Yale) visualized for me some of the calibration data they have for the EXPRES spectrograph at Yale. What she showed is that the calibration does vary at very high signal-to-noise, and that the variations are systematic or smooth. That is, the instrument varies only a tiny tiny bit, but it does so very smoothly and the smooth variations are measured incredibly precisely. This suggests that it should be possible to pool data from many calibration exposures to build a better calibration model for every exposure than we could get if we treated the data all independently.
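In toy form, the pooling argument looks like this (drift model and noise levels invented): because the drift is smooth in time, a model fit across all the calibration exposures beats each exposure's own calibration, even though every individual measurement already looks precise.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 30, 60)                          # 60 calibration exposures over 30 days
true_drift = 5.0 + 0.3 * t + 0.05 * np.sin(t)       # slow, smooth instrument drift (invented)
per_exposure = true_drift + rng.normal(0, 0.5, t.size)   # independent per-exposure estimates

# pool with a low-order polynomial; a real version might prefer a Gaussian process
design = np.vander(t, 4)
coeff, *_ = np.linalg.lstsq(design, per_exposure, rcond=None)
pooled = design @ coeff

print("rms calibration error, exposures treated independently:",
      np.sqrt(np.mean((per_exposure - true_drift) ** 2)))
print("rms calibration error, pooled smooth model:            ",
      np.sqrt(np.mean((pooled - true_drift) ** 2)))
```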

Late in the day, we drew a graphical model for the calibration, and worked through a possible structure. As my loyal reader knows, I want to go to full two-dimensional modeling of spectrographs! But we are going to start with measurements made on one-dimensional extractions. That's easier for the community to accept right now, anyways!


2019-05-16

forecasting tools; beautiful spectrograph calibration

Our five-person (Bedell, Hogg, Queloz, Winn, Zhao) exoplanet meeting continued today, with Winn (Princeton) working out the elements needed to produce a simulator for a long-term EPRV monitoring program with simple observing rules. He is interested in working out under what circumstances such a program can be informative about exoplanets in regimes that neither Kepler nor existing EPRV programs have strongly constrained, like near-Earth-masses on near-Earth-orbits around near-Sun stars. And indeed we must choose a metric or metrics for success. His list of what's needed, software-wise, is non-trivial, but we worked out that every part of it would be a publishable contribution to the literature, so it could be a great set of projects. And a very useful set of tools.
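The smallest possible version of such a simulator fits in a few lines (cadence, noise, and planet all invented by me; the real tool needs scheduling rules, stellar variability, and honest success metrics): draw an observing schedule, inject a circular Earth-analog orbit, and ask how well a fit at the known period recovers it.

```python
import numpy as np

rng = np.random.default_rng(8)
nights = np.arange(0, 10 * 365)                                  # a ten-year program
t = nights[rng.uniform(size=nights.size) < 0.4].astype(float)    # weather + scheduling losses

period, K, sigma = 365.25, 0.09, 0.3    # days, m/s, m/s per-night noise (all assumed)
phase = 2 * np.pi * t / period
rv = K * np.sin(phase) + rng.normal(0, sigma, t.size)

# linear least squares for the two quadratures (plus an offset) at the known period
A = np.vstack([np.sin(phase), np.cos(phase), np.ones_like(t)]).T
coeff, *_ = np.linalg.lstsq(A, rv, rcond=None)
cov = sigma ** 2 * np.linalg.inv(A.T @ A)
K_hat = np.hypot(coeff[0], coeff[1])
K_err = np.sqrt(cov[0, 0])              # rough: ignores the quadrature covariance
print(f"recovered K = {K_hat:.3f} +/- {K_err:.3f} m/s (injected {K} m/s)")
```

Swapping in different scheduling rules, stars, and metrics is where the real (and publishable) work would be.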

Zhao (Yale) showed me two-dimensional calibration data from the EXPRES instrument illuminated by their laser-frequency comb. It is astounding. The images are beautiful, and every single line in each image is at a perfectly known (from physics!) absolute wavelength. This might be the beginning of a very new world. The instrument is also beautifully designed so that all the slit (fiber, really, but it is a rectangular fiber) images are almost perfectly aligned with one of the CCD directions, even in all four corners of the image. Not like the spectrographs I'm used to!

2019-05-15

do we need to include the committee in our model?

Josh Winn (Princeton) and Lily Zhao (Yale) both came in to Flatiron for a couple of days today to work with Megan Bedell (Flatiron), Didier Queloz (Cambridge), and me. So we had a bit of a themed Stars and Exoplanets Meeting today at Flatiron. Winn talked about various ways to measure stellar obliquities (that is, angles between stellar-rotation angular momentum vectors and planetary system angular-momentum vectors). He has some six ways to do it! He talked about statistical differences between vsini measurements for stars with and without transiting systems.

Zhao and Queloz talked about their respective big EPRV programs to find Earth analogs in radial-velocity data. Both projects need to get much more precise measurements, and observe fewer stars (yes fewer) for longer times. That's the direction the field is going, at least where it concerns discovery space. Queloz argued that these are going to be big projects that require patience and commitment, and that it is important for new projects to control facilities, not just to apply for observing time each semester! And that's what he has with the Terra Hunting Experiment, in which Bedell, Winn, and I are also partners.

Related to all that, Zhao talked about how to make an observing program adaptive (to increase efficiency) without making it hard to understand (for statistical inferences at the end). I'm very interested in this problem! And it relates to the Queloz point, because if a time allocation committee is involved every semester, any statistical inferences about what was discovered would have to model not just the exoplanet population but also the behavior of the various TACs!

2019-05-14

normalizing flows; information theory

At lunchtime I had a great conversation with Iain Murray (Edinburgh) about two things today. One was new ideas in probabilistic machine learning, and the other was this exoplanet transit spectroscopy challenge. On the former, he got me excited about normalizing flows, which use machine-learning methods (like deep learning) and a good likelihood function to build probabilistic generative models for high-dimensional data. These could be useful for astronomical applications; we discussed. On the latter, we discussed how transits work and how sunspots cause trouble for them. And how the effects might be low dimensional. And thus how a good machine-learning method should be able to deal with it or capture it.

In the afternoon I spent a short session with Rodrigo Luger (Flatiron) talking about the information about a stellar surface or about an exoplanet surface encoded in a photometric light curve. The information can come from rotation, or from transits, or both, and it is different (there is more information), oddly, if there is limb darkening! We talked about the main points such a paper should make, and some details of information theory. The problem is nice in part because if you transform the stellar surface map to spherical harmonics, a bunch of the calculations lead to beautiful trigonometric forms, and the degeneracy or eigenvector structure of the information tensor becomes very clear.
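Here is a one-dimensional (longitudes-only) cartoon of that limb-darkening point, entirely my own construction rather than anything from the Luger conversation: the light curve is a convolution of the surface map with a visibility kernel, so the Fisher information in each Fourier mode of the map is set by that kernel, and changing the limb darkening changes which modes carry any information at all.

```python
import numpy as np

phi = np.linspace(0, 2 * np.pi, 720, endpoint=False)     # surface longitudes
theta = np.linspace(0, 2 * np.pi, 90, endpoint=False)    # observed rotation phases
dphi = phi[1] - phi[0]
sigma = 1e-3                                              # photometric noise per epoch

def kernel(dx, u):
    mu = np.cos(dx)
    vis = np.clip(mu, 0.0, None)                          # foreshortening + visibility
    return vis * (1.0 - u * (1.0 - mu))                   # linear limb darkening u

basis = np.array([np.ones_like(phi)] +
                 [f(m * phi) for m in (1, 2, 3) for f in (np.cos, np.sin)])

for u in (0.0, 0.6):
    K = kernel(phi[None, :] - theta[:, None], u)          # (n_phase, n_longitude)
    J = K @ basis.T * dphi                                # d(flux)/d(mode amplitude)
    fisher = J.T @ J / sigma ** 2
    evals = np.sort(np.linalg.eigvalsh(fisher))
    print(f"u = {u}: Fisher eigenvalues from {evals[0]:.2e} to {evals[-1]:.2e}")
# with u = 0 the m = 3 modes sit in the null space (tiny eigenvalues); turning
# on limb darkening moves them into the measurable space
```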

2019-05-13

eclipsing binaries

I had a good conversation with Laura Chang (Princeton) today, who is interested in doing some work in the area of binary stars. We discussed the point that many of the very challenging things people have done with the Kepler data in the study of exoplanets (exoplanet detection, completeness modeling, population inferences) are very much easier in the study of eclipsing binary stars. And the numbers are very large: The total number of eclipsing binary systems found in the Kepler data is comparable to the total number of exoplanets found. And there are also K2 and TESS binaries! So there are a lot of neat projects to think about for constraining the short-period binary population with these data. We decided to start by figuring out what's been done already.

2019-05-08

Pheno 2019, day 3

I spent the day at Pheno 2019, where I gave a plenary about Gaia and dark matter. It was a fun day, and I learned a lot. For example, I learned that when you have a dark photon, you naturally get tiny couplings between the dark matter and the photon, as if the dark matter has a tiny charge. And there are good experiments looking for milli-charged particles. I learned that deep learning methods applied to LHC events are starting to approach information-theoretic bounds for classifying jets. That's interesting, because in the absence of a likelihood function, how do you saturate bounds? I learned that the Swampland (tm) is the set of effective field theories that can't be represented in any string theory. That's interesting: If we could show that there are many EFTs that are incompatible with string theory, then string theory has strong phenomenological content!

In the last talk of the day, Mangano (CERN) talked about the future of accelerators. He made a very interesting point, which I have kind-of known for a long time, but haven't seen articulated explicitly before: If you are doing a huge project to accomplish a huge goal (like build the LHC to find the Higgs), you need to design it such that you know you will produce lots and lots of interesting science along the way. That's an important idea, and it is a great design principle for scientific research.

2019-05-07

Snail

I spent a bit of research time today writing up my ideas about what we might do with The Snail (the local phase spiral in the vertical dynamics discovered in Gaia data) to infer the gravitational potential (or force law, or density) in the Milky Way disk. The idea is to model it as an out-of-equilibrium disturbance winding up towards equilibrium. My strong intuition (that could be wrong) is that this is going to be amazingly constraining on the gravitational dynamics. I'm hoping it will be better (both in accuracy and precision) than equilibrium methods, like virial theorem and Jeans models. I sent my hand-written notes to Hans-Walter Rix (MPIA) for comments.
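For my own memory, the cartoon version of the winding picture (a toy I made up on the spot; it is not the model in the notes I sent): displace a clump of stars in z, let them oscillate in an anharmonic vertical potential where the frequency falls with amplitude, and the clump shears into a one-armed spiral in (z, vz) whose pitch encodes both the potential and the time since the kick.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 20000
z = rng.normal(0.4, 0.1, n)                    # kpc: a clump kicked off the midplane
vz = rng.normal(0.0, 5.0, n)                   # km/s

# amplitude-dependent frequency, a stand-in for a realistic disk potential
E = 0.5 * vz ** 2 + 0.5 * (35.0 * z) ** 2      # (km/s)^2, with Omega_0 = 35 km/s/kpc
omega = 35.0 / (1.0 + E / 500.0)               # km/s/kpc; falls with amplitude (invented)

t_myr = 1000.0                                  # time since the perturbation
kms_per_kpc_to_per_myr = 1.0 / 977.8            # unit conversion for omega * t
angle0 = np.arctan2(vz / omega, z)
amp = np.hypot(z, vz / omega)
angle = angle0 + omega * t_myr * kms_per_kpc_to_per_myr
z_now = amp * np.cos(angle)
vz_now = amp * omega * np.sin(angle)
# a 2-D histogram of (z_now, vz_now) at this point shows a wound-up spiral: a Snail
```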

2019-05-06

not much

My only research events today were conversations with Eilers, Leistedt, and Pope about short-term strategies.

2019-05-03

Dr Alex Malz!

Today it was my great pleasure to participate in the PhD defense of my student Alex Malz (NYU). His dissertation is about probabilistic models for next-generation cosmology surveys (think LSST but also Euclid and so on). He showed that it is not trivial to store, vet, or use probabilistic information coming from these surveys, using photometric-redshift outputs as a proxy: The surveys expect to produce probabilistic information about redshift for the galaxies they observe. What do you need to know about these probabilistic outputs in order to use them? It turns out that the requirements are strong and hard. A few random comments:

On the vetting point: Malz showed with an adversarial attack that the ways cosmologists were comparing photometric-redshift probability outputs across different codes were very limited: His fake code that just always returned the prior pdf did as well on almost all metrics as the best codes.

On the requirements point: Malz showed that you need to know all the input assumptions and priors on any method in order to be able to use its output, especially if its output consists of posterior information. That is, you really want likelihood information, but no methods currently output that (and many couldn't even generate it because they aren't in the form of traditional inferences).

On the storage point: Malz showed that quantiles are far better than samples for storing a pdf! The results are very strong. But the hilarious thing is that the LSST database permits up to 200 floating-point numbers for storage of the pdf, when in fact the photometric redshifts will be based on only six photometric measurements! So, just like in many other surveys that I care about, the LSST Catalog will represent a data expansion, not a data reduction. Hahaha!
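A tiny numerical version of the quantiles-versus-samples point (my own toy distribution and reconstruction rule, not Malz's actual metrics): store the same number of floats both ways and see how well the pdf comes back.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
pdf = stats.gamma(a=3.0, scale=0.3)               # a stand-in skewed photo-z posterior
n_store = 20                                       # floats the database lets us keep

# quantile representation: z values at evenly spaced cumulative probabilities
quantiles = pdf.ppf(np.linspace(0.5 / n_store, 1 - 0.5 / n_store, n_store))
# sample representation: random draws from the same pdf
samples = pdf.rvs(n_store, random_state=rng)

z = np.linspace(0.01, 4.0, 500)

def reconstruct(points):
    # the same crude kernel-density reconstruction rule for both representations
    h = 1.06 * points.std() * points.size ** -0.2
    k = np.exp(-0.5 * ((z[:, None] - points[None, :]) / h) ** 2)
    return k.sum(axis=1) / (points.size * h * np.sqrt(2 * np.pi))

for name, pts in (("quantiles", quantiles), ("samples  ", samples)):
    err = np.trapz(np.abs(reconstruct(pts) - pdf.pdf(z)), z)
    print(f"{name}: integrated absolute error = {err:.3f}")
# with these (arbitrary) choices the quantile version typically reconstructs the
# pdf more faithfully, a low-budget echo of the dissertation's result
```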

It was a great talk, and in support of a great dissertation. And a great day.

2019-05-02

Dr Mandyam

Today I had the pleasure of serving on the PhD committee for Nitya Mandyam Doddamane, who defended her thesis on the measurement of star-formation rates and stellar masses in spectroscopic surveys of galaxies. She compared different stellar populations models, based on different parts of the galaxy spectral energy distributions, and galaxy environments, to make inferences about which galaxies are and aren't forming stars. She has some nice examples that use environment to break some degeneracies in interpretation. In that sense, some of what she did was a causal inference. She also looked at aperture biases, comparing fiber spectroscopy to integral-field spectroscopy from various SDSS surveys. Her results are nice, and were beautifully presented, both in the talk and in the thesis. Congratulations Dr Mandyam!

2019-05-01

Galactic archaeology

It's a long story, but we have been experimenting continuously with the rules and principles underlying the weekly Stars and Exoplanets Meeting that we run at Flatiron for the NYC astrophysics community. One of the things I say about it is that if you want a meeting to be open, supportive, easy, and community-building, it has to have a strong set of draconian rules! In our most recent set of discussions, we have been talking about theming the meetings around specific science themes. Today was our first experiment with that! Joss Bland-Hawthorn (Sydney) is in town, so we themed the meeting around Galactic Archaeology. We had five short discussions; here are some highlights:

Megan Bedell (Flatiron) showed her incredibly precise 35-element (?) abundance measurements vs stellar age for her Solar twin sample. The abundances are very closely related to the age (for this sample that is selected to have Solar [Fe/H]). Suroor Gandhi (NYU) showed her results on the dependence of dynamical (or really kinematic) actions on [Fe/H] and age for low-alpha and high-alpha stars in the local Milky Way disk. These show that the two different sequences (high and low alpha) have different origins. And Rocio Kiman (CUNY) showed her M dwarf kinematics as a function of magnetic activity that could be used to constrain a disk heating model. All three of these presentations could benefit (for interpretation) from a forward model of star formation and radial migration in the Milky Way disk, along with heating! This is related to things I have done with Neige Frankel (MPIA) but would require extensions. Simple extensions, though.

Adam Wheeler (Columbia) showed us abundances he has measured all over the Milky Way from LAMOST spectroscopy, training a version of The Cannon with GALAH abundances. It's an amazing data set, and he asked us to brainstorm ideas about what we could do with it. He seems to have features in his catalog that look similar to the midplane issues that were causing me existential angst this past August. Bland-Hawthorn said that he sees similar things in the GALAH data too.

And Bland-Hawthorn himself talked about the possibility that some future instrument could measure stellar accelerations and get the Milky Way acceleration field directly! He started by commenting on the conclusions of the Bonaca et al work on a possible dark-matter perturber acting on the GD-1 stellar stream. His remarks played very well with things Bonaca and I have been discussing around making a non-parametric acceleration map of the Milky Way.

In summary: A great experiment!