2022-05-10

Dr Tomer Yavetz

Today Tomer Yavetz (Columbia) defended his PhD, which was in part about the dynamics of stellar streams, and in part about macroscopically quantum-mechanical dark matter. The dissertation was great. The stellar-stream part was about stream morphologies induced by dynamical separatrices in phase space: If the stars on a stream are on orbits that span a separatrix, all heck breaks loose. The part of the thesis on this was very pedagogical and insightful about theoretical dynamics. The dark-matter part was about fast computation of steady-states using orbitals and the WKB approximation. Beautiful physics and math! But my favorite part of the thesis was the introduction, in which Yavetz discusses the point that dynamics—even though we can't see stellar orbits—does have directly observable consequences, like the aforementioned streams and their morphologies (and also Saturn's rings and the gaps in the asteroid belt and the velocity substructure in the Milky Way disk). After the defense we talked about re-framing dynamics around this idea of observability. Congratulations, and it has been a pleasure!

2022-05-09

discretized vector calculus

On Friday, Will Farr (Flatiron) suggested to me that the work I have been doing (with Soledad Villar) on image-convolution operators with good geometric and group-theoretic properties might be related somehow to discretized differential geometry. It is! I tried to read some impenetrable papers, but my main take-away is that I have to understand this field.

2022-05-06

discovering quantum physics, automatically?

I have been working on making machine-learning methods dimensionless (in the sense of units). In this context, a question arises: Is it possible to use machine learning to discover that a dimensional input to a physics problem is missing? Soledad Villar (JHU) and I ignored some of our required work today and wrote some code to explore this question, using as a toy problem the Planck-law example we explained in this paper. We found that maybe you can discover a missing dimensional constant? We have lots more to do to decide what we really have.
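The dimensional bookkeeping behind this question can be sketched with the Buckingham-pi machinery (my toy illustration, not our actual code): the null space of the dimension matrix spans the dimensionless groups, and deleting a constant like h changes how many groups there are.

```python
import numpy as np
from scipy.linalg import null_space

# Columns are the variables in the Planck law B_nu(nu, T); rows are the
# base dimensions (mass, length, time, temperature); entries are exponents.
#             nu   T  k_B   h    c  B_nu
D = np.array([[0,  0,  1,   1,   0,   1],   # mass
              [0,  0,  2,   2,   1,   0],   # length
              [-1, 0, -2,  -1,  -1,  -2],   # time
              [0,  1, -1,   0,   0,   0]])  # temperature

# Buckingham pi: the null space of D spans the dimensionless groups.
# Here there are 2, spanned by h nu / (k_B T) and B_nu c^2 / (h nu^3).
groups = null_space(D)

# Now pretend we never heard of h (delete its column): only 1 group
# survives, and blackbody data can no longer collapse onto a curve --
# which is the kind of signature a missing constant should leave.
groups_no_h = null_space(np.delete(D, 3, axis=1))
```

The shapes of the two null spaces (2 groups with h, 1 without) are the whole point of the toy.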

2022-05-05

making a mock Gaia quasar sample

I had conversations today with both Hans-Walter Rix (MPIA) and Kate Storey-Fisher (NYU) about the upcoming ESA Gaia quasar sample. We are trying to make somewhat realistic mocks to forecast the size of the sample, the computational cost of the things we want to do, the expected signal-to-noise of various cosmological signals, and the expected amplitude and spatial structure of the Gaia selection function. We have strategies that involve making clean samples with a lognormal mock, and making realistic samples (but with no clustering) using the Gaia EDR3 photometric sample (matched to NASA WISE).
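The lognormal-mock recipe, in its most minimal form (a 2-d toy with made-up numbers, not our actual pipeline): generate a Gaussian random field with a chosen power spectrum, exponentiate it so the density is positive, and Poisson-sample objects from it.

```python
import numpy as np

rng = np.random.default_rng(17)
n, boxsize = 128, 1000.0  # grid cells per side, box size (illustrative)

# Gaussian random field with a power-law power spectrum P(k) ~ k^-1.5
k = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
kx, ky = np.meshgrid(k, k, indexing="ij")
kk = np.sqrt(kx ** 2 + ky ** 2)
Pk = np.zeros_like(kk)
Pk[kk > 0] = kk[kk > 0] ** -1.5
delta = np.fft.ifft2(np.fft.fft2(rng.normal(size=(n, n))) * np.sqrt(Pk)).real
delta *= 0.5 / delta.std()  # set the field variance (illustrative amplitude)

# lognormal transform: density is positive by construction, mean ~ 1
density = np.exp(delta - 0.5 * delta.var())

# Poisson-sample "quasars" from the density field
nbar = 0.05  # mean objects per cell (illustrative)
counts = rng.poisson(nbar * density)
```

The clustered point set lives in `counts`; the no-clustering realistic samples would skip the field entirely and draw from the Gaia photometric catalog instead.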

2022-05-04

making Fourier fitting super fast

At the request of Conor Sayres (UW), I have been looking at distortion patterns in the SDSS-V Focal Viewing Camera (FVC), which is the part of the system that checks whether the focal-plane fiber robots are where they need to be. The distortions are bad: They are large in amplitude and they vary on very small scales across the focal plane. So I have to fit an extremely flexible model. Here are my comments:

First, you should use mixtures of sines and cosines for problems like this. Not polynomials! Why? Because sines and cosines do not blow up at the edges.
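The edge point can be seen in two lines (a toy demonstration, not the FVC code): Fourier basis functions are bounded by unity everywhere, while monomials are already exploding just past the edge of the fitted region.

```python
import numpy as np

x = np.linspace(-1.2, 1.2, 241)  # evaluation grid a bit past the data edge

# monomial ("polynomial") basis up to degree 30 vs a Fourier basis
V = np.vander(x, 31)  # columns are x^k
ks = np.arange(1, 16)
F = np.hstack([np.cos(np.outer(x, np.pi * ks)),
               np.sin(np.outer(x, np.pi * ks))])  # bounded by 1, always

# np.abs(F).max() is 1.0; np.abs(V).max() is 1.2**30, a few hundred
```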

Second, you should punk fast Fourier transform (FFT) codes to speed up your regressions. I wrote code to do this, which wraps the finufft best-in-class non-uniform FFT code in scipy.sparse linear-algebra code. This wrapping makes the FFT operators into linear-algebra operators and permits me to do solve() operations. That move (wrapping FFT in linear algebra) sped up my code by factors of many!
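The wrapping move looks roughly like this (a 1-d toy sketch with a dense matrix standing in for the transform; in the real code the matvec and rmatvec would call finufft instead of storing any matrix, which is where the speed comes from):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(42)
M, K = 500, 16                    # data points, highest harmonic (illustrative)
x = rng.uniform(0, 2 * np.pi, M)  # non-uniform sample positions (1-d toy)

# Dense stand-in for the non-uniform FFT: finufft's type-2 transform would
# play the role of the matvec and type-1 the role of the rmatvec.
ks = np.arange(1, K + 1)
A = np.concatenate([np.ones((M, 1)),
                    np.cos(np.outer(x, ks)),
                    np.sin(np.outer(x, ks))], axis=1)  # (M, 2K+1)

op = LinearOperator(shape=A.shape,
                    matvec=lambda c: A @ c,     # coefficients -> predictions
                    rmatvec=lambda y: A.T @ y)  # adjoint

# synthetic "distortion" data, then a least-squares solve through the operator
y = 0.3 * np.cos(3 * x) - 0.1 * np.sin(7 * x) + 0.01 * rng.normal(size=M)
c_fit = lsqr(op, y, atol=1e-10, btol=1e-10)[0]
# c_fit[3] recovers ~0.3 (the cos-3 term); c_fit[23] recovers ~-0.1 (sin-7)
```

Because `LinearOperator` only needs a matvec and an rmatvec, any fast transform can be dropped in without ever building the design matrix.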

2022-05-03

coordinate freedom vs equivariance, again

With Soledad Villar (JHU) and others I have been discussing making generalizations (or restrictions?) of image convolution operators to make machine learning respect more symmetries. One kind of generalization is going to 3-d images, and another is making the weights in the convolution filters geometric objects, like vectors, pseudovectors, and tensors. Then we developed a group-averaging technique to make these geometric filters equivariant. And now we are considering products and contractions of these geometric objects to make universally approximating function spaces. I don't love the word “equivariant” here: In my view the symmetries are coordinate freedoms, not relations between inputs and outputs. But the machine-learning world has spoken.
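For plain scalar filter weights, the group-averaging move reduces to a few lines (my toy illustration for the symmetry group of the square pixel grid, not our actual code; the geometric-weights case also has to transform the vector and tensor weights themselves):

```python
import numpy as np

def d4_orbit(filt):
    """All 8 images of a 2-d filter under 90-degree rotations and flips."""
    mats, f = [], filt
    for _ in range(4):
        f = np.rot90(f)
        mats.append(f)            # the four rotations (incl. identity)
        mats.append(np.flipud(f)) # the four reflections
    return mats

def group_average(filt):
    """Average the filter over its orbit -> a group-invariant filter."""
    return np.mean(d4_orbit(filt), axis=0)

avg = group_average(np.random.default_rng(0).normal(size=(3, 3)))

# the averaged filter is unchanged by any group element
assert np.allclose(np.rot90(avg), avg)
assert np.allclose(np.flipud(avg), avg)
```

Averaging over the orbit works because any group element just permutes the orbit, so the mean is left fixed.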

2022-05-02

accretion and mathematical physics

In the CCPP brown-bag today, Andrei Gruzinov (NYU) went through the full mathematical-physics argument of Bondi (from the 1950s) that leads to the Bondi formula for accretion from a stationary, thermal gas onto a point mass. He also talked about a generalization of the Bondi argument that he developed this year (permitting the gas to be moving relative to the point mass) and also a bevy of reasons, both theoretical and observational, that the Bondi solution never actually applies in practice! Haha, but beautiful stuff.
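For my own reference (standard textbook notation, not necessarily Andrei's): for a point mass M sitting in gas of ambient density rho_infinity and sound speed c_infinity, the Bondi accretion rate is

```latex
\dot{M}_{\mathrm{Bondi}} = 4\pi\,\lambda\,\frac{(G M)^2\,\rho_\infty}{c_\infty^3}
```

where lambda is an order-unity number set by the adiabatic index of the gas (for example, lambda = 1/4 for gamma = 5/3).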