2022-08-29

applying the SVD to nontrivial objects

Singular value decomposition (SVD) is a method for finding the equivalent of eigenvalues and eigenvectors for a rectangular matrix. It is what we use when we want to know the rank of a rectangular matrix, or make a low-rank matrix factorization (indeed, it is precisely what is used in principal components analysis or PCA).

The cool thing is: The method is exceedingly general; it can find the rank of, or a low-rank approximation to, any space of objects. The objects don't have to be column vectors exactly; they just have to obey the rules of linear algebra (addition and scalar multiplication), so they can be flattened into vectors. So in my work with Soledad Villar (JHU) we use it to find a basis for all the linearly independent geometric images (grids of tensors) that are possible subject to constraints (like symmetries). I wrote words about using the SVD in this context in our nascent paper. Here is some example output of my SVDs in eye-candy form:
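
The deduplication step itself is just numerical linear algebra. Here is a minimal sketch (toy stand-in arrays, not our actual code) of using numpy's SVD to get the rank of, and an orthonormal basis for, the span of a pile of flattened geometric images:

```python
import numpy as np

# Toy stand-in: 50 candidate geometric images, each flattened to a length-48
# vector, constructed so that only 20 of them are linearly independent.
rng = np.random.default_rng(17)
candidates = rng.normal(size=(50, 20)) @ rng.normal(size=(20, 48))

# SVD of the stack of flattened candidates.
U, s, Vt = np.linalg.svd(candidates, full_matrices=False)

# Rank = number of singular values above a tolerance set by machine precision.
tol = s.max() * max(candidates.shape) * np.finfo(float).eps
rank = int(np.sum(s > tol))

# The first `rank` rows of Vt form an orthonormal basis for the span.
basis = Vt[:rank]
print(rank, basis.shape)   # 20, (20, 48)
```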

2022-08-25

introduction for our geometric-convolution paper

My loyal reader knows that Soledad Villar (JHU) and I are working on a replacement for convolutional neural networks that preserves convolutional structure but enforces important physical symmetries (most importantly coordinate freedom). Today I deleted the introduction to our paper and rewrote it from scratch.

When we last wrote the introduction, we thought we were writing code for cosmology. Now we think we are doing something way more general. In writing this, I realized that we need a figure that shows some kind of gallery or set of examples of the kinds of data we are talking about, which are images or grids of scalars, vectors, and tensors.

2022-08-22

getting ready for the job market

I had a great conversation with Kate Storey-Fisher (NYU) today about her preparations for the academic job market. We talked about places, applications, proposals, and so on. In particular, we spent time talking about the structure of a good job proposal, which I think should span many scales, from the very big picture down to very specific ideas for particular shovel-ready projects. We also talked about what makes Storey-Fisher unique on the market and how to talk about that uniqueness in the application. An odd thing about applications is that you have to narrate your work (which is usually a set of somewhat random and contingent projects) as if it were a coherent scientific program. This is odd, but not unreasonable, since the ability to narrate it well shows an ability to make connections and see themes.

2022-08-19

what to say for Jim Peebles

I have been honored by an invitation to speak at a meeting in honor of Jim Peebles (Princeton) and his 2019 Nobel Prize in Physics. I spent the day working on things I want to say at this event. Obviously I have to be very critical of the Nobel Prize and all prizes! Haha. But I want to talk about large-scale structure and also the problem that we only ever get to observe one Universe. What does that mean for inductive reasoning and epistemology?

2022-08-18

continuum normalization of spectra

How do you continuum-normalize a spectrum in a low-resolution spectrograph? You can't really, unless you have exceedingly good models for stars (which you could use to fit the normalization); but of course if you had extremely good models for stars you probably wouldn't have to normalize!

I had an interesting conversation about this with Alex Dimoff (MPIA). He is continuum-normalizing spectra from a high-resolution echelle spectrograph. He has a good measurement of the “blaze function,” so I suggested that he just add some polynomial or sine-and-cosine adjustments to that in each order. My main advice about continuum normalization is to avoid methods that are very sensitive to signal-to-noise: As you degrade the signal-to-noise, your continuum estimate should not become systematically biased low. Most methods in play right now have this problem, and badly. Think: Fit to pixels that are “consistent” with being continuum pixels. That's going to depend very strongly on signal-to-noise.
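
For the echelle case, here is a minimal sketch of what I have in mind (hypothetical array names; the blaze function is assumed to be measured per order): fit a multiplicative low-order polynomial correction to the blaze by weighted linear least squares. Because it fits all pixels rather than a signal-to-noise-dependent subset of “continuum” pixels, it shouldn't bias low as the data get noisier; absorption lines would need an iteration or a fixed line mask.

```python
import numpy as np

def continuum_model(wave, flux, ivar, blaze, degree=3):
    """Fit flux ~ blaze * (low-order polynomial in wavelength) for one order.

    wave, flux, ivar, blaze : 1-D arrays for a single echelle order.
    Returns the continuum model evaluated at wave.
    """
    # Scale wavelengths to [-1, 1] so the polynomial fit is well conditioned.
    x = 2.0 * (wave - wave.min()) / (wave.max() - wave.min()) - 1.0
    # Design matrix: blaze times polynomial terms (sines and cosines would
    # work the same way; just swap the columns).
    A = blaze[:, None] * np.vander(x, degree + 1)
    # Weighted linear least squares (weights = square roots of inverse variances).
    w = np.sqrt(ivar)
    coeffs, *_ = np.linalg.lstsq(A * w[:, None], flux * w, rcond=None)
    return A @ coeffs
```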

2022-08-17

yet another version of The Cannon?

I started to write an implementation of The Cannon today. Probably a mistake! But I don't love any of the implementations out there, even my own. I was inspired because I want to train a model of the ESA Gaia DR3 RVS spectra using SDSS-IV APOGEE labels. I want to write the implementation well, so it's robust, easy to maintain, simple, and consistent with astropy. Is this a mistake?
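
For scale, here is a toy sketch of the training step of a Cannon-style model, with the flux at each pixel fit as a quadratic function of the labels by per-pixel weighted least squares. This is not the planned implementation (it ignores the per-pixel intrinsic scatter, among other things), and the array names are hypothetical.

```python
import numpy as np

def quadratic_design_matrix(labels):
    """Vectorizer: [1, l_i, l_i * l_j] built from each star's label vector."""
    n, k = labels.shape
    quad = np.einsum('ni,nj->nij', labels, labels)
    iu = np.triu_indices(k)
    return np.hstack([np.ones((n, 1)), labels, quad[:, iu[0], iu[1]]])

def train(labels, fluxes, ivars):
    """Per-pixel weighted least squares (intrinsic scatter omitted for brevity).

    labels : (n_star, n_label); fluxes, ivars : (n_star, n_pix).
    Returns the (n_pix, n_coeff) array of spectral-model coefficients.
    """
    A = quadratic_design_matrix(labels)
    n_pix = fluxes.shape[1]
    coeffs = np.empty((n_pix, A.shape[1]))
    for j in range(n_pix):   # one small linear solve per wavelength pixel
        w = np.sqrt(ivars[:, j])
        coeffs[j], *_ = np.linalg.lstsq(A * w[:, None], fluxes[:, j] * w,
                                        rcond=None)
    return coeffs
```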

2022-08-16

metallicity and spiral arms

In Milky Way Group Meeting at MPIA today we discussed this paper on spiral structure as observed by Gaia. The paper shows that the spiral arms appear not just in the density of young stars but also in their metallicities (at very low amplitude). What does this mean? I think maybe it's just the response of a smoother disk population to a perturbation: If you have a smooth disk with a metallicity gradient and you perturb it, the perturbation winds up into a spiral, and that spiral appears as a low-amplitude abundance feature, because the spiral involves synchronizing the radial oscillations of stars at different guiding radii (and hence, given the gradient, different abundances). This should be easy to work out quantitatively. Maybe I should do that? Reminds me of what I have been working on with Neige Frankel (CITA), but in the vertical (rather than azimuthal) dynamics.
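
The zeroth-order version of the argument is one line. If the mean metallicity of the unperturbed disk declines with guiding radius R_g, and the spiral corresponds to a coherent radial displacement ΔR(R, φ) of stars away from their guiding radii, then (ignoring the detailed phase-space structure, so treat this as a sketch)

```latex
\delta Z(R,\varphi) \;\approx\; -\,\frac{\mathrm{d}\bar{Z}}{\mathrm{d}R_g}\,\Delta R(R,\varphi) ,
```

so the abundance spiral just traces the radial-displacement spiral, with an amplitude set by the gradient.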

2022-08-15

how many equivariant linear functions are there?

O. M. G. As my loyal reader knows, Soledad Villar (JHU) and I are trying to build a replacement for convolutional neural networks that can handle geometric objects (scalars, vectors, pseudovectors, tensors of any order, and so on) and that can create functions that exactly (or approximately if you like) obey the symmetries of classical physics (rotation, translation, parity, boost, maybe even gauge). Our method produces polynomial functions of images (functions where both the input and the output are images), making use of convolutions, outer products, index contractions, index permutations, cross-products, pooling, and so on.

Meanwhile, Ben Blum-Smith (NYU, JHU) has been doing (scary to me) group-theory work in which he computes the number of unique polynomial functions of images of fixed polynomial degree, given image inputs and outputs (of some tensor orders), when those functions obey the symmetries of classical physics. And he has results! He can tell us, for example, how many unique linear or quadratic functions there are that take vector images to vector images. It's a formula that depends on the image size and the degree of the polynomial.

Today Villar and I had a breakthrough: We used our geometric generalization of convolutions to produce all possible linear functions of small images, deduplicated the results using a singular value decomposition, and thereby counted how many linearly independent group-equivariant linear functions there are from vector images to vector images. And our results agree with Blum-Smith's formula. So we may actually have a complete basis for all image functions that could ever exist in the context of classical physics?
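
The counting step is simple once every candidate linear function is written as a matrix acting on flattened images. A minimal sketch, with random stand-in matrices (our real candidates come from the geometric convolutions):

```python
import numpy as np

# Stand-ins: 200 candidate linear maps from flattened vector images (length d)
# to flattened vector images, built here as random mixtures of 13 underlying
# operators so the stack is deliberately rank-deficient.
rng = np.random.default_rng(8)
d = 32
generators = rng.normal(size=(13, d, d))
mixing = rng.normal(size=(200, 13))
candidates = np.einsum('cg,gij->cij', mixing, generators)

# Flatten each d x d operator into a row vector and SVD the stack.
stack = candidates.reshape(len(candidates), -1)
s = np.linalg.svd(stack, compute_uv=False)
tol = s.max() * max(stack.shape) * np.finfo(float).eps
print("independent linear maps:", int(np.sum(s > tol)))   # 13 for this toy
```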

2022-08-14

spherical-harmonic transforms of point sets

On the weekend I computed the spherical-harmonic transform of Kate Storey-Fisher's quasar sample made from ESA Gaia data. I also computed the spherical-harmonic transform of the random catalog we use to map the selection function. The two transforms are extremely similar in their complex amplitudes! Since the random catalog is made assuming perfect homogeneity and isotropy, this similarity directly translates into a measurement of the isotropy of the Universe.
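
Here is a minimal sketch of the computation using healpy (the catalog variables are hypothetical, and the nside and lmax are arbitrary choices):

```python
import numpy as np
import healpy as hp

nside = 64
npix = hp.nside2npix(nside)

def overdensity_alm(ra_deg, dec_deg, lmax=16):
    """Bin points into a healpix map and take its spherical-harmonic transform."""
    pix = hp.ang2pix(nside, ra_deg, dec_deg, lonlat=True)
    counts = np.bincount(pix, minlength=npix).astype(float)
    delta = counts / counts.mean() - 1.0      # fractional overdensity map
    return hp.map2alm(delta, lmax=lmax)

# Compare the quasar catalog to the random (selection-function) catalog:
# alm_qso = overdensity_alm(qso_ra, qso_dec)
# alm_rand = overdensity_alm(rand_ra, rand_dec)
```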

2022-08-12

is the time just a housekeeping datum?

I had a lunch conversation with Melissa Hobson (MPIA) about finding Earth-like planets in long-term radial-velocity surveys. We discussed instrument calibration, and how one interpolates the calibration data from the arcs or LFCs onto the science exposures. I think we should be doing this not in time (or not only in time) but in other housekeeping quantities like instrument temperature state. That is, the most relevant calibration exposure might not be the closest in time, it might be the closest in instrument temperature. From my perspective, the time is just another piece of housekeeping data, and its value for calibration is to be determined empirically.
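
As a toy version of what I mean (hypothetical arrays; a real pipeline would probably do something smoother than nearest-neighbor), you can select calibration exposures by distance in a standardized housekeeping space rather than by time alone, with time as just one more column:

```python
import numpy as np

def nearest_calibration(science_hk, calib_hk, calib_solutions):
    """Pick, for each science exposure, the calibration exposure closest in
    housekeeping space (columns might be time, detector temperature, ...).

    science_hk : (n_sci, n_feature) housekeeping data for science exposures
    calib_hk   : (n_cal, n_feature) housekeeping data for calibration exposures
    calib_solutions : (n_cal, ...) wavelength solutions (or similar)
    """
    # Standardize each feature so time doesn't dominate just because of units.
    mu, sigma = calib_hk.mean(axis=0), calib_hk.std(axis=0)
    sci = (science_hk - mu) / sigma
    cal = (calib_hk - mu) / sigma
    dists = np.linalg.norm(sci[:, None, :] - cal[None, :, :], axis=-1)
    return calib_solutions[np.argmin(dists, axis=1)]
```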

2022-08-11

jackknife practice and intuitions

Last week I gave a colloquium at MPIA in which I advocated the use of jackknife and bootstrap resampling to obtain empirical uncertainty estimates in a complex data analysis. Today I actually implemented a jackknife in my project on cosmic homogeneity (and isotropy). I jackknifed by sky position: I split the sky into 12 nearly equal regions for 12-fold leave-one-out. I have intuitions about when it is a good idea to jackknife on a structured quantity (like sky position) and when it is better to jackknife on a completely random quantity, but I don't know exactly where those intuitions come from. In general, jackknifing on different things must answer different questions about your noise.
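
For the record, the delete-one-region jackknife amounts to a few lines; here is a minimal sketch with a hypothetical `statistic` function and a catalog that supports boolean masking:

```python
import numpy as np

def jackknife(data, region_labels, statistic):
    """Delete-one-region jackknife error estimate.

    data : the full catalog (anything `statistic` accepts after masking)
    region_labels : integer region assignment (0..K-1) for each object
    statistic : function mapping a subsample to a scalar estimate
    """
    regions = np.unique(region_labels)
    K = len(regions)
    estimates = np.array([statistic(data[region_labels != r]) for r in regions])
    mean = estimates.mean()
    # Standard delete-one-group jackknife variance.
    var = (K - 1) / K * np.sum((estimates - mean) ** 2)
    return mean, np.sqrt(var)
```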

2022-08-10

direct planet spectroscopy

I had a great conversation over lunch today with Lorenzo Pino (Arcetri), who is measuring direct spectra of exoplanets in systems with large, hot planets. He makes a data-driven model of the stellar spectrum (and its variations) across a time-domain spectroscopic campaign, and then stacks the residuals in the (computed) rest frame of the planet to get the planet spectrum. We discussed the next-order correction to this method, which approximates simultaneous fitting of the planet and the star.
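
The stacking step, at least, is simple. Here is a minimal sketch (hypothetical arrays; it assumes the stellar model has already been divided out, uses linear interpolation onto a common log-wavelength grid, and sign conventions should be checked before trusting it):

```python
import numpy as np

def stack_in_planet_frame(log_wave, residuals, planet_rv, c=299792.458):
    """Shift each residual spectrum into the planet rest frame and average.

    log_wave  : (n_pix,) common natural-log wavelength grid
    residuals : (n_epoch, n_pix) data divided by the stellar model, minus 1
    planet_rv : (n_epoch,) planet radial velocity in km/s at each epoch
    """
    shifted = np.empty_like(residuals)
    for i, rv in enumerate(planet_rv):
        # A Doppler shift is a constant offset in log wavelength
        # (non-relativistic approximation).
        shifted[i] = np.interp(log_wave, log_wave - np.log1p(rv / c),
                               residuals[i])
    return shifted.mean(axis=0)
```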

2022-08-09

Hekker group visit

Today I visited the group of Saskia Hekker (HITS). We discussed many things asteroseismological! We discussed:

  • the ESA Plato observing strategy
  • is the asteroseismic signal a Gaussian process to any degree of accuracy?
  • using asteroseismic information to improve and inform open-cluster membership
  • synchronization of orbital periods with primary-star rotation periods
  • are two distributions different?
and much, much more. I had a lovely day at HITS.

2022-08-08

do the stars make up a coordinate system?

Long, long ago, when I worked with Sam Roweis (deceased) and Dustin Lang (Perimeter) on locating images on the sky, we used to discuss coordinate systems: You don't actually need a long-lat or theta-phi coordinate system to describe the locations of things on the sky, right? You can just use angular relationships among sources to locate everything precisely and unambiguously! And with that approach, you don't need to make as many choices and standards and lines of code about reference frames. But, alas, this point of view is not in the ascendant.

Undeterred, I put the bright stars on my maps (from this weekend) of Kate Storey-Fisher's ESA Gaia quasar sample. Can you find the Big Dipper and Orion? And Sirius?

2022-08-07

Lambert's projection

At the Heidelberg Tiergarten Schwimmbad I worked out the mathematics for an equal-area projection of the sphere, centered on the poles. It turns out that I reinvented Lambert's azimuthal equal-area projection from the 1770s. Here's a plot of the Gaia DR3 quasar sample (censored by some dust cuts) in my new projection:
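
For the record, the pole-centered mapping is tiny. A minimal sketch, assuming declination in degrees and projecting about the north celestial pole:

```python
import numpy as np

def lambert_equal_area(ra_deg, dec_deg):
    """Lambert azimuthal equal-area projection about the north pole.

    The radius r = 2 sin(theta / 2), with theta the angle from the pole,
    makes the enclosed flat area pi * r**2 equal to the spherical-cap area
    2 * pi * (1 - cos(theta)), so the projection is exactly equal-area.
    """
    theta = np.radians(90.0 - dec_deg)   # angular distance from the pole
    phi = np.radians(ra_deg)
    r = 2.0 * np.sin(theta / 2.0)
    return r * np.cos(phi), r * np.sin(phi)
```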

2022-08-05

flexible models and interpolation

I gave the Königstuhl Colloquium today, which was fun. I spoke about this paper and related matters. I got great questions. One was about when to fit a very flexible model vs just doing simple interpolation. I gave some kind of minimal answer in my talk but then I thought about it a lot more later. The key difference between fitting a very flexible model and interpolation is that the former can be made part of a bigger probabilistic model whereas the latter (without serious modifications) cannot. That's a big deal when (say) you are trying to find planet transits in the face of stellar and spacecraft variability.
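
One way to see the point: a flexible linear-basis model for the stellar variability can be fit simultaneously with (say) a transit depth in a single weighted least squares, which plain interpolation can't do. Here is a toy sketch (hypothetical arrays; a Fourier basis standing in for "very flexible"; this is not anything from the talk or the paper):

```python
import numpy as np

def joint_fit(t, flux, ivar, transit_template, n_harmonics=10):
    """Fit flux = (flexible variability model) + depth * transit_template.

    The variability model is a linear combination of sines and cosines, so
    the whole thing, including the transit depth, is one weighted linear fit
    and could be embedded in a larger probabilistic model.
    """
    P = t.max() - t.min()
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(2 * np.pi * k * t / P), np.sin(2 * np.pi * k * t / P)]
    cols.append(transit_template)            # last column: the transit shape
    A = np.stack(cols, axis=-1)
    w = np.sqrt(ivar)
    coeffs, *_ = np.linalg.lstsq(A * w[:, None], flux * w, rcond=None)
    depth = coeffs[-1]
    return depth, A @ coeffs
```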

2022-08-04

start a paper on homogeneity

After making (yesterday) all the plots that demonstrate the uniformity and large-scale homogeneity of Kate Storey-Fisher's Gaia quasar catalog (which she is writing up now), I decided (tentatively) to write a paper on cosmic homogeneity with these data: When a catalog shows beautiful homogeneity, that is both a statement about the catalog and a statement about the Universe. I wrote a title and abstract and some figure captions today.

2022-08-03

the fractal dimension of the Universe is 3

Back around 2004 I promised myself I would never compute a fractal dimension ever again! But I did today, using Kate Storey-Fisher's (NYU) new quasar catalog from the ESA Gaia data. And it turns out that it is 3. Good! Actual measurement with uncertainty coming soon.
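
The measurement I have in mind is the usual counts-in-spheres scaling: for a homogeneous point process the mean number of neighbors within radius r scales as r cubed, so the fitted log-log slope is the (correlation) dimension. A toy sketch (hypothetical 3-d positions; selection-function and boundary corrections omitted):

```python
import numpy as np
from scipy.spatial import cKDTree

def correlation_dimension(xyz, radii):
    """Slope of log <N(<r)> versus log r from counts-in-spheres.

    xyz : (n, 3) array of comoving positions; radii : increasing radii.
    A homogeneous (dimension-3) point process gives a slope of 3.
    """
    tree = cKDTree(xyz)
    n = len(xyz)
    # Mean number of neighbors within r, excluding each point itself.
    mean_counts = np.array(
        [tree.count_neighbors(tree, r) / n - 1.0 for r in radii])
    slope, _intercept = np.polyfit(np.log(radii), np.log(mean_counts), 1)
    return slope
```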

2022-08-02

visualizing a tensor field

I am working with Soledad Villar (JHU) and others on making generalizations of convolution operators (and image-based nonlinear functions built from those convolutions) that can deal (correctly) with input data that contain vectors and tensors. That is, tensor convolutions of tensor images. Anyway, one of the problems is: How do you visualize a tensor field, or an image of tensors? I implemented a possible solution, pictured below: You make a small figure that has no symmetry, and you push that figure through the tensor at each grid point, treating the tensor as a linear map. That only works for 2-tensors, of course. 3- and 4-tensors? I'm at a loss.
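
Here is a minimal sketch of the idea for 2-tensors in two dimensions (hypothetical field array; matplotlib only): draw an asymmetric glyph at each grid point, with its vertices mapped through the local 2x2 tensor.

```python
import numpy as np
import matplotlib.pyplot as plt

# An asymmetric glyph (a little flag), as (n_vertex, 2) offsets from the origin.
glyph = np.array([[0.0, 0.0], [0.0, 0.4], [0.25, 0.3], [0.0, 0.2]])

def plot_tensor_field(field, scale=0.8):
    """field : (ny, nx, 2, 2) array of 2-tensors on a grid."""
    ny, nx = field.shape[:2]
    fig, ax = plt.subplots()
    for iy in range(ny):
        for ix in range(nx):
            # Push the glyph's vertices through the local tensor (a linear map).
            verts = scale * glyph @ field[iy, ix].T + np.array([ix, iy])
            ax.add_patch(plt.Polygon(verts, closed=True, alpha=0.7))
    ax.set_xlim(-1, nx)
    ax.set_ylim(-1, ny)
    ax.set_aspect("equal")
    return fig
```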

2022-08-01

new paper scope

Kate Storey-Fisher (NYU), Hans-Walter Rix (MPIA), Christina Eilers (MIT), and I had a great meeting today to discuss KSF's progress on the ESA Gaia quasar sample. We looked at her large-scale-structure results and her jackknife tests and discussed paper scope. Options range from a quasar-catalog paper to a selection-function paper to a full cosmological parameter-estimation paper. Of course we decided to do all three! But importantly, we decided that this week we would focus on writing the quasar-catalog paper. That's good, and achievable.