2022-04-01

distributions of dimensionless quantities

In finishing up our paper on dimensional analysis for machine learning, Soledad Villar and I have been discussing how to talk about out-of-distribution generalization of machine-learning methods. The space of dimensionless quantities is lower-dimensional than the space of the original inputs, but I couldn't figure out how to argue that it is easier to match the test data to the training data in the dimensionless quantities than in the original, dimensional inputs. Villar pointed out that one way to see it is that many different distributions in the dimensional quantities map to the same distribution in the dimensionless quantities. For example, if you multiply all the masses by five, you haven't changed the distribution of the mass ratios, even though your mass distributions will no longer overlap. That's a good argument, and it's the one we ended up making in the paper.
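Here is a minimal numerical sketch of Villar's point, using numpy and scipy; the lognormal distribution, the factor of five, and the sample size are just illustrative choices, not anything from the paper:

    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(17)
    n = 10_000

    # Training masses drawn in pairs from some positive distribution;
    # the test masses follow the same distribution with every mass
    # multiplied by five (a pure change of scale, or of units).
    m_train = rng.lognormal(mean=0.0, sigma=1.0, size=(n, 2))
    m_test = 5.0 * rng.lognormal(mean=0.0, sigma=1.0, size=(n, 2))

    # In the dimensional inputs, the train and test distributions are
    # badly mismatched: the two-sample KS test strongly rejects.
    print(ks_2samp(m_train[:, 0], m_test[:, 0]))
    # -> large KS statistic, tiny p-value

    # In the dimensionless mass ratios, the factor of five cancels, so
    # the train and test distributions are identical and KS can't tell
    # them apart.
    print(ks_2samp(m_train[:, 0] / m_train[:, 1],
                   m_test[:, 0] / m_test[:, 1]))
    # -> small KS statistic, large p-value

The scale factor drops out of every ratio, so a whole family of dimensional distributions (one for each choice of scale) collapses onto a single dimensionless distribution.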
