2020-04-08

more generative model work

Today it was back on the notebooks with Villar (NYU). We made improvements to the generative-model optimization, so that the model now explicitly optimizes a log-likelihood objective. That should be good for its information-theoretic properties, and indeed it seems to perform very well. Among other things, we noticed that the results seem to be extremely sensitive to the dimensionality of the latent space. It may be that the models massively over-fit when the latent space gets too large, which perhaps isn't surprising. We started to develop some conjectures about the model performance in different settings; these could inform our work and writing on the subject.
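For concreteness, here is a minimal sketch (not our notebook code) of the kind of behavior described above: a simple linear-Gaussian latent-variable model (probabilistic PCA) fit by maximizing a marginal log-likelihood, where the held-out log-likelihood degrades once the latent dimension K is made too large. The toy data and all names here are hypothetical.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(17)
D, K_true, N = 20, 3, 60                        # data dim, true latent dim, sample size
W_true = rng.normal(size=(D, K_true))

def draw(n):
    # x = W z + noise, with z ~ N(0, I); purely synthetic toy data
    return rng.normal(size=(n, K_true)) @ W_true.T + 0.5 * rng.normal(size=(n, D))

x_train, x_test = draw(N), draw(N)

def fit_ppca(x, K):
    """Closed-form maximum-likelihood probabilistic PCA (Tipping & Bishop 1999)."""
    mu = x.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(x.T))
    evals, evecs = evals[::-1], evecs[:, ::-1]  # sort eigenvalues descending
    sigma2 = evals[K:].mean()                   # noise variance from discarded directions
    W = evecs[:, :K] * np.sqrt(np.maximum(evals[:K] - sigma2, 0.0))
    cov = W @ W.T + sigma2 * np.eye(D)          # marginal covariance of the data
    return mu, cov

for K in (1, 3, 6, 12, 18):
    mu, cov = fit_ppca(x_train, K)
    ll_train = multivariate_normal.logpdf(x_train, mu, cov).mean()
    ll_test = multivariate_normal.logpdf(x_test, mu, cov).mean()
    print(f"K={K:2d}  train logL {ll_train:8.2f}   held-out logL {ll_test:8.2f}")
```

In this toy setting the training log-likelihood keeps rising with K while the held-out log-likelihood peaks near the true latent dimension and then falls, which is one way the over-fitting sensitivity could show up.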
