Today at Flatiron, Melissa Ness (Flatiron) showed Dan Foreman-Mackey (Flatiron) and me that she can infer stellar effective temperature and surface gravity (but not metallicity!) from a Kepler light curve with a data-driven model (supervised regression). She feature-izes the light curve for the model by taking its autocorrelation function. This is a clever idea, because it removes phase, or time-translation, effects from the data. And in cross-validation she can show that she infers temperatures to something like the precision with which they are measured in the training set! So this is extremely promising, and an interesting extension of things we have seen previously with stellar jitter. I opined that this could only get better with feature engineering; the autocorrelation function on a uniform grid cannot be the best feature set in any sense. But with encouragement from Foreman-Mackey, we decided to move ahead with a project now and leave feature engineering to later work.
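The pipeline described above (autocorrelation features plus supervised regression, validated by cross-validation) can be sketched as follows. This is a minimal illustration, not Ness's actual code: the synthetic light curves, the ridge regressor, and all parameter choices here are my own stand-in assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(42)

def acf_features(flux, max_lag=200):
    """Feature-ize a uniformly sampled light curve via its autocorrelation
    function on a uniform grid of lags; this removes time-translation effects."""
    f = flux - flux.mean()
    acf = np.correlate(f, f, mode="full")[len(f) - 1:]
    acf = acf / acf[0]  # normalize so that ACF at zero lag is 1
    return acf[:max_lag]

# Synthetic stand-in data: light curves are damped oscillations whose
# period loosely tracks a fake "temperature" label. This is purely
# illustrative; real Kepler photometry is far richer.
n_stars, n_cadences, max_lag = 200, 2000, 200
temps = rng.uniform(4500.0, 6500.0, n_stars)
t = np.arange(n_cadences)
X = np.empty((n_stars, max_lag))
for i, T in enumerate(temps):
    period = 10.0 + 0.01 * (T - 4500.0)
    flux = np.sin(2.0 * np.pi * t / period) * np.exp(-t / 1500.0)
    flux += 0.1 * rng.standard_normal(n_cadences)
    X[i] = acf_features(flux, max_lag)

# Cross-validated supervised regression from ACF features to temperature.
pred = cross_val_predict(Ridge(alpha=1.0), X, temps, cv=5)
rms = np.sqrt(np.mean((pred - temps) ** 2))
```

A cross-validated RMS well below the spread of the training labels is the kind of check that would indicate the features carry real label information; a flexible nonlinear regressor would likely do better than the linear model used in this sketch.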