2013-06-07

SMMQSO

At the end of the day, Krikamol Muandet (MPI-IS) and I briefly discussed his work on support measure machines, which are a generalization of support vector machines. He has generalized the SVM (which is a supervised classifier for data points) to take as input probability distribution functions rather than points. This is an impressive accomplishment, and the resulting machinery has a beautiful internal structure, in which distances between data points look like chi-squared differences between probability clouds. But he has also applied this machinery to the training and test sets used in XDQSO, to compete with Bovy et al. What he finds is that SMM beats SVM (because it uses the error information) but that SMM loses to XD (because it doesn't know as much about the causal, generative processes as XD does). We discussed a bit the general question of how domain knowledge ought to be included in machine-learning algorithms, and I started to think that maybe this might be our superpower here at CampHogg. More on that after a few weeks of Kepler goodness at SAMSI in NC.
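
To make the "SVM on probability clouds" idea concrete, here is a minimal sketch (not Muandet's code, and not the XDQSO data): for a Gaussian RBF kernel, the expected kernel between two Gaussian measures has a closed form, so you can treat each noisy measurement as a Gaussian, precompute the kernel matrix, and hand it to an off-the-shelf SVM solver. The function names, kernel width, and toy data below are made up for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def expected_rbf_kernel(m1, s1, m2, s2, sigma2=1.0):
    # Closed-form expectation of exp(-|x - y|^2 / (2 sigma2)) for
    # x ~ N(m1, diag(s1)) and y ~ N(m2, diag(s2)); s1, s2 are the
    # per-dimension measurement variances (the "error information").
    var = s1 + s2 + sigma2            # variances of (x - y), plus kernel width
    diff = m1 - m2
    return np.prod(np.sqrt(sigma2 / var)) * np.exp(-0.5 * np.sum(diff ** 2 / var))

def gram(means_a, vars_a, means_b, vars_b, sigma2=1.0):
    # Kernel matrix between two sets of Gaussian "data points".
    return np.array([[expected_rbf_kernel(ma, va, mb, vb, sigma2)
                      for mb, vb in zip(means_b, vars_b)]
                     for ma, va in zip(means_a, vars_a)])

# Toy example: two classes of noisy 2-d measurements, each datum a Gaussian.
rng = np.random.default_rng(0)
means = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
vars_ = rng.uniform(0.1, 0.5, means.shape)    # per-point measurement variances
labels = np.array([0] * 50 + [1] * 50)

K = gram(means, vars_, means, vars_)
clf = SVC(kernel="precomputed").fit(K, labels)
print("training accuracy:", clf.score(K, labels))
```

When every variance is sent to zero this kernel reduces to the ordinary RBF kernel on the means, which is one way to see why the SMM should beat a plain SVM whenever the per-point errors actually carry information.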
