I spent the day at Princeton at the PICSciE Symposium, run by the Research Computing group at Princeton. There were impressive talks about data-driven science all day. The highlight for me was a neuroscience talk by Uri Hasson (Princeton) about simultaneous monitoring of multiple brains in fMRI machines while people listen to coordinated audio, or tell and hear stories. He is able to show, first, that there are reliable correlations in the fMRI-indicated activity in the brain, and those correlations are a function of time in the story. Second, there are reliable correlations between brains that are listening to the same story. Third, there are reliable correlations between the brain of the speaker and the listener! It looks like speech might actually be a way to synchronize, in certain ways, the brain activities of pairs of people. That would be awesome if it holds up, and relates closely to things I read in Wittgenstein many, many years ago.