After a low-research morning (punctuated by an excellent talk on the post-Higgs LHC by Andy Haas of NYU), I spent my flight to Seattle madly working on my talk for AstroInformatics 2012. I am talking about why the map–reduce (or Hadoop or whatever) frameworks for data analysis will not be sufficient for the future of astrophysics, and why we have to develop new things ourselves. The map–reduce framework is great because it is a general framework for solving problems in log-N time: any associative reduction over the data can be parallelized into a tree of depth log N. But, to my knowledge, it can't do anything like real probabilistic hierarchical models without egregious approximations, because the per-datum latent parameters couple the calculation across the whole data set. I don't have much specific to propose as an alternative except making brute force a lot less brute.
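
To make the contrast concrete, here is a minimal sketch (my own toy example, not from the talk) of the kind of problem map–reduce handles well: a total log-likelihood for independent data under fixed global parameters, which is just an associative sum over per-chunk partial results. The chunking, the Gaussian model, and the parameter values are all illustrative assumptions.

```python
from functools import reduce
import math

# Toy data: independent Gaussian draws, split into "mapper" chunks.
data = [1.2, -0.3, 0.7, 2.1, -1.5, 0.4]
chunks = [data[i:i + 2] for i in range(0, len(data), 2)]

def mapper(chunk, mu=0.0, sigma=1.0):
    # Per-chunk partial log-likelihood under fixed global parameters.
    return sum(-0.5 * ((x - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2.0 * math.pi))
               for x in chunk)

def reducer(a, b):
    # Addition is associative, so this reduce can be arranged as a
    # log-N-depth tree across workers.
    return a + b

total_lnlike = reduce(reducer, map(mapper, chunks))
```

The reduce step only works because each datum's contribution is independent given the global (mu, sigma); in a hierarchical model, each datum carries its own latent parameters that must be marginalized jointly with the hyperparameters, so the map outputs are no longer independent summands, and the framework breaks down without approximation.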