We have plenty of data to work with and systems that can process enormous data sets. But some of the biggest successes - say, Google - rest on algorithms from the pre-computing era, work done with pen, paper and thinking cap. The Rev. Bayes reaches a very wide congregation today. As the Harvard Business Review puts it, "done in the absence of high-speed, low-cost computational capacity, that work put a premium on imaginative quantitative thinking". It's not just one vintage idea, though: Tesco is using simulated annealing, Nokia is using genetic algorithms - I learned both of those in the 1980s when they were only a decade old, if that.
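Simulated annealing itself is a good example of how far thinking-cap work goes on modest hardware: a loop that perturbs a candidate solution and occasionally accepts a worse one, with a falling "temperature" controlling how often. Here is a minimal sketch in Python; the toy cost function, step size and cooling schedule are my own illustrative choices, nothing to do with Tesco's actual use.

```python
import math
import random

def simulated_annealing(cost, neighbour, start, t0=10.0, cooling=0.995, steps=5000):
    """Minimise `cost` starting from `start`, using `neighbour` to
    propose small random perturbations of the current solution."""
    current, current_cost = start, cost(start)
    best, best_cost = current, current_cost
    t = t0
    for _ in range(steps):
        candidate = neighbour(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / t), so escapes from local minima
        # become rarer as the temperature cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling
    return best, best_cost

# Toy example: minimise a bumpy one-dimensional function whose
# global minimum sits well away from the starting point.
random.seed(1)
f = lambda x: x * x + 10 * math.sin(x)
step = lambda x: x + random.uniform(-0.5, 0.5)
x, fx = simulated_annealing(f, step, start=8.0)
```

The whole thing is a dozen lines of logic; the cleverness is in the 1983 insight (Kirkpatrick, Gelatt and Vecchi) that a cooling schedule borrowed from metallurgy lets a cheap random walk find good solutions to hard optimisation problems.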
But I found myself worrying: with all the pressure on academe to work closely with business and pursue more immediate goals, will we have enough blue-sky research going on to give us well-thought-out algorithms to implement on the next generation of even higher-performance computing? Any excuse to slow down and think more has to be good!