A few weeks ago, John Cook posted the following:
In a review of linear programming solvers from 1987 to 2002, Bob Bixby says that solvers benefited as much from algorithm improvements as from Moore's law.
Three orders of magnitude in machine speed and three orders of magnitude in algorithmic speed add up to six orders of magnitude in solving power. A model that might have taken a year to solve 10 years ago can now solve in less than 30 seconds.
A million-fold speedup is pretty impressive, but faster hardware and better algorithms are only two sides of the triangle. The third is development time, and while I think it has improved since 1987, I also think that speedup is measured in single-digit multiples, not orders of magnitude.
Which brings us, again, to Amdahl's Law and the purpose of Software Carpentry. The time needed to produce a new computational result is D+R, where D is how long it takes to get the code to work and R is how long it takes that code to run. R depends on hardware and algorithms; as it goes to zero, the time required to get a new result is dominated by the time required to write, test, maintain, install, and configure software. Reducing that is the "efficiency" part of our long-term aim to improve novelty, efficiency, and trust.
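The argument above can be sketched numerically. This is a minimal illustration, not anything from the original post, and the numbers (a month of development, ten days of runtime) are hypothetical:

```python
def total_time(D, R, speedup=1.0):
    """Time to a new result when only the running time R is sped up.

    D: development time (fixed, unaffected by hardware/algorithms)
    R: running time (divided by the speedup factor)
    """
    return D + R / speedup

D, R = 30.0, 10.0  # hypothetical: 30 days of development, 10 days of runtime

baseline = total_time(D, R)               # 40.0 days with no speedup
sped_up = total_time(D, R, 1e6)           # a million-fold speedup in R

# Even an infinite speedup in R cannot push the total below D,
# so the overall speedup is capped at (D + R) / D:
cap = baseline / D

print(baseline, sped_up, round(cap, 2))
```

As Amdahl's Law predicts, once R is negligible the overall speedup saturates at roughly 1.33x here, no matter how fast the hardware or algorithms get.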
In practice, R doesn't go to zero for many interesting scientific applications, because scientists scale up their problems to keep running times constant. (As a colleague of mine once said, every simulation takes roughly one publication cycle to run.)
Originally posted 2013-02-03 by Greg Wilson in Opinion.