I've spent a lot of time trying to figure out what the "big picture" is for Software Carpentry. What are the best practices every scientist should master? What are the principles of computational thinking? How exactly are we helping people? My latest attempt to put this all in a nutshell has three strands: novelty, efficiency, and trust.
The second of these, efficiency, is what sets us apart from other "computing for scientists" initiatives. On one side, supercomputing and big data have opened up entirely new kinds of science, just as radio telescopes and PCR did. On another, advocates of open access and reproducible research are changing the way science is done, and by doing so, making it easier for people to check and build on each other's work. Our goal is to help scientists take advantage of all of this without superhuman effort—to do with ten minutes of scripting what used to take an hour of mousing around in a spreadsheet.
Taken to an extreme, you could say that our goal is to put ourselves out of business: to reduce the time scientists spend writing software to as close to nothing as possible, so that they can spend more time thinking about their science. I suspect, however, that some form of Parkinson's Law will kick in—that time spent programming will expand to fill the hours available. If every one of those hours is spent productively, we'll know we've done our job well.
Originally posted 2013-01-28 by Greg Wilson in Education, Opinion.