Coffee: Algorithms still outpace silicon as driver of computing progress.
It's easy to get blasé about the power of hardware improvement to make slow processes faster. Why optimize code for any one task when the hardware designers will do the job for all of us?
Once in a while, though, it's worth looking up from our microprocessor road maps to realize that advancements in algorithms--the recipes of low-level computation--are still making major contributions to the state of the art.
Several things have recently reminded me of the importance of these models of problem solving. As July began, AMD advised me of the 1.0 release of the Core Math Library for the AMD64 platform, co-developed with the Numerical Algorithms Group Ltd.
With the release date for AMD's desktop-oriented Athlon 64 now announced as Sept. 23, it's certainly not too soon for developers to be given new tools for crafting code that can make the most of that power. Without those routines, we might be sadly disappointed in the real-world benefits of all that hot new silicon--and I mean "hot" literally, sad to say.
I also noted the astonishing speed improvements in the 5.0 release of Wolfram Research Inc.'s Mathematica. I had previously observed, on the occasion of the 4.2 release, that speed improvements due to better algorithms in successive versions of Mathematica have sometimes outpaced Moore's Law rates of improvement by a factor of four or more. The latest version, remarkably, continues that trend.
What forced this subject to the top of my stack, though, was an article by Rachel Chalmers that led me to the archive of the late computer scientist Edsger Dijkstra--the man best known for the terse sentiment "GOTO Considered Harmful," a 35-year-old observation that arguably inaugurated the discipline of structured programming and made "considered harmful" an inside joke among well-read hackers.
I was struck by Dijkstra's indictment of the teaching of programming languages, with or without GOTO statements, as giving students more exposure to the set of problems than to the set of solutions. By defining learning in terms of notations, instead of teaching ways of thinking about problems, this practice perpetuates the viewpoints in force at the time our established languages were designed.
There's so much to learn about algorithms, independent of the language in which they're written, that it's easy to wonder if hardware improvement is wildly overemphasized as the vehicle for continued performance growth--especially when writing code, unlike making chips, is an industry segment that requires little capital and therefore lends itself to rapid migration across national borders in search of cost-effective talent.
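The language-independence of an algorithmic idea can be shown with the smallest possible sketch (mine, not the column's): summing 1 through n by brute iteration versus by Gauss's closed form. The insight--replace n steps with one formula--survives translation into any notation.

```python
def sum_to_n_loop(n):
    """O(n): add the integers 1..n one at a time."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_to_n_formula(n):
    """O(1): Gauss's closed form n(n+1)/2 -- same answer, no loop."""
    return n * (n + 1) // 2

print(sum_to_n_loop(1000), sum_to_n_formula(1000))  # 500500 500500
```

The loop gets faster only as fast as the hardware does; the formula makes the problem size irrelevant.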
Rarely has the expression "work smarter, not harder" been so literally true. I don't mean this to be discouraging--rather, it's meant as an observation that we all still have ways to become more valuable contributors to our teams.
Tell me what you're doing smarter.