There's simply no possible way, using any known or imagined technology, for the typical Hollywood spaceship to pack enough joules to get itself off a planet—not to mention that no one ever seems to ask what it's going to cost to fill 'er up.
And don't even get me started on the question of where the creatures in the "Alien" movies get the calories they'd need to move those ugly exoskeletons. If they use the humans they catch as hosts for their young, what do the adult aliens eat, and where do they grow it?
OK, that's entertainment, but have you looked at a server farm lately? The imbalances look just as bad. Before the decade is out, even conservative predictions suggest a crossing of the curves: The cost of powering and cooling a server over a four-year lifetime will soon exceed the cost of the server hardware, projects Luiz André Barroso, Google platforms engineering group leader.
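To see how plausible that crossover is, consider a back-of-envelope sketch. All the figures below are illustrative assumptions of mine, not Barroso's numbers: a server drawing 300 watts, a facility overhead (cooling and power distribution) that doubles the draw, and electricity at 10 cents per kilowatt-hour.

```python
# Back-of-envelope: four-year energy cost of running one server.
# All inputs are assumed, round numbers for illustration only.
server_draw_kw = 0.3      # assumed server power draw, kW
overhead_factor = 2.0     # assumed cooling/distribution overhead (doubles the draw)
price_per_kwh = 0.10      # assumed electricity price, dollars
years = 4
hours = years * 365 * 24  # 35,040 hours over the lifetime

energy_kwh = server_draw_kw * overhead_factor * hours
energy_cost = energy_kwh * price_per_kwh
print(f"{energy_kwh:,.0f} kWh -> ${energy_cost:,.0f}")  # about $2,100
```

Against a hardware price in the low thousands of dollars, a modest rise in power draw or electricity rates is enough to push the energy line above the hardware line.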
In a paper published in the Association for Computing Machinery's Queue journal last fall, Barroso wrote, "One could envision bizarre business models in which the power company will provide you with free [server] hardware if you sign a long-term power contract."
The cost considerations are significant, of course, but so are the implications for infrastructure burden and external effects such as climate change. Barroso's analysis shows a flat-line trend in server performance per unit of power consumed, meaning that cheerful Moore's Law forecasts of server throughput turn into ice-cap-melting projections of watt-hours used.
Worldwide production of electric power currently runs around 18,000 terawatt-hours per year. That's my own extrapolation from the International Energy Agency's 2003 statistics (the most recent year available), following the trend of the last several years. That sounds like a lot of power, but it helps to put that number in perspective. If we had to grow our crops by artificial light, the world's entire electric capacity could maintain only the agricultural output of an area the size of Rhode Island.
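A rough sanity check of that comparison, using assumed figures of my own (grow-lamp efficiency, average insolation, Rhode Island's area) rather than anything from the column's sources:

```python
# Sanity check: how much land's worth of sunlight could the world's
# electricity replace with grow lamps? All inputs assumed for illustration.
world_twh_per_year = 18_000          # from the column's extrapolation
hours_per_year = 365 * 24            # 8,760
avg_power_w = world_twh_per_year * 1e12 / hours_per_year  # ~2 terawatts average

lamp_efficiency = 0.30               # assumed: electricity -> plant-usable light
avg_sunlight_w_per_m2 = 200          # assumed: day/night/weather-averaged insolation

area_m2 = avg_power_w * lamp_efficiency / avg_sunlight_w_per_m2
area_km2 = area_m2 / 1e6
print(f"~{area_km2:,.0f} square km")  # ~3,000 km^2, roughly Rhode Island's size
```

Rhode Island covers roughly 3,000 to 4,000 square kilometers, so the comparison holds up under these assumptions.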
It's kind of nice to have a nearby star, shedding roughly a kilowatt per square meter on the surface of the planet, along with the clever invention of green plants to kick off the process of turning that energy into food. Give us a few more decades, though, of Google-speed growth in our energy demands for information processing, and it's easy to come up with scenarios at least as bizarre as Barroso's. Vast solar arrays, for example. Never mind the inconvenient problem that it takes several years for a solar panel to produce as much energy as was consumed in manufacturing it (according to figures from the University Center of Excellence for Photovoltaics at Georgia Institute of Technology).
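The energy-payback claim is easy to see in rough numbers. The inputs below are my own illustrative assumptions (embodied energy, panel efficiency, average insolation), not the Georgia Tech figures:

```python
# Rough energy-payback time for a solar panel.
# All inputs assumed, round numbers for illustration only.
embodied_kwh_per_m2 = 600      # assumed manufacturing energy per square meter
panel_efficiency = 0.15        # assumed sunlight-to-electricity efficiency
avg_sunlight_w_per_m2 = 200    # assumed day/night/weather-averaged insolation
hours_per_year = 365 * 24

annual_output_kwh = panel_efficiency * avg_sunlight_w_per_m2 * hours_per_year / 1000
payback_years = embodied_kwh_per_m2 / annual_output_kwh
print(f"~{payback_years:.1f} years")  # on the order of a few years
```

Under these assumptions the payback comes out between two and three years, consistent with the "several years" the column cites.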
Of course, any area covered by those solar collectors would be unavailable for growing food, and we'd never take usable farmland out of production just to produce IT hardware. Oh, wait a minute. China is doing that already, with Chinese farmland conversion proceeding at a rate of about one Rhode Island per year. And China overtook the United States in 2004 to become the world's largest exporter of IT hardware, assuredly earning more per acre of factory than per acre of farm—but you can't eat a CPU.
It's in this environment that Sun Microsystems announced in May its appointment of a vice president for eco-responsibility, David Douglas. He'll be charged with minimizing the energy footprint of systems, not only while they're running but also over their whole life cycle of manufacture and salvage.
Don't dismiss this as some Californian tree-hugging gesture: The real-world numbers are significant today and will be even more so tomorrow.
Peter Coffee can be reached at firstname.lastname@example.org.