Opinion: Long-range forecasts consistently err on the downside.
I just ran across my charts from a talk that I gave in 1995 with the title "Futures of Computing." I was struck by the differences between the 10-year forecasts that I made then and the capabilities (and price/performance ratios) that are actually available now.
In my talk, I started with two data points for processor speed and cost: A 16MHz Intel i386 sold for $390 in 1988, and a 66MHz Intel i486DX2 sold for $501 in 1993 and worked about 7.8 times as fast as the i386 chip. Assuming simple compound growth, these implied a speed improvement of 45 percent per year with price growth of only 5 percent per year. My 10-year projection thus was an 870MHz processor, selling for $790, with 42 times the speed of the 1995 i486 processor and doing three times as many operations per clock cycle.
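That arithmetic is nothing more exotic than compound growth fitted to two data points and run forward. A minimal sketch of the calculation (my reconstruction, not the original 1995 spreadsheet; the function names are illustrative):

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two data points."""
    return (end / start) ** (1.0 / years) - 1.0

def project(base, rate, years):
    """Extrapolate a value forward at a fixed compound rate."""
    return base * (1.0 + rate) ** years

# Price data points: $390 in 1988 -> $501 in 1993, five years apart.
price_rate = cagr(390, 501, 5)               # about 5 percent per year

# The talk assumed 45 percent/year speed growth over the next decade.
speed_multiple = project(1.0, 0.45, 10)      # about 41x, rounded to "42 times"

# Price extrapolated ten years at the fitted rate.
price_2005 = project(501, price_rate, 10)    # about $830, near the quoted $790
```

Running the numbers this way shows how modest the inputs were: a mere 5 percent annual price drift compounds to only about a 1.6x price increase over a decade, while 45 percent annual speed growth compounds to more than 40x.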
This projection was, I warned in my talk, a bit daring. Conventional wisdom at the time held that the x86 instruction set was too complex for continued clock-speed growth at the rates of the previous decade. Neither was current electronic technology thought to be capable of sustaining this trend. Fundamentally different architectures and materials might be needed.
I don't know if my projection sets a record for number of different ways to be simultaneously wrong, but it might come close. Right now, I can go online and order an individually packaged Intel Core Duo T2500 processor, with a clock rate of 2GHz, for a single-unit price of $320. This exceeds my clock rate projection by a factor of 2.3, an upside difference of 8 percent per year compared with my projection over the 11 years since I made that call. The chip price, meanwhile, is 60 percent less than I expected, a gap of 8 percent per year in the other direction.
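Those per-year gap figures come from the same compound-growth formula run in reverse: take the ratio of actual to projected and annualize it over the 11 years. A quick sketch:

```python
def annualized(multiple, years):
    """Per-year growth rate equivalent to an overall multiple."""
    return multiple ** (1.0 / years) - 1.0

# Clock speed: 2GHz actual vs. the 870MHz projection, 11 years on.
clock_gap = annualized(2000 / 870, 11)   # about 8 percent per year upside

# Price: the $790 projection vs. the $320 street price.
price_gap = annualized(790 / 320, 11)    # about 8 percent per year downside
```

Both gaps annualize to single-digit percentages, which is exactly why errors of this kind feel small in any one year and still compound into a projection that misses by multiples over a decade.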
That modern-day Intel chip's twin cores each do something like 60 percent more work per clock cycle than the single core in an i486. Overall, that means I get something like seven times more work, for 60 percent less money, than I told my 1995 audience they could expect in 2005.
Actual price/performance, to put a single number on just how conservative I was, is thus more than 17 times better than what I thought in 1995 was actually an optimistic forecast, since we all know that exponential growth curves must be expected to level off as the limits of a technology generation draw near.
This was not the only projection I made that turned out to be much too modest. In 1987, I observed during my 1995 talk, a 1,200-baud modem cost $500; in 1992, I continued, a 9,600-bps modem (we no longer said "baud" by 1992) with on-the-fly compression cost $370. By 2005, I projected, you'd be able to get 610K bps through an interface device that cost $240.
In fact, I currently enjoy four times that data rate through an interface that I got free with my DSL service agreement. Data transfer performance has therefore grown 13 percent per year more quickly than I expected. And price/performance? Well, what number do you use when the hardware is essentially free?
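The same annualizing arithmetic applies here: four times the projected data rate, spread over the 11 years since the talk, works out to roughly 13 percent per year. A sketch:

```python
def annualized(multiple, years):
    """Per-year growth rate equivalent to an overall multiple."""
    return multiple ** (1.0 / years) - 1.0

# Actual DSL throughput ran about four times the projected 610K bps.
rate_gap = annualized(4.0, 11)   # about 13 percent per year faster than forecast
```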
My point is that, as Bill Gates is well known for saying, we consistently overestimate how much we'll accomplish in a year but underestimate how much we'll accomplish in 10 years. The applications we're imagining now, with plans to start building them next year, may therefore be crippled by our timid visions of the resources that will be available to them in 2015.
For enterprise IT to be a competitive differentiator in the 20-teens, it must become something much better than 1980-style personal computing that's merely faster and cheaper. Applications will have access to client devices that know where you are, whether you're moving, and what your appointment calendar says you're doing or about to do; application developers must use that context information to make a selective presentation from a rich, real-time data stream, telling you what you need to know but not overwhelming you with distracting irrelevancies.
The hardware to do this is available now. Programming skills to use that capability are in short supply, but you can find them.
The biggest shortfall, though, is probably in the imagination and the boldness to believe how much better things can still become.
Technology Editor Peter Coffee can be reached at firstname.lastname@example.org.
Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics, and information systems management at Pepperdine, UCLA, and Chapman College.