What to Do With Power?

By Peter Coffee  |  Posted 2003-11-24
The abundance of computing power creates challenges for IT, writes Peter Coffee.

When off-the-shelf Mac G5s are assembled into the world's third-fastest computer, as Virginia Tech's Terascale Cluster is currently ranked, it's clear that once-exotic levels of supercomputing power are available to anyone who wants them. What's more, that so-called Big Mac, a 2,200-processor system that took six months to build at a cost of only about $5 million, was ready on time and under budget.

I apologize for the "only," but that cost is orders of magnitude less than the cost of other massive multiprocessor clusters, and Big Mac's performance is still improving as the system is fine-tuned. For that matter, if you're content with merely appearing on the list of the world's 500 fastest systems, you can spend one-tenth as much for a handful of boxes from any of several vendors: More than 40 percent of the systems on that list are clusters, and the majority use fewer than 256 CPUs.

The question, then, is what an enterprise IT architect should do to put this kind of power to work on practical chores. In a briefing presented earlier this year at the Seventh Workshop on Distributed Supercomputing, Fred Johnson, a program manager for the U.S. Department of Energy, laid out a set of challenges that apply to enterprise managers as well as to scientists and engineers.

The challenges range from the tracking of a single priority shipment to the sifting of a hundred million click trails in search of the most effective online shopping experience. This means that enterprise managers must first identify and evaluate the best means of keeping their systems fed with accurate data at minimal cost. Wal-Mart's interest in RFID (radio-frequency ID) tags is consistent with this mandate; so is United Parcel Service's development of the high-speed, high-density, high-reliability labeling technology originally dubbed UPScode, now called MaxiCode.
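To make that "accurate data at minimal cost" mandate concrete, here is a minimal sketch in Python of validating scan events at the point of capture, before they reach downstream systems. The record fields and function names are my own illustrative assumptions, not part of any RFID or MaxiCode specification.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ScanEvent:
    tag_id: str      # RFID tag serial or MaxiCode label ID (hypothetical field)
    location: str
    timestamp: datetime

def validate_event(raw: dict) -> Optional[ScanEvent]:
    """Reject malformed records at the edge, so only accurate
    data is fed into the enterprise's downstream systems."""
    try:
        tag_id = raw["tag_id"].strip()
        if not tag_id:
            return None  # empty or whitespace-only ID: discard
        return ScanEvent(tag_id=tag_id,
                         location=raw["location"],
                         timestamp=datetime.fromisoformat(raw["timestamp"]))
    except (KeyError, AttributeError, ValueError):
        return None  # missing fields or unparseable timestamp: discard
```

Filtering this cheaply at the edge is the design point: a bad record rejected at the scanner costs nothing, while the same record corrected after it has propagated into analytic stores costs far more.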

While RFID tags and MaxiCode scanners are feeding the system with millions or billions of new details, highly scalable storage and retrieval tools such as Oracle's grid-enabled 10g database and the Globus Toolkit must be rolled into place to turn those details into useful trends, and also to alert affected managers to the high-priority exceptions.
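The shape of that trend-plus-exception split can be sketched in a few lines. This example uses SQLite purely as a stand-in for a grid-enabled database, and the table and column names are hypothetical.

```python
import sqlite3

# In-memory stand-in for a far larger, grid-distributed detail store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (sku TEXT, qty INTEGER, priority INTEGER)")
conn.executemany("INSERT INTO scans VALUES (?, ?, ?)",
                 [("A1", 5, 0), ("A1", 7, 0), ("B2", 3, 1), ("B2", 2, 0)])

# Roll billions of details up into per-item trends ...
trends = dict(conn.execute("SELECT sku, SUM(qty) FROM scans GROUP BY sku"))

# ... and surface the high-priority exceptions separately for managers.
exceptions = [row[0] for row in
              conn.execute("SELECT DISTINCT sku FROM scans WHERE priority = 1")]
```

The point is that the two queries serve different audiences: the aggregate feeds analysis, while the exception list demands immediate attention.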

There's more to that analytic picture than just the high-powered database engine. Just as you wouldn't ask a race-car driver to drive at 200 miles per hour without readable instruments to warn of impending engine trouble, so must data mining tools be coupled with effective dashboard technologies like those of Informatica's PowerAnalyzer, a product whose Version 3.5 won eWEEK Excellence Awards honors earlier this year and whose 4.0 update I reviewed in eWEEK this August.

Johnson's agenda also noted that many strategic problems today involve multiple specialties and disciplines. A single major project may require the coordinated input of market researchers, construction managers, legislative analysts and financial advisers. For example, you wouldn't want to deploy RFID tags in a retail environment without talking to your legal team about privacy and talking with your engineers about other systems that might be affected by new spectrum-sharing issues.

This means that a fair chunk of computing power must go into collaborative technologies, including calendar coordination, project and document workflow, and multimedia teleconferencing.

You might think high-energy science types would be mainly concerned with boosting hardware power for their computationally intensive problems, but among the highest priorities in Johnson's briefing are scalability of operating systems, improvement of software libraries for productive application development, and creation of application performance analysis tools capable of useful insights on complex platforms. These software issues must likewise be high on the agendas of enterprise IT builders, even as Intel, AMD and IBM all vie for leadership in mainstream 64-bit processing.
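Enterprises need not wait for exotic tools to start down that performance-analysis road; even the profiler in Python's standard library illustrates the kind of insight Johnson's briefing calls for. The workload function here is a made-up example, not drawn from any real application.

```python
import cProfile
import io
import pstats

def hot_loop(n):
    """Stand-in for a computationally intensive inner loop."""
    total = 0
    for i in range(n):
        total += i * i
    return total

# Profile the workload and capture a ranked report of where time went.
profiler = cProfile.Profile()
profiler.enable()
hot_loop(100_000)
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()  # top functions by cumulative time
```

On complex platforms the hard part is not gathering such numbers but interpreting them, which is exactly why the briefing ranks better analysis tools alongside faster hardware.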

Give Sun a chance, therefore, to talk about why Solaris is worth its price on large-scale systems; give Microsoft some rug time to brief your teams on its Indigo framework; ask Parasoft to tell you about tools like its Jtest and .Test for Java and .Net development.

Then put your own energy to work.

Technology Editor Peter Coffee's e-mail address is peter_coffee@ziffdavis.com.

Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.