IBM is entering the stream computing arena.
In a keynote address at the 2007 Technology Management Conference and Expo in New York on June 19, William Zeitler, senior vice president and group executive of IBM's Systems and Technology Group, introduced the company's System S prototype. System S is a software architecture that originated in IBM Research.
IBM officials said the company chose the New York event for the technology's introduction because Wall Street firms are among the world's most aggressive users of IT and typically have demanding requirements to crunch data on the fly to profit from real-time decision making and risk mitigation.
The officials said the new computer system has the ability to assemble applications on the fly, based on the query it is trying to answer, by using a new software architecture that pulls in the components it needs when they are needed to handle a specific task.
IBM Research scientists have spent four years developing this new kind of computing system, which an IBM spokesperson said promises to reinvent how data is used to make decisions.
In addition, the System S prototype features an advanced operating environment, algorithms and filters that can take advantage of any kind of hardware, including accelerators like IBM's Cell project and Blue Gene, IBM officials said.
“It's a unique software architecture for managing data in systems,” the IBM spokesperson said of System S. “It can use any type of hardware or processors it needs to get a job done. Whether it's IBM blades combined with the Cell processor, or it could even be run on Blue Gene.”
System S will enable enterprises to analyze vast streams of rich data in diverse formats, including text, video, voice, RFID, GPS and satellite feeds, along with other massive amounts of structured and unstructured data, the company said.
IBM said the new system builds everything around the problem being solved. For instance, the system will factor in the different data types needed and how hardware systems and software algorithms process external data streams in real time as they arrive. System S was designed to configure itself based on the data query at hand, IBM said.
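To make the idea of assembling an application around a query more concrete, the following Python sketch is purely illustrative; it is not IBM's System S API, and the operator names and registry are hypothetical. It shows the general pattern of pulling in only the processing components a particular query needs and applying them to data as it arrives.

```python
# Illustrative sketch only -- not IBM's System S API. The operators and the
# registry below are hypothetical; the point is that the application is
# assembled at run time from the components a particular query needs.

def normalize_text(item):
    item["text"] = item.get("text", "").strip().lower()
    return item

def tag_location(item):
    lat, lon = item.get("gps", (0.0, 0.0))
    item["near_nyc"] = abs(lat - 40.7) < 0.5 and abs(lon + 74.0) < 0.5
    return item

def score_relevance(item):
    item["score"] = item["text"].count("trade") + (1 if item.get("near_nyc") else 0)
    return item

# Registry of reusable components, keyed by capability.
COMPONENTS = {
    "normalize_text": normalize_text,
    "tag_location": tag_location,
    "score_relevance": score_relevance,
}

def assemble_pipeline(query_needs):
    """Pull in only the operators this query requires, in order."""
    return [COMPONENTS[name] for name in query_needs]

def process_stream(stream, pipeline):
    """Apply the assembled pipeline to each incoming item as it arrives."""
    for item in stream:
        for stage in pipeline:
            item = stage(item)
        yield item

# A query about trading chatter near New York might assemble this pipeline:
pipeline = assemble_pipeline(["normalize_text", "tag_location", "score_relevance"])
incoming = [{"text": "Big TRADE volume ", "gps": (40.71, -74.0)}]
for result in process_stream(incoming, pipeline):
    print(result["score"])
```

A different query, say one over environmental sensor readings, would assemble a different set of operators from the same registry rather than requiring a new application to be built.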
In addition to financial markets, IBM officials said System S potentially could be used in areas like handling environmental sensory data. IBM also has a System S pilot project in astronomy, the company said.
Meanwhile, although System S is not attuned to green computing in the sense of data centers and conservation, it can be used to monitor riverbeds and process sensory data regarding environmental conditions, the IBM spokesperson said.
Moreover, IBM officials said the computing model of the last 50 years, which Web search firms like Google have pushed to the extreme, is transaction processing. But the transaction processing model works on “old” data, whereas System S uses streaming data, IBM said.
For instance, while traditional computing models retrospectively analyze stored or known data and can't continuously process massive amounts of incoming data streams, IBM's system utilizes a new streaming architecture and mathematical algorithms to create a forward-looking analysis of data from any source, the company said. This allows for the continuous refinement of a response as additional data is made available.
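As a rough illustration of that contrast (a minimal Python sketch, not IBM's technology or code), a transaction-style analysis waits for data to be stored and then computes an answer once, whereas a stream-style analysis keeps a running answer and refines it with every new reading:

```python
# Minimal sketch of the contrast described above -- not IBM's code.

# Batch / transaction-style analysis: wait for the data to be stored,
# then compute an answer over the whole ("old") data set.
def batch_average(stored_readings):
    return sum(stored_readings) / len(stored_readings)

# Stream-style analysis: maintain a running answer and refine it
# continuously as each new reading arrives, without storing everything.
def stream_average(readings):
    count, total = 0, 0.0
    for value in readings:
        count += 1
        total += value
        yield total / count   # the current best estimate, refined per item

readings = [101.2, 100.8, 102.5, 101.9]
print(batch_average(readings))          # one answer, after the fact
for estimate in stream_average(readings):
    print(estimate)                     # an answer that improves as data flows in
```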
Meanwhile, although Google has based its search model on the transaction processing model, the company earlier this month acquired PeakStream, a company focused on the stream computing model.
IBM said it is moving ahead with its stream computing strategy by introducing the System S technology and by seeking collaboration partners who can drive new commercial stream computing applications.