By Renee Boucher Ferguson  |  Posted 2007-06-15

Via ePedigree, the EPCglobal data project is simulating several architectural models, both thin and thick registries, for single and distributed networks. But outside the pharmaceutical industry, several competing network architectures are in play. The most widely described architecture is the EPC Network, which was developed to ensure global interoperability of tag data as products move along the supply chain. The EPC Network consists of three major components: EPC Discovery Services, essentially an electronic chain of custody for EPC tags; EPC Information Services, the interpreter that communicates between a database and applications; and the Object Name Service, which identifies the location of the server hosting the information an application needs.
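To make the Object Name Service piece concrete: ONS reuses DNS, so resolving an EPC starts by turning the tag identifier into a domain name that can then be queried for service records. The sketch below, in Python, roughly follows the naming convention in the EPCglobal ONS 1.0 specification (strip the serial number, reverse the remaining fields, append the ONS root); the exact root domain and field handling here are illustrative, not drawn from this article.

```python
def ons_domain(epc_urn: str, root: str = "onsepc.com") -> str:
    """Convert an SGTIN EPC URN into an ONS query domain.

    Illustrative sketch, roughly per EPCglobal ONS 1.0: drop the
    serial number, reverse the remaining fields, append the root.
    """
    prefix = "urn:epc:id:"
    if not epc_urn.startswith(prefix):
        raise ValueError("not an EPC identity URN")
    # e.g. "sgtin" and "0614141.000024.400"
    scheme, _, value = epc_urn[len(prefix):].partition(":")
    fields = value.split(".")[:-1]   # drop the item-level serial number
    fields.reverse()                 # least-significant field first
    return ".".join(fields + [scheme, "id", root])

print(ons_domain("urn:epc:id:sgtin:0614141.000024.400"))
```

A real resolver would then issue a DNS NAPTR query against that name to find the EPC Information Services endpoint holding the product's data.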
The EPC Network, proposed by EPCglobal, is essentially a central repository. There is also a hybrid model that assumes all data remains local to the various enterprises, with intelligent discovery mechanisms used to search for relevant data.
Then there are document models, on-demand models and registry models to consider. "This [Auto-ID Labs and SAP Labs project] is the first scientific analysis where we look at a couple of different architectures—those that have been proposed and are floating around in academia and industry," said SAP's Mantripragada.

Given that the yearlong project is in its early stages, researchers at Auto-ID Labs and SAP Labs are reluctant to release initial results regarding architectural models. But Tao Lin, senior research scientist and EPCglobal data project manager with SAP Labs, said there are some widely understood findings about the proposed architectures. "[In] some of the architectures proposed by EPC or by other organizations, we found some assumptions in the beginning weren't really correct. For example, having all the data at the source—say, at a manufacturer—is not very efficient," said Lin. "What we want to develop is a simulator to find out whether the architectures that have been proposed by EPCglobal and industries actually can be scalable in this potential Internet of Things we are talking about. We hope that the proposed architecture can work well with the future applications. However, as EPC/RFID can potentially change IT infrastructure, we have to avoid any potential risks."

Lin said the project findings will also impact SAP's product strategy. A larger implication of the study is that because the whole concept of RFID-tagged goods throughout the world is so new, people haven't thought through the bigger issues yet. Much of the academic and industry work has been related to the physical layer and processes around RFID, with little attention paid to the data management inherent in RFID, Lin said. Part of the data management problem, said Lin: In addition to the sheer magnitude of data that will be generated from RFID tags, the data can be stored anywhere.
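Lin's point about data-at-source being inefficient is easy to see with a back-of-the-envelope model. The toy simulation below (purely illustrative, not the Auto-ID Labs/SAP simulator, and with made-up parameters) compares the per-server request load when all event data sits at a handful of manufacturers versus when events stay with the many custodians who observed them and a discovery service fans queries out.

```python
import random
from collections import Counter

random.seed(42)

N_MANUFACTURERS = 5
N_ITEMS = 1000       # tagged items, spread across manufacturers
N_QUERIES = 10000    # trace queries from supply-chain partners

items = [(f"item{i}", f"mfr{i % N_MANUFACTURERS}") for i in range(N_ITEMS)]

# Architecture A: all event data lives at the item's manufacturer,
# so every query for that item lands on one source server.
load_at_source = Counter()
for _ in range(N_QUERIES):
    _, mfr = random.choice(items)
    load_at_source[mfr] += 1

# Architecture B: events stay with whichever of 50 custodians observed
# them; a discovery service fans one query out to ~4 custodians per item.
N_CUSTODIANS = 50
load_distributed = Counter()
for _ in range(N_QUERIES):
    for c in random.sample(range(N_CUSTODIANS), k=4):
        load_distributed[f"node{c}"] += 1

print("peak load, data-at-source:", max(load_at_source.values()))
print("peak load, distributed   :", max(load_distributed.values()))
```

Even though the distributed design issues four times as many total requests, the peak load on any one server is far lower, which is the hot-spot argument against keeping everything at the source.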
"We're talking about thousands, even millions of data storage systems—SAP systems, Oracle systems, mom-and-pop data systems—holding RFID/EPC data," said Lin. "The data could have a contribution to a query. So now we're dealing with a problem that we have not dealt with before: Business transactions will be based on data that could be stored in millions of storage places. RFID and EPC data is one contribution to business process automation." No one knows if today's infrastructure can actually support the data storage, Lin said.

There is, however, Google. What Google brings to the table for the EPC data project is the concept of storing data in memory rather than in a database to enable extremely fast queries. MIT's Williams is taking that approach to heart, researching how to distribute large in-memory caching capabilities across systems, enabling data to be accessed very quickly. Williams, along with SAP Labs, is looking at developing systems very much like Google's, with scalable, memory-resident data.

"The big challenge is streaming data. How do you process lots of real-time data? That's the challenge RFID is bringing. Some of the challenges we're looking at are security, scalability and extensibility of the network—how to handle real-time streaming data, and how do you track that," said Williams. "It will have to be [designed using] distributed systems, for sure—very close to the way the Internet works now. At the top level, each country hosts a top-level server, and then these top-level servers interface with each other. That's the way it's going to have to work. We can't just take [the data] and apply it to a network; it's going to take a lot more sophistication than that."
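The memory-resident idea Williams describes can be sketched in a few lines: keep hot EPC records in RAM in front of a slower backing store, evicting the least recently used entries when memory fills. This is a minimal illustration of the caching concept discussed above, not SAP's, MIT's or Google's actual design, and the capacity and store function are hypothetical.

```python
from collections import OrderedDict

class EPCCache:
    """Minimal in-memory LRU cache in front of a slower backing store."""

    def __init__(self, backing_store, capacity=100_000):
        self.store = backing_store     # e.g. a database lookup function
        self.capacity = capacity
        self.cache = OrderedDict()     # EPC -> event records, LRU order

    def lookup(self, epc):
        if epc in self.cache:
            self.cache.move_to_end(epc)      # served from memory: fast path
            return self.cache[epc]
        value = self.store(epc)              # slow path: hit the database
        self.cache[epc] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used
        return value

# Hypothetical backing store that records how often it is actually hit.
calls = []
def slow_db(epc):
    calls.append(epc)
    return {"epc": epc, "events": []}

cache = EPCCache(slow_db, capacity=2)
cache.lookup("A"); cache.lookup("B"); cache.lookup("A")
print(len(calls))  # -> 2: the repeat lookup of "A" never touched the store
```

Scaling this past one machine is where the hard problems Williams lists begin: the cache must be partitioned across many servers, kept consistent, and fed by real-time event streams.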

