Here stands the manager of corporate information systems: an autocrat, a dictator of data who rules with an iron fist.
It’s nothing to be ashamed of. In fact, it’s probably in the job description. Order and predictability have always been business imperatives in managing information technologies, which has led to aggregating data on massive, powerful and centralized computers — the ubiquitous client-server model.
Now the walls of the information technology glass house are being cracked open by peer-to-peer (P2P), a distributed information architecture that seems, at first glance, to be chaotic, unstable, uncontrollable — everything a responsible IT professional would avoid.
But in fact, P2P can offer powerful new ways of computing. Because it’s an architecture that distributes application intelligence and processing tasks to a multitude of network-connected PCs, P2P-based applications can potentially be much faster, more scalable and reliable, and more cost-effective than client-server applications. A hive of bees will always be smarter than any single bee, to paraphrase John Sculley, a partner at investment firm Sculley Brothers and former CEO of Apple Computer.
Many of the world’s biggest technology companies — Microsoft and Intel among them — have already internalized the idea that P2P computing will be central to the next generation of Internet applications. These true believers see P2P applications starting to trickle through corporate networks and the Internet in a stealthy, grassroots revolution that will make network computing more efficient, more interactive, more fun — in short, better. They want you to join the P2P movement, and they want you to believe that it’s inevitable, because if this really is the dawn of a computing sea change, they want to profit as your P2P partner from early on.
“I don’t think there’s any space that will deal with the Internet that will not be affected by peer-to-peer or distributed computing architectures,” said Barry Bellue, president and CEO of Thinkstream, a start-up that has developed a distributed search technology. “This is the only way to manage the ever-exploding amount of data in the world.”
A quick status report: Intel, Microsoft, Sun Microsystems and a swarm of start-ups became captivated last year by an idea popularized by Napster, the freely distributed and anarchic — in fact, technically illegal — music trading service invented by a college student from Boston. What the anticapitalist exercise of Napster demonstrated to the world — besides the timeless allure of getting something for nothing — was that the conditions were right for a new class of Internet computing. Other experiments fueled interest in P2P, notably the University of California at Berkeley’s SETI@home distributed computing project, which is not technically P2P but showed it was possible to tap into the latent power of Net-connected PCs.
The big idea these applications share is that they connect previously isolated PCs dotted across the Net. PCs have become increasingly powerful in recent years, with faster processors, fatter hard drives and better Internet connectivity. This confluence of factors means ordinary computers can become active servers, not just passive clients, on the network. In a P2P model, PCs become “the dark matter of the Internet,” in the words of Clay Shirky, a partner at investment firm Accelerator Group.
Actually, P2P isn’t a new concept. It dates back decades; the earliest Internet programs communicated from computer to computer, with no server required in between to facilitate the connection. Indeed, one of the breakthroughs of ARPANet, the precursor to the Internet, was that it was based on the concept of connecting computers as equal peers.
But P2P has won fresh attention because it represents, in effect, the inverse of the client-server model that has dominated the way most Internet applications work today. The most widely used Internet applications are e-mail and the World Wide Web, both of which depend on big, expensive servers sprinkled throughout the Internet cloud. Users connect to them using relatively lightweight, dumb-terminal-like apps. With the Web, the client’s main function has been to display information, rather than process it or distribute it to other machines. With P2P, generally speaking, clients are like worker bees that handle the bulk of an application’s processing and data transfer, while centralized servers simply coordinate the hive’s activity.
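For the technically curious, that role reversal is easy to sketch in code. The short Python program below shows the gist: the same process listens like a server while it also asks questions like a client. The port number and messages are invented for illustration; no real P2P product works exactly this way.

```python
# A minimal sketch of the "every client is also a server" idea.
# The port number and message format are invented for illustration.
import socket
import threading
import time

PORT = 9099  # arbitrary port chosen for this sketch

def serve(host: str = "127.0.0.1", port: int = PORT) -> None:
    """Each peer listens for requests from other peers, like a tiny server."""
    with socket.create_server((host, port)) as srv:
        while True:
            conn, _addr = srv.accept()
            with conn:
                request = conn.recv(1024).decode()
                # A real peer would do the heavy lifting here: search its own
                # files, run a computation, relay data to other peers, etc.
                conn.sendall(f"peer response to: {request}".encode())

def ask_peer(peer_host: str, question: str, port: int = PORT) -> str:
    """The same program can act as a client of any other peer."""
    with socket.create_connection((peer_host, port)) as conn:
        conn.sendall(question.encode())
        return conn.recv(1024).decode()

# Demo on one machine: start serving in the background, then query
# "another peer" directly. No central server ever touches the data itself.
threading.Thread(target=serve, daemon=True).start()
time.sleep(0.2)  # give the listener a moment to start
print(ask_peer("127.0.0.1", "who has file X?"))
```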
P2P is, more than anything else, a new way of thinking about how Internet applications function and communicate.
“The Web is dumb clients talking to smart servers,” said Charles Fitzgerald, director of platform strategies in Microsoft’s .Net group. “We want smart clients talking to smart servers, smart servers talking to smart servers and smart clients talking to smart clients.”
Clearly, P2P is neither a discrete technology nor a single application. It will not replace the Web, nor will it mean the death of server-based computing. And it does not appear likely to spawn an entirely new industry, as some people once thought would happen with Napster. Companies that are succeeding in selling software-based P2P concepts generally don’t call themselves “peer-to-peer companies.”
“I don’t think peer-to-peer, in and of itself, is an industry,” said Michael Tanne, CEO of XDegrees, a start-up developing P2P software and infrastructure services. “It enables new capabilities. There are real benefits to peer-to-peer, and its influence will be felt greatly in the next few years. But if you’re using peer-to-peer in a music application, you’re a music company.”
What’s so radical about P2P? It’s simply a new way of doing the things we’ve always used the Internet and other networks for — exchanging and finding data, or collaborating remotely — using distributed, interconnected computers to do those things faster, more flexibly and more dynamically.
“Peer-to-peer is a side effect of the increased availability of ubiquitous networking,” said Tim O’Reilly, president of publishing and software company O’Reilly & Associates, who has been a central figure in the nascent development of P2P technologies. “These paradigm shifts happen gradually. Then people go, ‘What were we thinking?’ It took a long time after the PC came out before the minicomputer vendors realized the world had changed forever.”
It’s too soon to declare P2P the greatest Internet development since the Web, as some people — including Intel Chairman Andy Grove — have. But either way, P2P will fast become an important part of Internet business, exploiting the power of the computer masses. Vive la révolution.
Why Wintel Loves P2P
Frankly, Microsoft and Intel are no one’s idea of techno-revolutionaries. Their corporate cultures are about as far as you can get from the antiestablishment philosophies that underpin Napster and Gnutella, an open source protocol for P2P file sharing.
Microsoft and Intel are keenly interested in promoting P2P computing because it puts the emphasis back on the desktop PC, instead of the giant servers that pipe data to computers. And that’s still where these two companies fry most of their bacon, notwithstanding their efforts to break into high-end, high-performance server farms. PC vendors have had trouble selling new computers, partly because there are fewer reasons anyone needs a new computer these days. P2P computing, on the other hand, tends to yield richly interactive applications that use a whole lot more processing power and disk space — thus driving demand for newer, faster PCs.
Intel has been much more publicly enthusiastic than Microsoft about P2P’s potential. Last August, Intel formed the Peer-to-Peer Working Group to try to figure out what industry standards are needed to bootstrap P2P applications. It has embraced some upstart radicals, too: This year, Intel Capital invested in Uprizer, a P2P-based content distribution company founded by Ian Clarke. An outspoken critic of copyright laws, Clarke is the creator of Freenet, an early P2P system designed to bypass Internet censorship.
In addition, Intel recently launched a philanthropic project that parcels out computing-intensive tasks over the Internet to PCs, which crunch the numbers and send back the results. Intel is donating that computing power to researchers seeking cures for cancer, diabetes and other diseases. In less than two months, 600,000 people have contributed more than 100 million hours of processing time — a vivid demonstration, the company believes, of how P2P can be harnessed.
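The mechanics behind such projects follow a simple fetch-compute-report loop, sketched below in Python. The coordinator address and data format here are hypothetical stand-ins, not Intel’s actual protocol.

```python
# A hedged sketch of the fetch-compute-report loop behind distributed
# computing projects. The coordinator URL and JSON fields are hypothetical.
import json
import time
import urllib.request

COORDINATOR = "https://example.org/workunits"  # stand-in for a project server

def analyze(unit: dict) -> dict:
    """Stand-in for the real number crunching (protein folding, signal search)."""
    return {"checksum": sum(unit.get("numbers", []))}

while True:  # loop forever, as volunteer clients do
    # 1. Ask the central coordinator for a unit of work.
    with urllib.request.urlopen(f"{COORDINATOR}/next") as resp:
        unit = json.load(resp)

    # 2. Crunch it locally, using cycles the PC would otherwise waste.
    result = analyze(unit)

    # 3. Report the answer; the server only schedules and aggregates results.
    report = urllib.request.Request(
        f"{COORDINATOR}/result/{unit['id']}",
        data=json.dumps(result).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(report)
    time.sleep(1)  # avoid hammering the coordinator
```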
Intel even has a “peer-to-peer evangelist,” Bob Knighten, who oversees all its P2P doings. Ironically, Knighten believes that once P2P really catches fire, it will fade into the woodwork. “My expectation is that within three years, we’ll probably very seldom talk about peer-to-peer in these terms, like we don’t talk about client-server anymore,” he said. “Peer-to-peer computing will simply become one of the ways of doing things.”
Meanwhile, though Microsoft hasn’t publicly discussed its interest in P2P very much — “We haven’t been as vocal as some because we think that [P2P] is a means, not an end,” Microsoft’s Fitzgerald said — it has its fingers in several P2P pies.
Clearly, Microsoft wants to be the industry’s leading provider of underlying P2P application technologies. P2P plays a central role in .Net, a set of programming tools for writing dynamic Web-based applications. Microsoft’s .Net uses industry-standard specifications such as eXtensible Markup Language (XML), an efficient technology for exchanging application-specific data, and Simple Object Access Protocol (SOAP), which uses XML to access software services over the Internet.
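For readers who haven’t seen SOAP in action, the idea is simple: a program wraps a method call in an XML “envelope” and posts it over HTTP, and the answer comes back as XML, too. The Python sketch below is purely illustrative; the quote service and its GetQuote method are made up.

```python
# An illustrative SOAP call: an XML envelope describing a method invocation,
# POSTed over HTTP. The endpoint and the GetQuote method are hypothetical.
import urllib.request

envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetQuote xmlns="http://example.org/stocks">
      <symbol>MSFT</symbol>
    </GetQuote>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    "http://example.org/quoteservice",  # hypothetical SOAP endpoint
    data=envelope.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://example.org/stocks/GetQuote",
    },
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode())  # the reply is an XML envelope, too
```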
“The key is we want to help people build smart clients with Internet-native tools,” Fitzgerald said. “Napster had to spend a lot of time building infrastructure instead of creating specific features. Our job is to build the plumbing.”
Microsoft is also working on Farsite, an internal research project exploring the feasibility of a distributed file system that would pool the unused storage of networked PCs. And the company is developing distributed search engine technology that would send a search request across multiple data resources over a network, according to a person familiar with the project. Microsoft declined to comment.
Then there’s HailStorm, due by the end of the year, which is Microsoft’s strategy to deliver Internet services based on .Net that are oriented “around people,” instead of computers, the company said. One service, HailStorm Location, will provide a way for P2P applications to rendezvous, Fitzgerald said; it’s essentially a directory that will let P2P programs find each other by dynamically figuring out whether someone is connected to the Internet and what their Internet Protocol address is.
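Microsoft hasn’t published the interface, but the concept of such a rendezvous directory is straightforward, as the toy Python class below suggests. It illustrates the idea only; it is not Microsoft’s API, and the names and timeout are invented.

```python
# A toy rendezvous directory: peers register their current IP address, and
# other peers look them up. Illustrative only; not Microsoft's HailStorm API.
import time
from typing import Dict, Optional, Tuple

class LocationDirectory:
    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds
        self.entries: Dict[str, Tuple[str, float]] = {}  # name -> (ip, last_seen)

    def register(self, name: str, ip: str) -> None:
        """A peer announces: I am online at this address right now."""
        self.entries[name] = (ip, time.time())

    def locate(self, name: str) -> Optional[str]:
        """Return the peer's address, or None if it appears to be offline."""
        entry = self.entries.get(name)
        if entry is None:
            return None
        ip, last_seen = entry
        if time.time() - last_seen > self.ttl:
            return None  # registration expired; assume the peer disconnected
        return ip

directory = LocationDirectory()
directory.register("alice", "203.0.113.7")
print(directory.locate("alice"))  # -> 203.0.113.7, so connect directly to her
print(directory.locate("bob"))    # -> None: never registered, presumed offline
```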
Microsoft critics worry that with HailStorm, the company is, in typical fashion, attempting to use its dominance on the PC desktop to lock everyone into the next generation of Internet computing as defined by Microsoft, using Net services operated by Microsoft. “Microsoft understands that owning APIs [Application Programming Interfaces] is a good business strategy, so with .Net, they’re setting out to control the next generation of Internet services,” O’Reilly said. “Fortunately, there’s a lot of competition. It’s not a done deal yet.”
Fitzgerald defended Microsoft’s strategy of delivering HailStorm as a set of hosted services: “In the short term, the amount of complexity in letting people run arbitrary [HailStorm] services is too great. We have to get the core functionality up and running. Long term, yeah, we’ll let people run their own location services. But for a lot of people, it will be more attractive to buy that [service] off the shelf. Do you want to spend the time and resources to build that yourself? Or get that from Microsoft in a developer-friendly way?”
Sun believes more developers will prefer its own approach to P2P computing, Jxta. Launched in April, Jxta consists of a set of protocols that let P2P applications find each other, exchange data, search for information and perform other simple tasks. Bill Joy, Sun’s chief scientist, likens the Jxta protocols — which Sun is providing for free under an open-source license based on the one that accompanies the Apache Web server — to the Web’s HyperText Transfer Protocol.
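Jxta describes peers and services with XML documents called advertisements. The snippet below shows a simplified, hypothetical advertisement and Python code that reads it; real Jxta advertisements follow their own schemas and namespaces.

```python
# A simplified, hypothetical peer advertisement in the spirit of Jxta's
# XML resource descriptions; real Jxta documents have their own schema.
import xml.etree.ElementTree as ET

advertisement = """
<PeerAdvertisement>
  <Name>alices-laptop</Name>
  <PeerID>urn:example:peer-1234</PeerID>
  <Service>file-sharing</Service>
</PeerAdvertisement>
"""

peer = ET.fromstring(advertisement)
name = peer.findtext("Name")
service = peer.findtext("Service")
print(f"discovered peer {name!r} offering {service!r}")
```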
Microsoft executives have belittled Jxta as a “science-fair experiment.” Sun fires back that Microsoft’s HailStorm is a proprietary system designed to keep Microsoft in control of the Internet’s infrastructure.
This spitting match between Sun and Microsoft is part of the contest over so-called Web services, a catch-all term that applies to dynamic Internet applications, unlike the relatively static HTML-based Web. Forrester Research refers to this as “the X Internet,” meaning executable and extensible. It’s a battle that IBM has now joined in earnest, and developers will be choosing their alliances among these players in the coming months and years.
P2P at Work
Setting aside the power struggle over which companies will provide the programming languages and infrastructure, P2P is already proving useful to business. Instant messaging is perhaps the best example of a P2P application that has been rapidly adopted by businesses as a central communications tool. IDC expects the number of corporate users of IM programs to grow from 5.5 million this year to more than 180 million in 2004. But not many people really think of IM as a P2P app.
As it happens, several leading P2P companies began work years before Napster popped up to give the music industry night sweats. “The people who are actually interested in peer-to-peer are providing solutions to problems,” Intel’s Knighten said. “They don’t really care that it’s peer-to-peer computing per se.”
Groove Networks, which sells a P2P collaboration application, is indicative of this phenomenon. Ray Ozzie, the developer who created Lotus Development’s Notes, formed Groove in 1997, inspired by the P2P capabilities he observed in the Doom death matches his teen-age son was playing with his friends over the Internet.
When the company started scouting prospective customers last year, it faced potentially deal-killing resistance from IT people, said Andrew Mahon, Groove’s director of strategic marketing. They had heard about P2P in the context of Napster, and were wary about introducing that sort of thing into their networks. They didn’t want to lose control.
“The IT folks would come into the meeting with their arms folded,” Mahon said. “They’d have seen the research about security and other concerns about peer-to-peer. And that’s natural, because the IT guys are the guardians of the infrastructure and they want to be responsive to the organization, but they don’t want to wreck the infrastructure for short-term advantages.”
Once the Groove salesperson walked them through the authentication and encryption features built into the application, the IT managers stopped worrying and started thinking of how they could use Groove. “We don’t consider IT a stumbling block that must be appeased,” Mahon said. “We see them as a customer.”
Groove has also discovered that P2P alone does not sell a product — unlike Web technology four or five years ago, which some business managers scrambled madly to adopt regardless of its business value.
“There isn’t a sword hanging over their heads, and their boss saying, ‘If you don’t get your peer-to-peer strategy in place in the next six months, you’re fired,’” Mahon said. “In terms of the marketplace, what we’ve learned is that if there’s no immediately apparent business value, there’s no conversation.”
Another example of a successful young P2P company is Oculus Technologies, founded in June 1999 with the goal of improving product development processes. Chris Williams, Oculus’ president and CEO, said his company ended up using P2P because it was the best way to connect the members of product development teams. Oculus’ CO system connects information from disparate desktop applications: For example, it can let a designer link a single piece of data from a computer-aided design program to an Excel spreadsheet on the product manager’s desktop.
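The underlying pattern resembles a publish-and-subscribe link on a single piece of data, loosely sketched in Python below. This is an illustration of the concept only, not Oculus’ actual CO protocol; all the names are invented.

```python
# A loose sketch of linking one piece of data between applications.
# Hypothetical names throughout; this is not Oculus' actual CO system.
from typing import Callable, List

class LinkedValue:
    """A single named datum; when the owner updates it, subscribers hear."""
    def __init__(self, name: str, value: float):
        self.name = name
        self.value = value
        self.subscribers: List[Callable[[float], None]] = []

    def subscribe(self, callback: Callable[[float], None]) -> None:
        self.subscribers.append(callback)
        callback(self.value)  # push the current value immediately

    def update(self, value: float) -> None:
        self.value = value
        for callback in self.subscribers:
            callback(value)  # e.g., rewrite a spreadsheet cell elsewhere

# The designer's CAD program owns the value; the product manager's
# spreadsheet subscribes to it, so a CAD edit propagates peer to peer.
wing_span = LinkedValue("wing_span_mm", 1520.0)
wing_span.subscribe(lambda v: print(f"spreadsheet cell B2 <- {v}"))
wing_span.update(1535.0)
```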
“Customers are looking for solutions. They don’t want to hear that I do peer-to-peer. They want to hear that I solve a problem,” Williams said, adding with a chuckle, “We mainly use peer-to-peer when we talk to the media and the financial community.”
Oculus customers include United Technologies and Ford Motor, which is also the privately held company’s largest investor.
Other companies with P2P-based products have had success in addressing customers’ specific IT needs, often in vertical industry segments. Consilient, based in Berkeley, Calif., has been working with BP Amoco to deploy Consilient’s Sitelets, portable information containers that move around the network in a P2P manner, to accelerate such business processes as consolidating financial data from BP offices around the world.
Boston start-up WorldStreet has tailored its P2P system to the financial community — again, to solve a very particular information problem. The issue WorldStreet’s prospective customers face is this: Analysts are bombarded with hundreds of e-mail messages, without any indication of how critical the information actually is. WorldStreet’s plug-in for Microsoft Outlook lets an investment bank’s customers, for example, decide what kind of research they want to receive, for which companies and from which financial analysts, among other parameters.
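Conceptually, each recipient publishes a profile, and incoming research is matched against it before it ever reaches the inbox. The Python sketch below illustrates that kind of filter; the field names are invented, not WorldStreet’s schema.

```python
# A hedged sketch of a research-subscription filter; the field names are
# invented for illustration, not WorldStreet's actual schema.
from dataclasses import dataclass, field
from typing import Dict, Set

@dataclass
class ResearchProfile:
    companies: Set[str] = field(default_factory=set)  # tickers I follow
    analysts: Set[str] = field(default_factory=set)   # authors I trust
    topics: Set[str] = field(default_factory=set)     # empty means "any topic"

    def wants(self, note: Dict[str, str]) -> bool:
        """Accept a research note only if it matches what I asked for."""
        return (note["company"] in self.companies
                and note["analyst"] in self.analysts
                and (not self.topics or note["topic"] in self.topics))

profile = ResearchProfile(companies={"INTC"}, analysts={"j_doe"})
inbox = [
    {"company": "INTC", "analyst": "j_doe", "topic": "earnings"},
    {"company": "MSFT", "analyst": "a_roe", "topic": "upgrade"},
]
relevant = [n for n in inbox if profile.wants(n)]
print(relevant)  # only the INTC note survives the filter
```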
“Relationships aren’t static — that’s why a portal or Web site doesn’t make sense,” said Rod Hodgman, WorldStreet’s chief operating officer. “What’s different about our P2P product is that it’s a completely balanced relationship. You can set up profiles to accept only the information you care about. It’s information per your specification.”
Bear, Stearns & Co. last month started using WorldStreet in a pilot project with about a dozen employees. Stanley Sakellson, the firm’s senior managing director of institutional equities, doesn’t see WorldStreet’s P2P technology as an anarchic force introducing complexity into the network; rather, he sees it as reducing existing data overload problems.
“A typical portfolio manager gets 400 to 500 e-mails a day,” Sakellson said. “What WorldStreet does is filter the information to determine whether something is really pertinent. Machines are speeding the information delivery to people, and it will be machines that filter the information.”