Client/server computing was a technology ahead of its time—and more importantly ahead of the enterprise network infrastructure capabilities of the time.
While vendors and users largely agreed on the value of the client/server model and on the best ways to implement it, hardware and software vendors’ desire to lock users into their product lines, along with the first stirrings of Internet and World Wide Web standards, made for an interesting time in the technology reporting business.
The client/server model emerged during the boom years of the PC industry. It helped fuel the adoption of Novell NetWare as a way to harness the growing power of distributed personal computers alongside central servers that delivered application management, data and storage, all while trying to bring some adult supervision to the unbridled demands for PC access to data center resources.
Those demands came flooding in from corporate desktops connected to local-area networks that were fairly speedy for the time, as well as from laptop users dialing in via modems. The concept behind client/server computing was solid: an extension of what had been happening with mainframes and dumb terminals ever since the first command-line cursor started blinking.
The idea was that if you could distribute the compute workload, you could reach a broader computing audience, get results to the people doing the work and—as was said at the time—move intelligence to the computer periphery.
However, the client/server era withered due to its own success. Think of 100 people editing documents both within corporate offices and on the road via modem. Tracking changes, setting levels of access priority and solving that old database problem of handling changes from many people working on one document, and one person working on many documents, quickly produced client/server overload.
Take that simple document example and scale it up to enterprise resource planning applications or financial systems, and the infrastructure sagged and often simply came to a standstill. Now take those workloads and try to mesh them with other, incompatible systems from vendors that weren’t in a sharing mood, and you can see why the 1990s version of client/server computing wasn’t going to scale.
When you look at the client/server-related articles in the ’90s, you’ll find lots of arguments about networking and application standards, as well as questions about whether PC operating systems were really enterprise-ready and whether convoluted licensing practices were holding back the advent of robust enterprise applications. With only a few new buzzwords inserted, the arguments sound very much like those of today concerning cloud computing.
A 1995 article looking at the differing client/server visions of Microsoft and IBM “focuses on the approaches of International Business Machines Corp. and Microsoft Corp. in client/server computing.” Another article from the same year compared “Microsoft’s plans to build, buy or license a transaction-processing monitor for Windows NT; IBM’s Customer Information Control System TP monitor; and Novell Inc.’s SuperNOS strategy.”
The Wikipedia.org entry on client/server has a decent definition: “The client/server model is a distributed application structure in computing that partitions tasks or workloads between the providers of a resource or service, called servers, and service requesters, called clients. Often, clients and servers communicate over a computer network on separate hardware, but both client and server may reside in the same system.
“A server is a host that is running one or more server programs which share their resources with clients,” according to Wikipedia. “A client does not share any of its resources, but requests a server’s content or service function. Clients, therefore, initiate communication sessions with servers which await incoming requests.”
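The request/response pattern Wikipedia describes can be sketched in a few lines of Python. This is an illustrative example under assumed names (the port, function names and canned “resource” are invented for the demo, not drawn from the article): the server shares a resource and waits for incoming requests, while the client initiates the session.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9123   # hypothetical local address for the demo
ready = threading.Event()        # lets the client wait until the server is listening

def serve_once():
    """Server role: share a resource (here, a canned reply) with one client."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                     # server is now awaiting incoming requests
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024)   # read the client's request
            conn.sendall(b"resource for: " + request)

def request_resource(payload: bytes) -> bytes:
    """Client role: initiate the session and request the server's content."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(payload)
        return cli.recv(1024)

server = threading.Thread(target=serve_once)
server.start()
ready.wait()
reply = request_resource(b"report.doc")
server.join()
```

Note the asymmetry the definition emphasizes: only the client initiates; the server does nothing but wait, then respond.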
eWEEK 30: Client/Server Model Withered Under Cloud Computing Shadow
Enterprises remained committed to the client/server computing model well into the new millennium. However, the founding of Salesforce.com in 1999 was emblematic of the first stirrings of a major shift in enterprise computing from client/server to cloud computing.
The company’s founder and CEO Marc Benioff, who initially came to prominence developing the first Windows client/server applications for Oracle in the early 1990s, told anyone who would listen that client/server computing as it was originally conceived was a technological dead end. The smart and efficient way to go, Benioff said, was to move to “multi-tenant” cloud computing applications in which a single instance of the software runs on servers to provide application access to a multitude of computers via the Internet.
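The multi-tenant idea Benioff championed, a single running instance that serves many customers while each sees only its own data, can be sketched roughly as follows. The class and method names here are illustrative assumptions, not Salesforce.com’s actual design:

```python
from collections import defaultdict

class MultiTenantCRM:
    """One application instance whose data is partitioned by tenant ID.

    Hypothetical sketch: every operation is scoped to a tenant, so many
    customers share the same running software without seeing each other's data.
    """

    def __init__(self):
        self._records = defaultdict(list)   # tenant_id -> that tenant's rows

    def add_contact(self, tenant_id: str, name: str) -> None:
        self._records[tenant_id].append(name)

    def contacts(self, tenant_id: str) -> list:
        # Queries are always filtered by the caller's tenant ID.
        return list(self._records[tenant_id])

crm = MultiTenantCRM()            # a single shared instance in the cloud
crm.add_contact("acme", "Ada")    # two different customers, one instance
crm.add_contact("globex", "Grace")
```

The design point is that upgrading or patching this one instance upgrades every customer at once, which is what eliminated the per-client software installs described below.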
Salesforce.com brought its customer relationship management software to market with a motto that highlighted another benefit of cloud computing: “no software.” This meant that since the application ran entirely on the Internet and could be accessed with a Web browser at any time of day, there was no need to install application software on the clients. Furthermore, desktop administrators didn’t have to worry about application updates because all software upgrades took place in the cloud, not on the client.
Thus began the campaign by the emerging cloud computing companies to convert enterprises from applications installed on-premises to “software-as-a-service” apps running in the cloud. It took time for enterprises to trust cloud computing as an alternative to client/server because they worried about whether the applications, and more importantly their corporate data, would be secure when stored on the Web. But the rapid growth of Salesforce.com and companies like it convinced many CIOs that it was safe to at least try out cloud computing.
The steady adoption of the Google Apps cloud productivity applications, along with the growth of other cloud software companies such as NetSuite, Workday, SugarCRM, MySQL and RightNow Technologies (which was acquired by Oracle in October 2011), helped convince enterprise IT managers that cloud computing was a safe bet.
Another factor was the steady increase in the power of mobile phones, especially the introduction of versatile smartphones such as the Apple iPhone in 2007. These phones were now powerful enough to access business applications running in the cloud, so business users gradually went beyond using their smartphones just to check email, manage appointments or send text messages. They began using them to access sales applications, confirm travel arrangements and update database applications, as well as for many other tasks on the Web.
These developments didn’t keep Microsoft, IBM, Oracle, CA Technologies and many other companies from continuing to release new versions of their client/server products into the second decade of the new millennium. But many of those products have since moved fully or partly to the cloud.
Today, while many enterprises are still using legacy client/server applications, there isn’t much discussion, investment or development of new client/server apps. The proof can be found in a search of Google News, where most of the current discussion about client/server focuses on moving aging client/server apps to the cloud.
Eric Lundquist is a technology analyst at Ziff Brothers Investments, a private investment firm. Lundquist, who was editor-in-chief at eWEEK (previously PC WEEK) from 1996 to 2008, authored this article for eWEEK to share his thoughts on technology, products and services. No investment advice is offered in this article. All duties are disclaimed. Lundquist works separately for a private investment firm, which may at any time invest in companies whose products are discussed in this article, and no disclosure of securities transactions will be made.
eWEEK Editor in Chief John Pallatto contributed to this article.