Application developers have been fighting an escalating war against chaos in IT stacks of ever-growing complexity. Utility computing, with its highly distributed systems, must make a final assault.
The problem surfaced in 1965, when the Univac 1108A appeared on the scene as the first multiprocessor computer. That machine introduced a new low-level computer instruction—the test-and-set operation—to ensure that a processor could read a memory location and change its value before any other processor could attempt its own alteration of that data.
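The mechanism is simple to sketch, although real hardware executes the read and the write as one indivisible instruction. In the illustrative Python below, a threading.Lock stands in for that hardware atomicity, and the SpinLock class is invented for the example:

    import threading

    class SpinLock:
        # Illustrative only: real hardware performs the read and the
        # write as one indivisible instruction, while this sketch uses
        # a threading.Lock to stand in for that atomicity.
        def __init__(self):
            self._flag = False
            self._guard = threading.Lock()

        def _test_and_set(self):
            # Return the old value of the flag and leave it set,
            # as a single atomic step.
            with self._guard:
                old, self._flag = self._flag, True
                return old

        def acquire(self):
            # Spin until the flag was clear at the moment we set it.
            while self._test_and_set():
                pass

        def release(self):
            with self._guard:
                self._flag = False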
The challenge is grueling even in systems where a developer has full knowledge of the hardware configuration and only a known portfolio of applications, inputs and outputs to manage. The contest rises to a much higher level when utility models run in dynamic environments, such as those of a grid computing arrangement.
Success in such a situation depends on preserving “loose coupling,” as developers term the technologies that minimize the risk of violating an application's crucial assumptions.
That goal of staying loose is probably not reachable by adding still more complexity to hardware, in the manner of the Univac, or to applications, in the manner of frameworks such as Microsoft Corp.'s .Net. It may require instead a shift of complexity from the nodes of the network to the fabric of service delivery.
“We're seeing significant moves by companies wanting to adopt loosely coupled, service-based ways of achieving their goals,” said Frank Martinez, chief technology officer at service fabric provider Blue Titan Software Inc., in San Francisco.
Martinez warned, however, that loose coupling requires consistency and encouragement: “That's surprisingly hard to maintain,” he said.
Application development frameworks, Martinez observed, generate much of an application's code and create many of its resource dependencies behind the scenes. “The tools do a great job of abstracting complexity,” he said. “There's tremendous uptake of frameworks, but they lead to tightly coupled points in an otherwise loosely coupled architecture.”
To counter this tendency, Martinez advised developers to look at widely used standards such as SOAP (Simple Object Access Protocol). SOAP supports the notion of intermediaries as a first-order principle, he said, which enables developers to ensure loose coupling in ways that would not otherwise be possible.
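The following sketch suggests how that works at the message level, assuming a SOAP 1.1 envelope; the gateway URLs, the route header and the getQuote operation are invented for illustration. A header block addressed to an intermediary through the actor attribute can be processed and removed in transit, so intermediaries can be added or swapped without touching either end point:

    import xml.etree.ElementTree as ET

    # Hypothetical SOAP 1.1 envelope: the <route> header block is
    # addressed to an intermediary through the soap:actor attribute,
    # so a gateway can process and strip it while the body travels on
    # untouched. All URLs, elements and the operation are invented.
    ENVELOPE = """\
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Header>
        <route xmlns="urn:example:routing"
               soap:actor="http://example.com/gateway"
               soap:mustUnderstand="1">
          <via>http://example.com/audit</via>
        </route>
      </soap:Header>
      <soap:Body>
        <getQuote xmlns="urn:example:quotes">
          <symbol>MSFT</symbol>
        </getQuote>
      </soap:Body>
    </soap:Envelope>
    """

    ET.fromstring(ENVELOPE)  # well-formed; the final receiver never
                             # needs to understand the routing header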
A fundamental rule for maintaining flexibility is to ensure that interaction patterns are based on message exchanges and that those exchanges are interoperable across many platforms. “That's really the key—that the interactions are messaging-based,” Martinez said.
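A minimal sketch of that idea, with queue names and message fields invented for the example: the requester and the service below share only a message format, never each other's code, so either side can be replaced as long as the messages stay compatible.

    import json
    import queue
    import threading

    requests, replies = queue.Queue(), queue.Queue()

    def service():
        # The service sees only self-describing messages, never the
        # caller's objects or call signatures.
        while True:
            msg = json.loads(requests.get())
            if msg["op"] == "stop":
                return
            replies.put(json.dumps({"id": msg["id"],
                                    "sum": msg["a"] + msg["b"]}))

    threading.Thread(target=service, daemon=True).start()
    requests.put(json.dumps({"id": 1, "op": "add", "a": 2, "b": 3}))
    print(json.loads(replies.get()))      # {'id': 1, 'sum': 5}
    requests.put(json.dumps({"id": 2, "op": "stop"}))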
With that kind of perspective achieved, Martinez added, “we need to start moving away from programming and integration in a way that's dependent on code and start programming with contracts. That's what WSDL [Web Services Description Language] is all about. Bind to the functional aspect, keep your code simple and introduce policy-directed change. Dumb end points, smart fabric.”
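As a hedged sketch of that contract-first style, assuming the open-source zeep SOAP library and an invented WSDL URL and GetQuote operation:

    # Assumes the open-source zeep SOAP library (pip install zeep);
    # the WSDL URL and the GetQuote operation are hypothetical. The
    # client binds to the contract the WSDL describes, not to the
    # service's code, so the service can be rehosted or fronted by
    # intermediaries without this call site changing.
    from zeep import Client

    client = Client("http://example.com/quotes?wsdl")
    print(client.service.GetQuote(symbol="MSFT"))

Nothing at the call site names a host, a class or a wire format; all of that lives in the contract and in the fabric beneath it.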
The devices that serve as consumers of other utilities are likewise simple end points for complex networks—just as a plumbing fixture or an electrical appliance is not affected by changes in the details of network structure and function as long as end-point conditions are fulfilled.
The analogy can be carried too far. Bits are not as substitutable as gallons or watt-hours. The simpler the end points, however, the fewer the opportunities for them to make inconsistent assumptions or do things in incompatible ways.
After almost four decades of trying to contain the chaos of too many complex clients sharing a primitive network that doesn't protect them from themselves, application developers should welcome the advent of the utility computing alternative.