Forces That Shape IT
If the enterprise IT stack were a physical structure, its architecture would not be described by a list of its rooms. To understand how that building could support the activities of an enterprise, one would need to know the sizes of the rooms, their equipment and special facilities, and their connections to one another and to the outside world.
In the IT architecture, applications are the rooms in the building, and their arrangement in relation to one another—their production and consumption of data, their need for storage and bandwidth and computational resources, as well as other interactions—determines whether that architecture is a pleasing, unified whole or an awkward arrangement that fails to do important things well.
New application development methods and tools, like new building materials and construction methods, challenge IT architects to learn—or to become historians and curators rather than developers and builders.
To understand the logic of a building, one needs to appreciate—if not necessarily accept—the assumptions that shaped its design.
Why is one enterprise housed on a parklike campus of low-profile buildings while another occupies a monolithic high-rise? The cost of land, nature of the enterprise work force, local geography and kinds of transportation available in an area affect these choices—just as an IT architecture is shaped by the costs of storage and bandwidth; the lifetime of enterprise applications; and the need to balance such competing factors as security, ease of support and computing price/performance.
The architecture of old IT was shaped by expensive computers, obscure and time-consuming application development tools, application-specific data formats and vendor-specific communication protocols—all working within a closed environment in which anyone with access was trusted.
Many of these influences now distort and weaken enterprise infrastructures. When the IT environment changes, an architecture that once made sense becomes an expensive monument to what used to be; that once-impressive IT monument may fail to accommodate fundamental changes. An IT architecture need not adjust to every passing fad; there are such things as enduring, even classic, designs that work because they're right, even if they're old. But there have also been breakthroughs in the flexible use of data, in the ease of real-time collection and analysis, and in the pervasiveness of network connections: These enable or even compel corresponding changes in the ways that the enterprise envisions the spaces of application function or the arrangements of those functions into competitive capability.
Next Investment Wave
As the next wave of IT investment approaches, the emergence of Web services is driving application development technologies and tools through their most rapid change in decades. Developers no longer enjoy the luxury of specialization; the disciplines of business logic development, database administration, user interface design and network optimization are arguably blending into a single portfolio, even as they become the front door rather than the back office of the enterprise. Faced with such changes, it's especially urgent for enterprise IT builders to adopt an architect's perspective.
Being an IT professional in changing times doesn't just mean wielding the steel and concrete of source code and servers and network infrastructure; it also calls for leadership in conceiving and creating the spaces of information and action that, in turn, become the playing field—or the battlefield—for enterprise opportunity.
The entry of PCs into enterprise applications replaced old assumptions with new realities. The previous assumptions were that computers were expensive, that programming was difficult and time-consuming, and that information needs were stable and similar throughout the enterprise, with resources protected by a physical perimeter of trust.
PC technologies made computer investments more granular, as if an enterprise suddenly found itself able to add small buildings to a campus instead of needing to build new space in skyscraper-size increments.
Low-cost programming tools, such as the spreadsheet or Borland Software Corp.'s breakthrough Turbo Pascal, encouraged departments and individuals to build short-lived applications to answer urgent or rapidly changing questions—as if office planners had suddenly discovered movable partitions and the resulting ability to rearrange office space without calling in a team of designers and contractors.
With this experience in mind, consider the emergence of Web services in similar terms. The spaces of enterprise applications have long been furnished with the equivalent of custom-built cabinets and fixtures; lip service was paid to the goal of software reuse, but it was difficult for developers even within a single organization to identify and share code that performed common functions.
Object-oriented technologies improved the definition of function and interaction among the modules of any given application, and they enabled some commercial traffic in modules for such horizontal needs as I/O and other universal elements of applications. Only toward the turn of the new century, though, did modules become self-disclosing—able to answer the question, “Can you do this for me?”
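That idea of self-disclosure can be sketched in a few lines of code. This is a hypothetical illustration, not any specific Web-services API: the class name, operations and `describe` method are invented for the example, standing in for the role that service description formats such as WSDL play in real Web services.

```python
# Hypothetical sketch: a module that advertises its own operations,
# so a caller can ask at runtime, "Can you do this for me?"

class InvoiceService:
    """A self-disclosing module: it can describe what it offers."""

    def describe(self):
        # In real Web services, this role is played by a service
        # description (e.g., WSDL); here, a plain dict of operation
        # names mapped to their expected parameters.
        return {
            "create_invoice": ["customer_id", "amount"],
            "get_invoice_status": ["invoice_id"],
        }

    def create_invoice(self, customer_id, amount):
        return {"customer_id": customer_id, "amount": amount}


def can_you_do_this(service, operation):
    """Discover, at runtime, whether a service offers an operation."""
    return operation in service.describe()


svc = InvoiceService()
print(can_you_do_this(svc, "create_invoice"))  # True
print(can_you_do_this(svc, "cancel_invoice"))  # False
```

The consumer never needs the producer's source code or documentation in advance; it interrogates the service itself, which is the shift the passage above describes.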
Within those application spaces, purpose-built data representations required a hodgepodge of specialized data connections—as if the Xerox copiers in one department needed a 170-volt power outlet, while the Minolta machines in another needed 190 volts (but used a confusingly similar power plug).
The transforming effect of Web services amounts to going through the offices; identifying common components of what people need and do; and buying the needed items of furniture and equipment in quantity, to be combined as needed in any given space—a file cabinet for an engineer here, three file cabinets for the corporate attorney there. All the new appliances use the same basic utilities and connections, or provide appropriate adapters or transformers as needed, instead of requiring redundant systems throughout an entire building.
Pervasive open networks have also opened the IT office space to the outside world, broadening workers' access to resources but also exposing them to unprecedented threats.
The open-plan office of networked IT is a poor risk unless there's good security at the points of outside entry, as well as a clean-desk policy that prevents casual and inappropriate access to sensitive information. This is analogous to effective network security at the physical layers, robust design at the application layers and defense in depth for enterprise data.
Endless discussion has addressed the need for boundary security, but intrinsic flaws in applications leave administrators with an unattractive choice: either shutting their doors, or leaving them open to skilled invaders as well as increasingly nervous customers.
The standardization of protocols and data formats has converged on what Microsoft Corp., under the rubric of Windows DNA, identified as the tripod of application integration: TCP/IP for transport, HTTP for interaction and XML for representation. The last of these, XML, challenges development toolmakers to integrate flexible and powerful tools for XML authoring, inspection and dynamic transformation into their tool sets.
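The "XML for representation" leg of that tripod can be illustrated with a minimal sketch using Python's standard library. The element and attribute names here are invented for the example; in a real Web service, the document built below would travel over HTTP, which in turn rides on TCP/IP.

```python
# Minimal sketch of XML as a vendor-neutral data representation,
# using only Python's standard library (element names are invented).
import xml.etree.ElementTree as ET

# Producer side: build an XML representation of an order.
order = ET.Element("order", id="1001")
ET.SubElement(order, "customer").text = "Acme Corp."
ET.SubElement(order, "amount", currency="USD").text = "249.95"
payload = ET.tostring(order, encoding="unicode")

# Consumer side: any platform that speaks XML can parse the payload,
# with no knowledge of the producer's language, vendor or tools.
parsed = ET.fromstring(payload)
print(parsed.get("id"))                       # 1001
print(parsed.find("customer").text)           # Acme Corp.
print(parsed.find("amount").get("currency"))  # USD
```

The point is the decoupling: producer and consumer share only the document format, not a binary layout, a programming language or a vendor's protocol.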
Borland's Delphi, in the tradition of Turbo Pascal, was one of the first to put XML facilities in a high-productivity development system, but many shops want a tool that isn't tied to any single language—Pascal-based or otherwise. Altova Inc.'s XMLSpy 5 has earned eWeek Labs' Analyst's Choice honors by combining versatile editing power with broad support for varied database platforms, multiplying that power by easing integration with Java and C++ programming. We haven't yet seen an integrated environment whose XML tools were in the same class, although Oracle Corp.'s JDeveloper deserves mention for its use of XML throughout its own structure for ease of customization. Emphasizing the breakdown of traditional discipline boundaries, tools such as Corel Corp.'s Ventura 10 combine programmable XML transformations with more traditional publishing power. Developers need to be open to new aids like these.
With services being produced and consumed across the enterprise boundary, modeling and testing take on new importance. For example, Microsoft's Visual Studio .Net deserves praise for its integration of Web service hosting and testing facilities into the development cycle, and tools such as Popkin Software and Systems Inc.'s System Architect offer more capable process simulation functions.
As information and function interact up and down the entire supply chain, it's vital for application code to have internal controls on what resources should be available to what tasks. Mainframe developers may get a grim satisfaction from the industry's painful rediscovery of disciplines that seemed to have been left behind by the luxury of "one user, one machine," but the security issues today are more complex than ever.
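Those internal controls can be sketched as an explicit task-to-resource permission check inside the application itself. This is a hedged illustration with invented names (the task labels, resource strings and `require` helper are all hypothetical), standing in for the kind of fine-grained, in-code authorization that platforms such as J2EE and .Net provide.

```python
# Hypothetical sketch of internal resource controls: each task carries
# an explicit set of resources it may touch, and the code checks that
# set before acting, rather than trusting the network perimeter alone.

class AccessDenied(Exception):
    pass

# Which resources each task is permitted to use (names invented).
TASK_PERMISSIONS = {
    "report_job": {"sales_db:read"},
    "billing_job": {"sales_db:read", "invoice_queue:write"},
}

def require(task, resource):
    """Raise AccessDenied unless the task holds the named permission."""
    if resource not in TASK_PERMISSIONS.get(task, set()):
        raise AccessDenied(f"{task} may not use {resource}")

def post_invoice(task):
    # The check lives inside the application code, not at the boundary.
    require(task, "invoice_queue:write")
    return "posted"

print(post_invoice("billing_job"))  # posted
try:
    post_invoice("report_job")
except AccessDenied:
    print("denied")                 # denied
```

The design point is that authorization is enforced per task and per resource at the point of use, so a breach of the outer perimeter does not grant every task every capability.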
The security that's always been available in Java, and that's far more accessible and easy to tailor in Java 2 Enterprise Edition than it was upon Java's debut, sets the standard—but the security features available to developers on Microsoft's .Net platform deserve more attention than they've gotten in all the confusion created by Microsoft's less-than-clear positioning of the .Net brand.
Using these new tools to build applications is not the same thing as building a new architecture, any more than a Gothic cathedral built of steel is a skyscraper. Anyone could look at the latter structure and recognize inappropriate, inefficient use of materials, but it takes more effort to look behind the façade of enterprise IT and appreciate the new freedoms—and the new constraints—that will shape the skyline to come.
Technology Editor Peter Coffee can be reached at [email protected].