Defining Tomorrow's Database
Today's "perfect storm" of sophisticated attacks, corporate governance mandates and public awareness of database security risks demands the prompt and determined response of database builders and application developers. In this report, eWEEK Labs profiles the combination of emerging technologies and developer strategies that enables an enterprise team not merely to survive but also to triumph against such challenges, without forgoing the opportunities that new database deployment options can provide.
There are clear upsides to new database architectures: Grid computing on high-bandwidth networks enables cost-effective and highly scalable analysis, while portable devices with wireless links give better support to decision makers in the field. These same technologies, however, also increase the number of places where an outside attacker might find a vulnerability, or where internal inconsistencies of data format or application development practice might rip apart a system from within.
The agenda of the enterprise database developer is therefore readily defined. Without abandoning costly investments in servers, software and developer skills, the next-generation database must deliver greater analytic capability and rich-media versatility, even while serving users in remote locations who may have only limited bandwidth available to them via intermittent connections.
Tomorrow's database must be able to meet critical needs in more places, must be able to reach those destinations on shorter notice, and must better defend itself against the threats of both malice and misfortune.
To meet these demands, the database platform and application developer must adapt to a changing mission and an evolving environment by devising new combinations of flexibility, speed and survivability. Just as command and control have replaced brute-strength armament as the decisive factors in victory at sea, it's the database developer's growing ability to combine business logic with real-time knowledge that will yield applications providing competitive advantage.
Assets, and clients, at risk
Database vendors would rather talk about their products' increasing capabilities than about the threats that face those products after they're deployed.
Late last month, though, an Australian food company became an unwilling poster child for the current state of database dangers. Several thousand of the company's customers received fraudulent e-mail, warning of a supposed product recall due to infectious contaminants: a complete fabrication, but a convincing forgery of company-initiated e-mail, apparently made possible by successful penetration of the company's customer database.
Investigations were still in progress at this writing, but this kind of incident points to the need to think of database security not only in terms of preserving data against inadvertent disclosure or unauthorized modification but also against deliberate, systematic and damaging misuse.
It's never been more important, therefore, to design security into the database itself, wherever possible, instead of relying on every application that uses that database to be an effective security participant.
Application-level vulnerabilities actually account for the lion's share of enterprise configuration management and incident-response costs, according to estimates by Gartner Inc. presented at September's Gartner Application Development Summit: The Path to Modern AD. A 50 percent reduction in application vulnerabilities prior to deployment, Gartner estimated, would reduce these costs by 75 percent.
With in-house IT head count under unrelenting pressure, and with application maintenance schedules under the control of outsourced development teams rather than local staff whose priorities can be dictated, it becomes all the more vital to eliminate post-deployment surprises.
Limiting Direct Access to Data
As database programming capabilities become more powerful and accessible, enterprise database developers increasingly have the option of limiting applications' direct access to raw data, viewing it only through the filter of a business rule. For example, an application might execute a database-level procedure that performs a test and returns the result to the application, such as a true/false value for "Customer has sufficient available credit," instead of the application requesting and being given the customer's current available credit amount and making the determination itself.
In this way, the database designer can devise and enforce policies on information access rather than leaving those policies to be communicated to, understood by and correctly implemented in the code of many different application developers.
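The pattern described above can be sketched in a few lines. This is a minimal illustration, not a production design: the schema, table and function names are hypothetical, and SQLite's in-process engine stands in for a server-side stored procedure, with the rule living in a data-access layer beside the data. The key point is that callers receive only the yes/no answer, never the raw credit figures.

```python
import sqlite3

# Hypothetical schema: one table of customers with a credit limit and balance.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, credit_limit REAL, balance REAL)"
)
conn.execute("INSERT INTO customers VALUES (1, 5000.0, 4200.0)")

def has_sufficient_credit(customer_id, order_total):
    """Business rule evaluated beside the data; exposes only True/False,
    never the customer's actual credit limit or balance."""
    row = conn.execute(
        "SELECT (credit_limit - balance) >= ? FROM customers WHERE id = ?",
        (order_total, customer_id),
    ).fetchone()
    return bool(row[0]) if row else False

print(has_sufficient_credit(1, 500.0))   # within the 800.0 available
print(has_sufficient_credit(1, 1000.0))  # exceeds it
```

Because the comparison happens inside the query, an application calling has_sufficient_credit() can enforce the policy without ever being trusted with the underlying numbers.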
New regulatory and legislative mandates are ratcheting up the penalties for handling data improperly, but this should not unduly tip the scales in the direction of deciding not to handle it at all. To avoid making a business failure out of a technical success, database developers should take part in business unit discussions of risk and reward.
Responding to pressures such as California's SB 1386 legislation, which mandates public notice of a database security breach, developers might rationally define their database design and operational boundaries on grounds of simply minimizing risk, presenting the resulting decisions as faits accomplis to business units. Development teams should resist this temptation, though, and should instead elevate their participation in higher-level discussions of database and application missions.
Speed and maneuverability
Every extension of database technology into new applications domains increases the demand for speed: not merely average performance but, increasingly, a guaranteed worst-case performance in real-time and services environments.
Developers may be tempted to satisfy the clamor of speed demons by unleashing other demons, specifically, by relaxing security standards. Vendors such as DataMirror Corp., for example, are attempting to resolve that conflict by using increasingly affordable in-memory technologies to demolish key performance barriers without security compromises.
DataMirror's PointBase 5.1, last month's update to that pure-Java (and therefore highly mobile) database platform, incorporates such refinements as memory-based hash joins to address application bottlenecks. It also provides data compression for TCP/IP transfers to improve synchronization performance on intermittent and/or limited-bandwidth connections, which are especially likely when extending database services to remote users.
Data compression is only part of an overall strategy that database developers should apply to minimize the need for continuous or prolonged connection when designing remote-user applications.
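The strategy above combines two ideas: batch local changes so that a connection is needed only briefly, and compress the batch before it crosses the wire. A minimal sketch, using Python's standard zlib and a hypothetical change-log payload (the record layout is invented for illustration and implies nothing about PointBase's actual sync protocol):

```python
import json
import zlib

# Hypothetical offline change log: accumulate edits locally, then
# transmit one compressed batch per connection window instead of
# holding a continuous link open.
changes = [
    {"op": "update", "table": "orders", "id": n, "status": "shipped"}
    for n in range(500)
]

raw = json.dumps(changes).encode("utf-8")
packed = zlib.compress(raw, level=9)  # trade CPU for scarce bandwidth

print(f"raw: {len(raw)} bytes, compressed: {len(packed)} bytes")

# On the server side, the batch decompresses to the identical change list.
restored = json.loads(zlib.decompress(packed).decode("utf-8"))
assert restored == changes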
Distributed user populations are very much the target of another database technology vendor, Adesso, of Boston. Adesso's chairman and chief technology officer, John Landry, spoke with eWEEK Labs for this report. "The most untapped return on investment on the planet is in the field," Landry said. "There's enormous opportunity for productivity enhancement and knowledge enhancement, where right now there's at best e-mail and spreadsheets and at worst still paper-based procedures."
What has held back the broader use of databases in the field, Landry said, is a combination of too little connectivity and too much database heterogeneity. He said that even if a development team can stick to a single database vendor such as Microsoft Corp., as Adesso has so far, the enterprise database, laptop computers and handheld devices will still be using different database products that present different faces to the developer.
Building Platform Versatility
Reviews by eWEEK Labs of tool sets such as Microsoft's Visual Studio support this assessment. These tools attempt to minimize the developer workload associated with these seams in the database fabric, but substantial developer effort is still needed to resolve them.
In-house teams should likewise maintain at least a back-pocket portability strategy, rather than wedding themselves too tightly to the internals of any single platform.
Reliance on stored procedures, although it has the benefit of placing logic as close as possible to associated data, does create performance-measurement challenges as that logic executes on remote platforms.
Late last month, Embarcadero Technologies Inc. announced its Rapid SQL 7.3 cross-platform stored-procedure benchmarking tool, promising developers precise identification of the objects and lines of code that dominate their applications' execution times.
Approaching the problem of database performance from the deployment end is BMC Software Inc. BMC's SmartDBA product will soon provide improved real-time notification of database events to BMC's integrated real-time management console. "Our challenge is to take all the data in different databases and just send what's really needed," said Bill Miller, BMC mainframe business unit general manager, in Houston.
Speeding response time in database operations without increasing administrator head count is also a goal of IBM, with its expanded use of autonomic technologies in DB2 Universal Database Version 8.2, released in September.
The products high-profile features include a Learning Optimizer, dubbed LEO, that attempts to accelerate searches by dynamic analysis of database activity. One might expect this kind of capability to be especially effective in distributed applications with varying data bandwidth between different points in a network. Evaluation of Version 8.2 is under way at eWEEK Labs.
The same technology that makes databases more secure can also make them smarter, instead of frustrating application developers by making data less accessible.
For example, Autodesk Inc. announced late last month a preview program of its new Design Accelerator tools. These tools enrich database capability in the direction of generating design geometries based on functional descriptions. The core of this capability is a database of available components and their engineering characteristics.
Greater analytic capability is also promised as a major emphasis of the long-awaited Microsoft SQL Server 2005, released late last month into what the company is calling a Community Test Program. A third round of formal beta testing is planned for next quarter.
However, Microsoft's vagueness on the subject of actual release dates, and on key questions of database capability within its other products such as Exchange, gives developers new motivation to examine open-source alternatives or to outsource the tracking of the moving target of Microsoft technology to a metaplatform vendor such as Adesso.
A critical question, from the viewpoint of database application developers, is the treatment of business logic as database-resident code or as executable data. Late last month, at the Applied XML Developers Conference in Stevenson, Wash., Microsoft Client Platform Architect Chris Anderson, of Redmond, Wash., said that developers dislike architectures "that force XML to be more than data." Microsoft has promoted instead its XAML (Extensible Application Markup Language) as a way of bringing declarative semantics to the Windows platform.
Adesso's Landry, meanwhile, urges developers to think of database application development as evolving toward code that has all the power of traditional imperative and object-oriented languages but that lives in the database itself.
Towards Greater Reusability
One of the hallmarks we've noted about Microsoft's .Net model, in particular, is the degree to which the code that adds value to a database can enjoy the same administrative convenience and reliable synchronization behaviors that have long been taken for granted for the data that's being stored. This kind of integrated approach is needed to elevate the reusability of code above the level of cut-and-paste and to give it the greater robustness of actual inheritance that passes along the benefits of improvement to all users.
Microsofts next major upgrade of its database and development platform has been long delayed but has shown promise in making substantial strides in this direction, based on eWEEK Labs discussions with developers.
Database developers must also prepare for a quantum jump in the volume and arrival rate of new data from RFID (radio-frequency identification) tagging initiatives that are quickly crossing the line from future vision to present fact.
Widespread RFID adoption has been seemingly just over the horizon for some time, delayed by privacy concerns and supply chain inertia, but there are signs that large-scale deployment may finally be imminent.
Developers should understand that the resulting pressures on database size will not be smooth upward trends but, rather, will be pronounced bursts of activity: for example, in the generation of test data sets for new applications or during the maintenance of parallel systems during major upgrades.
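One practical way to prepare for such bursts is to generate synthetic read data and watch how staging tables and indexes behave under it. The sketch below is purely illustrative: the reader ID, tag format and timing spread are invented stand-ins for whatever a real RFID deployment would produce, such as a pallet of tagged cases passing a dock-door reader in a couple of seconds.

```python
import datetime
import random

def rfid_burst(reader_id, tag_count, start):
    """Yield (tag_id, reader_id, timestamp) tuples simulating one burst
    of reads: many tags seen by one reader within a ~2-second window."""
    for n in range(tag_count):
        offset = datetime.timedelta(milliseconds=random.randint(0, 2000))
        yield (f"EPC-{n:08d}", reader_id, start + offset)

# Simulate a 1,000-tag burst at one reader; in practice these rows would
# be loaded into a staging table to observe insert and index behavior.
burst = list(rfid_burst("DOCK-04", 1000, datetime.datetime(2004, 11, 1, 8, 0)))
print(len(burst), burst[0][0])
```

Replaying many such bursts back to back, rather than a steady trickle of rows, is what exposes the worst-case insert and indexing behavior the article warns about.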
In the stormy environment that faces database developers, the enterprise database is the aircraft carrier of the IT fleet: a formidable concentration of power but also a primary target. These combined characteristics pose a challenge for the strategist who wants to put power where it's most useful but who is thereby forced to place a crucial asset very much in harm's way.
Database developers will do well to remember the proverb that "a ship is safe in its harbor, but that is not what ships are for." It is the job of the developer to make the power of data available to those who need it, not to protect that data to the point that it serves no purpose.
Technology Editor Peter Coffee can be reached at email@example.com.

Developers grapple with new environments, technologies
Expanded mobile connectivity with wireless and ubiquitous broadband service
Scalable processing power using grid-based and other on-demand architectures
Readily repurposed data and improved reusability of code due to XML representation and loosely coupled Web services design
Web resources: Next-generation database development