LAS VEGAS -- Few IT managers ever get the opportunity to put together a well-written, well-researched strategy with a detailed architecture for building a brand-new data center.
What is far more likely in the real world is that a manager will inherit an older data center, one in which the inventory grows over time through business initiatives, mergers and acquisitions, shadow IT, and general business growth.
Then, depending on budgetary allowances, the manager may occasionally start a project to rationalize the value of the data center, typically driven by the large fiscal-year savings that can be achieved by consolidating servers with virtualization software.
This scenario was reported as typical in a survey taken at the November 2007 Gartner Data Center Conference, where nearly three-quarters of attendees said they were in the midst of a physical data center consolidation project. That remains pretty much the status quo today.
At the Gartner Data Center Conference Dec. 2 at the MGM Grand Hotel, Gartner Research analysts Donna Scott and Paul McGuckin offered their perspectives on what factors and best practices should be considered in developing a data center strategy and architecture, while balancing risk, cost, quality and agility.
Understanding Tiered Applications Is Important
“The data center strategy and architecture define the facilities that host IT services and the strategy for the placement of those services,” Scott told about 2,500 attendees in a morning presentation. “It also defines the resiliency strategy (for example, for service outages, site outages, data corruption and so forth).
“The strategy and architecture is dependent on an understanding of the tiers of IT service criticality and associated service-level agreements. Most organizations have between three and five criticality tiers, with the highest tiers for their most mission-critical services with the most stringent service levels and the lowest tiers for the less-critical services with lower service levels.”
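To make the tiering idea concrete, it can be captured as a simple service-catalog mapping from criticality tier to SLA targets. The sketch below is a hypothetical Python illustration; the tier count, availability figures and recovery objectives are assumptions chosen for demonstration, not figures from the presentation.

```python
# A minimal sketch of a tiered service catalog; the specific SLA numbers
# are illustrative assumptions, not Gartner's guidance.
from dataclasses import dataclass

@dataclass
class TierSLA:
    availability_pct: float  # target uptime, e.g. 99.99
    rto_hours: float         # recovery time objective
    rpo_hours: float         # recovery point objective

# Example four-tier model: Tier 1 is the most mission-critical.
TIERS = {
    1: TierSLA(availability_pct=99.99, rto_hours=1,  rpo_hours=0.25),
    2: TierSLA(availability_pct=99.9,  rto_hours=4,  rpo_hours=1),
    3: TierSLA(availability_pct=99.5,  rto_hours=24, rpo_hours=4),
    4: TierSLA(availability_pct=99.0,  rto_hours=72, rpo_hours=24),
}

# Classifying a service then amounts to assigning it a tier and inheriting
# that tier's availability and recovery targets.
payment_processing_tier = 1
print(TIERS[payment_processing_tier])
```

In practice the resiliency strategy Scott describes follows from this mapping: the highest tiers justify redundant sites and rapid recovery, while the lowest tiers can tolerate longer outages at lower cost.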
When developing the strategy and architecture, an "end-state" target architecture may be defined, Scott said. However, executing that strategy typically spans many years, because building and consolidating data centers and migrating services cannot be done quickly.
“An integral part of any strategy-building exercise is understanding key market, business and technology trends, assessing their implications for your enterprise, and using them to build an intelligent plan that aligns with the business strategy, growth and risk profile,” McGuckin said.
A strategic imperative, McGuckin said, is to assess and tier IT services according to mission criticality, identifying availability and recovery SLAs and strategies for achieving them.
Design Network, Applications to Work Together
“Another fundamental shift that has happened as a result of these new environments is the requirement to look carefully at the intersection between the application and network,” McGuckin said. “In the past, it was possible to design the network and application solutions independent of each other. That wasn’t the recommended approach, but you could get away with this in nearly every situation.
“That is no longer true. The application development [and] deployment team can do everything correctly, as can the network architect, but application performance problems will still be the norm in a global real-time business process.”
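One way to see why both teams can do everything "correctly" and still produce a slow application is a back-of-the-envelope response-time calculation: the number of round trips per transaction multiplies directly with WAN latency. The figures in the sketch below are assumptions for illustration only.

```python
# A back-of-the-envelope sketch of why application and network design
# interact: end-to-end time is dominated by (round trips x WAN latency),
# so a design that is fine on a LAN can be unusable across a global link.
# All figures below are illustrative assumptions.

def response_time_ms(round_trips: int, rtt_ms: float, server_ms: float) -> float:
    """Approximate user-perceived response time for one transaction."""
    return round_trips * rtt_ms + server_ms

CHATTY_APP = 40   # e.g. many small requests per screen
BATCHED_APP = 4   # requests consolidated by the application team

for label, rtt in [("LAN (~1 ms RTT)", 1), ("Global WAN (~250 ms RTT)", 250)]:
    chatty = response_time_ms(CHATTY_APP, rtt, server_ms=100)
    batched = response_time_ms(BATCHED_APP, rtt, server_ms=100)
    print(f"{label}: chatty {chatty:.0f} ms, batched {batched:.0f} ms")
```

Under these assumed numbers, the chatty design is indistinguishable from the batched one on a LAN but roughly ten times slower across a global WAN, which is exactly the kind of gap that only appears when application and network are considered together.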
It also must be noted that newer application environments, including browser-based thin-client interfaces, XML-based SOA (service-oriented architecture) and AJAX, are not optimized for the network. Even though these protocols emerged from the development of the Internet, they still must be optimized to run on it, McGuckin said.
“These environments make accessing user clients and resources across the network very simple from an application development perspective. However, making problems go away for one constituency tends to burden others. In the case of these new protocols, the burden falls onto the infrastructure: server and network,” he said.
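A small sketch of the verbosity these environments can push onto the wire: the two payloads below carry the same order record, once in a SOAP-style XML envelope and once as compact JSON. Both payloads are hypothetical examples, not a real service's schema.

```python
# Compare the on-the-wire size of the same record in two encodings.
import json

xml_payload = (
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body><GetOrderResponse>"
    "<OrderId>12345</OrderId><Status>SHIPPED</Status><Total>99.50</Total>"
    "</GetOrderResponse></soap:Body></soap:Envelope>"
)

json_payload = json.dumps({"orderId": 12345, "status": "SHIPPED", "total": 99.50})

print(f"XML/SOAP-style payload: {len(xml_payload.encode())} bytes")
print(f"Compact JSON payload:   {len(json_payload.encode())} bytes")
# The per-message difference is small, but multiplied across millions of
# service calls it becomes load the network and servers must absorb.
```

This is the sense in which the development-side convenience of these protocols shifts cost onto the infrastructure teams, who must absorb it with WAN optimization, caching or more server capacity.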
A tactical guideline put forth by Scott and McGuckin: an IT manager should always evaluate the security and privacy profile of supplier countries, choosing a country and provider only when certain that the risks involved can be specifically mitigated.
The Gartner Data Center Conference continues through Dec. 5.