LAS VEGAS—With the sun shining on a clear day outside the Venetian Hotel here, the site of this week's Veritas Vision conference, Gary Bloom, president and CEO of Veritas Software Corp., envisions a bright future for his storage company in its quest to enable utility computing.
Driving home to the 3,400 conference attendees the theme that storage is a logical place to embrace a utility computing architecture, Veritas is developing products to improve application performance, measurement and service levels across storage environments.
Bloom sat down with eWEEK Senior Writer Brian Fonseca at Veritas Vision to discuss his thoughts on Veritas' strategy, why storage hardware competitors are hamstrung, and the nonstory surrounding information lifecycle management (ILM).
Where do customers stand with the concept of utility computing today?
CIOs are trying to do a couple of things. They're, one, trying to figure out what is hype and what is reality. They're then trying to figure out, does this apply to my business? There were very few CIOs in our executive forum [at Veritas Vision] who didn't see the connection to needing to align business requirements to IT, drive down operational costs and invest in differentiating their business.
With that said, I think our utility strategy, being very practical, not a rip-and-replace approach but one that lets customers move incrementally down the path of utility computing, is resonating extremely well. And the dialogue with our customers is shifting aggressively toward, "How do I get started?"
We've answered the question about the [role of] discover, consolidate, standardize: what are the steps to utility computing. We've answered that one a hundred times in the past few days to help them understand how to get going. And then there is the announcement around us having some new services that really say, "You, Veritas, are becoming a bigger part of our IT plan and my architecture, and you can actually help with that architecture."
How do this week's Veritas Vision product announcements play into Veritas' bigger utility computing picture?
They're all building blocks. So, what we need to do, and what our customers are asking for, is to make all our building blocks two things: one, the best in the market available to them; and two, integrated so they interoperate together. So, we've been working on feature enhancements of all these building blocks; that's why there are product versions coming out.
As you saw with CommandCentral, there's complete integration, not just offering new functionality around service-level management and cost allocations but also integrating all of our management technologies into a common console. Little things like single log-on are huge for customers.
From a management standpoint, are customers looking at Veritas to be their de facto storage management layer into utility computing and beyond?
When you talk to CIOs, they're trying to figure it out. There are so many messages out there, they're confused. They come to events like this to try to sort it out: what's the best answer for me, what makes the most sense. What I think they understand pretty clearly is that there's really nothing wrong with other vendors' strategies, except that the strategies other vendors are articulating apply only to their environments.
And what I think is crystal clear to the CIOs aligning with Veritas is that we have a strategy that works in all environments. In other words, there's nothing wrong with Oracle's 10g strategy, if you're 100 percent an Oracle customer.
Well, most of my customers have some Oracle, they have some DB2, they're thinking about MySQL, they have lots of different things. Most of my customers are not just IBM customers. They're HP customers, they're Sun customers, they're going to be Linux customers.
I think the CIOs are starting to understand there are two approaches they can take. They can align with one vendor or multiple vendors and have multiple strategies, or they can align with a heterogeneous utility computing strategy. Nobody else [besides Veritas] is out there articulating a heterogeneous strategy. Go listen to Oracle's message; there's nothing about what they're going to do for DB2.
How important is application performance to enabling utility computing?
What application performance management does is start giving the customer the view all the way through, the way the user views the system. You can measure the reliability and availability of the storage and the reliability and availability of the server, but if you're the CFO [chief financial officer] of your company and you call up the CIO and you say, "Our financial system running SAP is down," or, "It's so slow, my guys can't post any transactions, and we only have 24 hours left in the quarter" …
How are you going to respond if the CIO comes back and says, "Well, the storage looks like it's running 99.9 percent availability"? That guy is going to look at you and say, "I didn't ask you how the storage pool is doing; the system is too slow!"
So, you need to take the application user's view of it, and you need to drill down through all layers of the stack. That's what our application performance management technology does: end-to-end application diagnostics to identify performance problems.
IT managers are finding themselves accountable for service levels they may not fully control or properly measure. How can Veritas lend a hand through utility computing?
If you look at what the steps are to implement utility computing, number one is discovery. Because every time we talk to a customer about their knowledge of what they're actually doing, we find that most of them don't even know. In most cases they don't measure it, or the technologies weren't there to manage it; they just didn't think about it. It wasn't a priority.
I'm not blaming them, I'm not saying they made a mistake, but they have a lot on their plate, and it's a tough job. I don't take anything away from the complexity of being a CIO in today's world. But what happened during the downturn of the past two years was that those CIOs for the first time went out and measured how much storage they had. They found out they were at 20 to 25 percent utilization and had millions of dollars in unused storage capacity.
That led to a discovery phase: they start consolidating it and using it, and it's become much more of a service at lower cost. That's why that discovery phase is so critical. So what is your service level? Well, if you don't know, a good place to start is measuring your service level, so you know where you need to improve.
You may find out in some cases you're operating a service better than what the consumer of your service is willing to spend money on. It's, "How much am I willing to spend for the service level that meets my requirements?" and making sure, above and beyond that, I don't pay for a service level I don't need.
Storage virtualization is dominating the headlines with new products from IBM and EMC. From a virtualization standpoint, where can customers expect to see Veritas go?
We're very interested in allocating and provisioning server capacity, either on real machines or on a VMware virtualized machine. We're obviously interested in clustering and making those highly available. The reason we picked up the Ejasent technology is that we also want to be responsive to the application.
Meaning, if the application needs to be positioned or moved in order to be provisioned to a different server, you can virtualize and provision a different server and move it. We think that's where the real business requirement is: move applications around, not operating systems. It's a phenomenal opportunity for us.
EMC is aggressively pushing its holistic ILM strategy. You are not overly enamored with the concept. Why?
There is a piece of ILM that is pretty interesting, which is the regulatory compliance piece, the ability to do archival and retrieval. We do that. But I don't understand to this day the tie-in to Documentum. How is an application that manages a document any different from an application that manages your general ledger [GL]? It's just an application.
The fundamental architecture of storage and infrastructure you keep—how you archive and retrieve and how you can do DR for backup and recovery—doesn't change based on whether it's a GL or a document. It's all the same. [EMC's] entire strategy has to be centered around a primary hardware agenda, which is storage. What other options do they have? This is the handicap that the hardware companies have.
They have to think about what's good for their hardware stack, and they cannot offer technologies that are going to make Network Appliance, or IBM or Hitachi, better. Why would EMC want to put out software that improves the performance of Network Appliance, such that Network Appliance can win all the hardware sales? At a cheaper price? It's a business strategy that will never work. They have to preserve their hardware business.
Why is Veritas' building-block approach toward utility computing an advantage to customers?
Building blocks ultimately are products and solutions customers can adopt and implement. They can implement today, get started and participate in a utility computing strategy down the road, while increasing the availability and performance of their business applications today. That means improving the performance of servers, improving the performance of storage, and improving the performance of applications—and making all those elements more available.
The problem with technology, if you go back to the applications environment, is that for a lot of those big SAP implementations, Oracle Financials implementations and Siebel CRM implementations, you had to invest millions of dollars, wait three years for results and hope it worked. With our technology, you invest today and start getting results very quickly.