    Data Domain Leads Growing List of Deduplication Vendors

Written by Matthew Sarrel
Published January 4, 2009


Data deduplication promises to use enterprise storage more efficiently, reducing the need to buy as much media (tape or disk) and, as a result, saving space, power and cooling in the data center. Unfortunately, the term can have almost as many different meanings as there are technologies used to achieve it.

Broadly, the term applies to technologies that analyze data files, find and remove redundant blocks of information, and then apply a compression algorithm, usually gzip or LZ. In general, files that are edited frequently but change little between versions are excellent candidates for deduplication. For this reason, many businesses are turning to deduplication solutions to reduce storage space requirements for backup and archiving of corporate databases, e-mail server message stores and virtual machine images. If your WAN pipes are saturated with such traffic, then you definitely want to keep reading.
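As a rough illustration of the block-level principle, here is a minimal fixed-block deduplicator sketched in Python. The block size and hash choice are illustrative only; real appliances such as Data Domain use more sophisticated variable-length segmentation combined with compression.

```python
import hashlib

def dedupe(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and store each unique block once."""
    store = {}   # hash -> block contents
    recipe = []  # ordered list of hashes needed to reconstruct the data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # only the first copy is kept
        recipe.append(digest)
    return store, recipe

def restore(store, recipe):
    """Rebuild the original byte stream from the block store and recipe."""
    return b"".join(store[d] for d in recipe)

# Two "backups" of the same content: the second copy is fully redundant.
backup = b"A" * 8192 + b"B" * 4096
store, recipe = dedupe(backup + backup)
assert restore(store, recipe) == backup + backup
print(len(store))  # 2 unique blocks stored instead of 6
```

The second backup adds no new blocks to the store, which is exactly why deduplication rates improve the more often the same data is backed up.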

The data deduplication market is dominated by Data Domain, so we’re starting a series of reviews of products in the space with that company. Other prominent players include NetApp, IBM, EMC and Quantum. Traditionally, reviews have focused almost exclusively on the degree of deduplication, or the percentage of raw disk space saved by deduplication. Not only are other factors, such as throughput performance and ease of installation, just as important (if not more so), but measuring space savings accurately is extremely difficult in a laboratory setting, i.e., without live data receiving frequent small changes from many clients at once over a period of months or years.

      We wanted to approach reviews of data deduplication gear from a different angle. We chose to focus on ease and potential disruptiveness of implementation, throughput performance, manageability and features while testing in our New York City storage lab, and then interview several Data Domain customers about their real-world experience in order to gain insight into actual deduplication rates. Our primary goal was to evaluate the suitability of the Data Domain solution with respect to multi-site business continuity.

Our testing was designed to simulate a three-location company with a data center, a regional headquarters and a branch office. The branch office backed up locally to a DD120 with 350 GB of internal storage, the regional headquarters to a DD510 with 1.2 TB of internal storage, and both of those units replicated to a DD690 with two external drive enclosures housing 10 TB of storage at the data center. Each unit was configured for maximum redundancy, with redundant power supplies, NICs and Fibre Channel controllers, as well as drive arrays configured for RAID 6 plus hot spares. We tested two separate methodologies: the first used Symantec Veritas NetBackup to back up locally and then replicate between the various Data Domain units using Data Domain’s replication technology; the second used Data Domain’s OST (OpenStorage) to control the whole backup and replication process from NetBackup. It is worth noting that if your organization already uses NetBackup, you can keep all of your existing jobs and policies and merely redirect them from tape drives to Data Domain drives.

Deployment could not have been easier, although some aspects call for an enterprise storage skill set rather than an IT generalist’s. Installation is done from the CLI, either over telnet or via an attached KVM. I was pleased to see that at first login I was forced to change the default password. We applied licenses for storage, replication and OST, then configured network, file system, system and administrative settings. We confirmed our settings, rebooted, and then started setting up our CIFS and NFS shares.

      CLI vs. Web GUI

The CLI is about as good as a CLI can get. It will complete commands for you, display command trees, and has extensive help. However, it is still a CLI, and I prefer a nice polished Web GUI. Unfortunately, this is the main shortcoming of Data Domain that I encountered: the Web GUI is pretty barebones, although it will get the job done. I was able to monitor all three units on one screen, but to really manage them it was necessary to use the CLI. Data Domain officials said most of their clients use only the CLI (this was backed up by our client interviews) and that they are working to improve the power and usability of the Web GUI in the next release.

Documentation is excellent, well organized and informative, which goes a long way toward decreasing the potential disruptiveness of adding a new technology to the data center. For example, we used the expansion kit to add six additional 250 GB drives to the DD510 in less than 10 minutes, without taking the unit offline. This established one RAID group of eight disks, one RAID group of six disks and one hot spare that can be used by either group.
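To make the capacity arithmetic concrete, here is a quick sketch of raw usable space under RAID 6 for the layout just described, assuming (for illustration only) 250 GB drives in every group; actual usable capacity will be lower after file-system and appliance overhead.

```python
def raid6_usable(disks: int, disk_gb: float) -> float:
    """RAID 6 dedicates the equivalent of two disks per group to parity;
    the remaining disks hold data."""
    return (disks - 2) * disk_gb

# Layout described above: one 8-disk group, one 6-disk group,
# plus one shared hot spare (which contributes no capacity).
groups = [8, 6]
usable = sum(raid6_usable(n, 250) for n in groups)
print(usable)  # 2500.0 GB raw usable across both groups
```

Splitting the disks into two RAID 6 groups costs two extra parity disks versus one large group, but limits the rebuild domain when a drive fails.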

In our lab testing, we saw dedupe rates ranging from 5 to 99 times, depending on the file type and the number of times the same content was backed up. Typically, there is a slight savings from compression alone the first time you copy something; deduplication then kicks in with subsequent copies, and the rate improves over time. Many enterprises would have a setup similar to ours for backup, archive and business continuity purposes. The cost and time savings provided by effective deduplication before replication across WAN links are staggering.
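Why the ratio climbs with repeated backups is simple arithmetic. The sketch below uses hypothetical numbers (2x first-pass compression, 2 percent new unique data per subsequent backup), not measured results, to show how the reported ratio grows and why so little data has to cross the WAN.

```python
def effective_ratio(raw_bytes: float, stored_bytes: float) -> float:
    """Dedupe ratio as commonly reported: raw data protected / physical disk used."""
    return raw_bytes / stored_bytes

# Hypothetical weekly 1 TB full backup, mostly unchanged week to week.
raw_per_backup_gb = 1000.0
backups = 10
first_stored = raw_per_backup_gb / 2        # assume ~2x from compression alone
delta_stored = raw_per_backup_gb * 0.02     # assume ~2% unique data per later backup
stored = first_stored + (backups - 1) * delta_stored
ratio = effective_ratio(raw_per_backup_gb * backups, stored)
print(round(ratio, 1))  # 14.7 -- and only ~20 GB per backup crosses the WAN, not 1 TB
```

Under these assumptions the ratio keeps rising with every additional backup of mostly unchanged data, which matches the pattern of rates improving over time.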

We then left the shelter of the lab to journey into the real world to uncover the deduplication rates that Data Domain customers have seen over time. We visited The Rockefeller Group, a private corporation involved in commercial real estate, real estate services and telecommunications services for commercial clients, and spent the morning with Peter Lantry, director of data center operations, and Sanja Kaljanac, senior IT services engineer. Kaljanac reported that they are achieving 100 times data reduction on the DD565 in their data center and 67.5 times data reduction on the branch office DD120s. This was supported by analysis of log files provided by additional Data Domain customers, who reported compression rates ranging from 10 to 40 times and maximum throughput between 300 and 500 MB/s on DD690s. In addition to The Rockefeller Group, other real estate-related enterprises that use Data Domain include LandAmerica Financial Group and Skidmore, Owings & Merrill.

Our laboratory and real-world testing demonstrates that Data Domain’s deduplication technology has real value when used to back up, restore and archive between locations and over WAN links. Given the amount of data required to maintain business continuity across a multi-location enterprise, traditional backup methodologies are being stretched to, and even beyond, their limits. The combination of a DD120 in branch offices and a DD690 or DD510 in the data center is capable of not only removing these limits but shattering them, in such a way that you’ll rethink (and enhance) current business continuity processes.

Pricing as tested: $293,540

  • DD690 (base configuration with expansion shelves): $210,000
  • DD510: $19,000
  • Expansion kit for DD510: $13,000
  • DD120 (includes replication): $12,500
  • Replication software license for DD690: $35,000
  • Replication software license for DD510: $2,540
  • Retention Lock software for DD510: $1,500

      Matthew D. Sarrel is executive director of Sarrel Group, an IT test lab, editorial services and consulting company in New York.

