Delphix's New Data Management Platform Uses DataOps, Data Pods

Using Delphix’s new platform, structured production data need not be moved anywhere. Virtualized clones of databases of any size can be moved, analyzed, fixed, added to, subtracted from and patched, then slid easily into the production server.

[Photo: Delphix CEO Chris Cook, 2017]

SAN FRANCISCO--For decades, conventional enterprise IT has been based on loading data into storage arrays to make it available for applications inside servers, so that workloads could be processed and business conducted as usual. But with the current data deluge clogging networks and filling storage capacity, the movement of digital bits is getting stickier and more problematic all the time.

Now Delphix, an upstart company with 300 enterprise customers, is turning this idea on its head for speed and efficiency’s sake, using DevOps, a new concept called DataOps and a tool called Data Pods.

These are all contained in Delphix’s Dynamic Data Platform, a new data management package launched Aug. 3 that offers users the ability to virtualize, manage and secure data with its centralized controller. It has been developed using the company’s underlying IP plus the data masking secret sauce Delphix added two years ago when it acquired Axis TS.

DataOps Defined

Delphix CTO Eric Schrock describes DataOps as “the alignment of people, process, and technology to enable the rapid, automated, and secure management of data. Its goal is to improve outcomes by bringing together those that need data with those that provide it, eliminating friction throughout the data lifecycle.”

DataOps has previously been used to describe analytics systems, Schrock wrote in his corporate blog. “But this nascent movement must evolve to be broad enough to cover the complete spectrum of data challenges while being specific enough to guide decisions around processes and technology,” he wrote.

Using Delphix’s new platform, structured production data need not be moved anywhere. Virtualized clones of databases of any size can be moved, analyzed, fixed, added to, subtracted from and patched, then slid easily into the production server—replacing the current database—without so much as a hiccup in the system. Sensitive data can be masked out—an important consideration for government, science, health care, military and other segments.

Performing all these tasks on a workload can encounter plenty of hitches. The enemy that’s being defeated? Delphix calls it data friction. This happens when siloed data stores needed for hundreds to thousands of applications—and all closely guarded by separate IT managers—get larger, more complex and harder to move. Yet enterprise line-of-business employees need access to data in order to do their jobs: to speed up software development, to migrate apps to the cloud, to use analytics, machine learning and other tasks.

Ridding Systems of Data Friction

These competing forces cause data friction that needs to be managed—and managed well.

The Redwood City, Calif.-based company opened its doors back in 2008 with software that could clone multiple copies of Oracle databases, enable patching and bug-fixing, then add those corrections to the working database without stopping production. It has now taken that enabling IT to a whole new level.

The Delphix Virtualization Engine inside the platform is app-specific, because it needs to understand the structure of each source application’s data store. Besides Oracle, Delphix now supports many databases, including IBM DB2, SAP ASE, Microsoft SQL Server, PostgreSQL and others. It also supports packaged and custom business applications, including Oracle EBS, SAP and PeopleSoft. Support for the SAP HANA in-memory database is in beta, the company said.

The DDP can process any type of file, as long as it's in a file system to which it can connect; standard NFS and iSCSI access protocols are supported.

Data Pods: Key Differentiator

A key differentiating capability of the platform is its ability to provide Data Pods, or personalized virtual data environments, to data users for faster application development, cloud migration and governance projects.

The DDP combines virtualization, security, management and self-service automation in one place. Data is moved and managed between source and target environments, whether they are on-premises, in a cloud or both. Fast, do-it-yourself provisioning is standard, so line-of-business users don’t have to wait hours or days for an IT staffer to do it for them.

“We designed the Delphix Dynamic Data Platform as a direct response to the existing barriers to cloud migration, governance and agile development projects to give data consumers the ability to have access to the data they need, without waiting,” CEO Chris Cook (pictured) said at a launch event at the NASDAQ Environmental Center in San Francisco.

"Analysts say we're headed toward needing to store 163 trillion gigabytes, or 163 zettabytes, of data in the world by 2025. Enterprise data will grow more than twice as fast as total data, reaching close to 100 zettabytes by itself. We’d better be ready.”

Competitors in the Market

Delphix competes, to an extent, with copy-data management providers that include Cohesity, Actifio and Catalogic. Delphix, however, is the only one that deploys the ultra-fast Zettabyte File System (ZFS), a core technology of its platform, which provides the ability to take file system snapshots.

Developed by Sun Microsystems, ZFS is a transactional file system, which means that the file system state is always consistent on disk. ZFS also uses the concept of storage pools to manage physical storage.
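The copy-on-write idea behind ZFS snapshots can be sketched in a few lines: a snapshot freezes a reference to the current block map rather than copying the data, so snapshots and clones are near-instant regardless of database size. The toy Python model below is purely illustrative (the class and method names are invented for this sketch, not ZFS or Delphix APIs):

```python
class CowFileSystem:
    """Toy model of copy-on-write snapshots, the idea behind ZFS."""

    def __init__(self):
        self.blocks = {}      # path -> data (stand-in for the block map)
        self.snapshots = {}   # name -> frozen block map

    def write(self, path, data):
        self.blocks[path] = data

    def snapshot(self, name):
        # Cost scales with the number of references, not the data size.
        self.snapshots[name] = dict(self.blocks)

    def clone(self, name):
        # A writable clone starts from a snapshot's block map;
        # writes to the clone never touch the original.
        fs = CowFileSystem()
        fs.blocks = dict(self.snapshots[name])
        return fs

    def rollback(self, name):
        # Revert the live file system to a point-in-time snapshot.
        self.blocks = dict(self.snapshots[name])

fs = CowFileSystem()
fs.write("/db/datafile", "v1")
fs.snapshot("before-patch")
fs.write("/db/datafile", "v2-patched")
fs.rollback("before-patch")
print(fs.blocks["/db/datafile"])  # -> v1
```

Because only references are duplicated, this is also why a virtual clone of a multi-terabyte database can be provisioned in minutes rather than hours.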

Key use cases for the Delphix Dynamic Data Platform include:

  • Data and Application Acceleration: Companies can now treat data like code by recording every change and reverting back to any point in time. Storing, compressing and replicating data in near-real time means data no longer slows businesses down.
  • Data Privacy and Security: Enterprises can prevent unintentional release of sensitive data with control over access, retention and audits. They can also mask sensitive data in critical applications to prevent release of personally identifiable information.
  • Cloud Adoption: Enterprises moving data to the cloud can move massive amounts of data once and still refresh it from on-premises or hybrid sources, speeding up cloud adoption and operations.
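Masking tools of this kind typically replace sensitive values with realistic but fake ones, and do so deterministically, so the same real value maps to the same masked value in every table and joins still work. A minimal sketch of that idea (a hypothetical function, not Delphix's actual masking algorithm):

```python
import hashlib

def mask_ssn(ssn: str, secret: str = "masking-key") -> str:
    """Deterministically replace an SSN with a fake but consistent value.

    The same input always yields the same masked output, preserving
    referential integrity across tables, while the real value never
    appears in the masked copy. Illustrative only.
    """
    # Keyed hash: without the secret, the mapping cannot be reproduced.
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    # Turn the first nine hex characters into nine decimal digits.
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"

masked = mask_ssn("123-45-6789")
assert masked != "123-45-6789"               # real value is gone
assert masked == mask_ssn("123-45-6789")     # consistent across tables
```

Using a keyed hash rather than a random substitution is what keeps the masking repeatable across environments while remaining irreversible to anyone without the key.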

Henry Baltazar, Storage Research Director at 451 Research in San Francisco, told eWEEK that he believes Delphix’s message “resonates with where we are at with digital transformation today. The biggest potential issue is that many other vendors have similar messaging around building a data platform and on integrating data management and data protection. On the plus side, Delphix has always been strong with database stakeholders, and that should help them get some differentiation in a crowded market against vendors such as NetApp, Dell-EMC and startups like Rubrik and Cohesity.”

In what use cases might this platform work best? “The product and technology are focused on the database market,” Baltazar said. “While unstructured data management is a key area, the database cloning and virtualization capabilities could provide differentiation for Delphix. We would note that other rival players such as Actifio also want to address database and test/dev use cases.”

Delphix on Aug. 3 also announced support for Microsoft Azure. The Delphix Dynamic Data Platform is already optimized to run on Amazon Web Services and IBM Cloud.

Chris J. Preimesberger

Chris J. Preimesberger is Editor of Features & Analysis at eWEEK, responsible in large part for the publication's coverage areas. In his 12 years and more than 3,900 stories at eWEEK, he...