Solix Launches Big Data Suite for Hadoop

By Nathan Eddy  |  Posted 2014-08-05

The suite also offers bulk data storage for enterprise analytics applications and archives enterprise applications on a petabyte scale.

Enterprise data management (EDM) specialist Solix announced the launch of Big Data Suite, an enterprise archiving and data lake application platform for Apache Hadoop.

The solution provides an Information Lifecycle Management (ILM) framework to govern enterprise data and analytics applications, and uses Hadoop as a near-line repository for storing less frequently accessed data.

"[Research firm] Gartner says data growth is the number one CIO challenge, and we see from our customers that data is growing at an exponential rate, approaching petabyte scale for many organizations, on account of growing unstructured data such as videos, images, social and machine logs," John Ottman, executive chairman of Solix Technologies, told eWEEK. "All this makes it more important than ever to be able to handle this petabyte-level volume of data that organizations have and to be able to organize it properly."

The Hadoop Distributed File System (HDFS) offers a unified near-line storage platform for structured and unstructured data, bringing diverse users and application workloads to one pool of data for ongoing access and later use by enterprise data warehouse applications.

"The job of ILM is to help organizations meet compliance frameworks such as COBIT, which require control processes for enterprise data management such as data retention policies based on business rules, support for legal hold, and classification and masking of sensitive data," Ottman explained. "It is the job of ILM to maintain such mandatory data governance."

To that end, archive data is classified for security and compliance requirements such as legal hold, and universal access to all data is provided through structured queries, reports and full text search for business objects.
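The kind of classification Ottman describes can be illustrated with a short sketch. The field names, retention window, and masking scheme below are hypothetical, chosen only to show the pattern; Solix's actual rules engine is not public:

```python
import hashlib
from datetime import date

# Hypothetical schema and business rules, for illustration only.
SENSITIVE_FIELDS = {"ssn", "email"}
RETENTION_YEARS = 7   # example retention policy
NEARLINE_YEARS = 2    # age at which data moves to the near-line tier

def mask(value):
    """Replace a sensitive value with an irreversible hash token."""
    return "tok_" + hashlib.sha256(value.encode()).hexdigest()[:12]

def classify(record, today):
    """Tag a record for archiving, purge eligibility, and masking."""
    age_years = (today - record["created"]).days / 365.25
    return {
        "archive": age_years > NEARLINE_YEARS,
        # Legal hold blocks purging even past the retention window.
        "purge_eligible": (age_years > RETENTION_YEARS
                           and not record.get("legal_hold", False)),
        "data": {k: mask(v) if k in SENSITIVE_FIELDS else v
                 for k, v in record["data"].items()},
    }

rec = {"created": date(2005, 1, 15), "legal_hold": True,
       "data": {"name": "A. Smith", "ssn": "123-45-6789"}}
result = classify(rec, today=date(2014, 8, 5))
```

In this sketch the nine-year-old record is flagged for the archive tier, but the legal hold keeps it from being purge-eligible despite exceeding the retention window.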

Based on the MapReduce programming model, which processes large data sets in parallel across distributed compute nodes, Hadoop provides data storage with scalability, high availability, fault tolerance, automated backup and disaster recovery.
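The map, shuffle, and reduce phases that Hadoop runs across a cluster can be sketched in plain Python without any cluster at all. This toy word-count job over log lines is illustrative only, not Hadoop's API:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs -- here, (word, 1) per occurrence."""
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group all values by key, as Hadoop does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values -- here, summing the counts."""
    return {key: sum(values) for key, values in groups.items()}

logs = ["error disk full", "error network down", "info disk ok"]
counts = reduce_phase(shuffle(map_phase(logs)))
# counts["error"] == 2 and counts["disk"] == 2
```

In a real Hadoop job each phase runs on many nodes in parallel, with HDFS supplying the input splits and storing the reducer output, which is what gives the model its scalability and fault tolerance.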

"Data is growing with videos, images, and more, and it’s approaching the petabyte scale for many organizations," he said. "To be able to efficiently access and manage this much data, you need a data management solution to handle the volume of data, and organize it properly for the best views of data by end users."

Ottman said small businesses with tight IT budgets still have data they need to manage and draw intelligence from, and would benefit from a smaller-scale data management system.

"I would recommend that they look into a Solix ExApps appliance. The software and hardware is shipped pre-loaded on a Dell hardware appliance," he said. "It saves time and money to get started, and this intelligence will be invaluable to the organization as they grow and scale."
