2. Hadoop Wasn’t Designed for Enterprise Data
Like much ground-breaking IT (such as TCP/IP or Unix), Hadoop wasn’t originally built with the enterprise in mind, let alone enterprise security. Hadoop’s original purpose was to manage publicly available information such as Web links, and it was designed to store and process large amounts of unstructured data in a distributed computing environment modeled on Google’s published designs. It was not written to support hardened security, compliance, encryption, policy enablement or risk management.
3. Hadoop’s Security Relies Entirely on Kerberos
Hadoop does use Kerberos for authentication. However, the protocol can be difficult to implement, and it doesn’t cover a number of other enterprise security requirements, such as role-based access control or integration with LDAP and Active Directory for policy enablement. Out of the box, Hadoop also doesn’t encrypt data at rest on its nodes or data in transit between nodes.
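For illustration, here is a minimal sketch of what a Kerberos-enabled client looks like in practice, using Hadoop’s UserGroupInformation API to log in from a keytab before touching HDFS. The principal, keytab path and NameNode address are placeholders, and the cluster itself must already be configured for Kerberos (for example, hadoop.security.authentication set to kerberos in core-site.xml).

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberizedHdfsClient {
    public static void main(String[] args) throws Exception {
        // Client-side settings; the cluster's own core-site.xml/hdfs-site.xml
        // must already enable Kerberos (hadoop.security.authentication=kerberos).
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020"); // placeholder address

        // Authenticate with a principal and keytab (both placeholders).
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "etl-user@EXAMPLE.COM", "/etc/security/keytabs/etl-user.keytab");

        // Any subsequent HDFS call now runs as the authenticated principal.
        try (FileSystem fs = FileSystem.get(conf)) {
            fs.listStatus(new Path("/")); // simple sanity check
        }
    }
}
```

Note that this only handles authentication; authorization, directory integration and encryption still have to be layered on separately.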
4. Hadoop Clusters Consist of Many Nodes
Traditional data security technologies were built around protecting a single physical entity, such as a database or server, not the distributed, many-node computing environments that characterize Hadoop clusters. Those tools are far less effective in this kind of large-scale, distributed environment.
5. Traditional Backup/Disaster Recovery Isn’t the Same in Hadoop
6. Hadoop Is Rarely Used Alone
7. Compliance Mandates Still Apply in Big Data Workloads
Big data doesn’t come with a separate set of regulations and mandates. Regardless of the IT used to store and manage the data, enterprise organizations must still comply with regulatory requirements for data privacy and security, such as HIPAA (health care), PCI DSS (payment cards) and SOX (financial reporting), even though traditional, auditor-approved security technologies fail to fully address the challenges of big data environments.
8. Cost of a Breach Undetermined
9. Big Data Users on Their Own With Security
10. Additional Steps Needed to Protect Data Cluster
Until IT that directly addresses the vulnerabilities of a Hadoop environment becomes widely available, organizations must take extra precautions on their own: regularly scan the cluster environment for vulnerabilities, and make it a best practice to replicate and back up data into a separate, secured environment.
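As one example of that replication practice, Hadoop ships with the DistCp tool, which copies data in parallel between clusters. The sketch below simply drives the standard hadoop distcp command from Java; the host names and paths are placeholders, and a real deployment would run a job like this on a schedule and point it at a separately secured backup cluster.

```java
import java.io.IOException;

/**
 * Minimal sketch: replicate an HDFS directory to a separate backup cluster
 * by invoking the standard "hadoop distcp" command. Host names and paths
 * below are placeholders for illustration only.
 */
public class HdfsBackupJob {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "hadoop", "distcp",
                "-update",                                              // copy only files that changed
                "hdfs://prod-nn.example.com:8020/data/warehouse",       // source cluster (placeholder)
                "hdfs://backup-nn.example.com:8020/backups/warehouse"); // backup cluster (placeholder)
        pb.inheritIO(); // stream distcp's progress output to this process's console

        int exitCode = pb.start().waitFor();
        if (exitCode != 0) {
            throw new IOException("distcp failed with exit code " + exitCode);
        }
    }
}
```

Running a job like this under an enterprise scheduler, against a backup cluster with its own access controls, keeps a recoverable copy of the data outside the primary environment.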
11. Hadoop Users Must Keep Up to Date
As large-scale batch processing of data becomes mainstream in the enterprise, new IT designed to make big data workloads more useful for businesses is coming out all the time, from established companies as well as startups. Best practices for IT managers should always include regular visits to sites such as eWEEK, which covers all the relevant sectors of big data IT: security, storage, servers and data center systems as a whole.