Big Data Workloads Require a New Security Plan: 10 Best Practices
Find and Secure All Your Data Assets
To fortify important data assets effectively, you need to know where they are. If a breach occurs in the cloud, you need to determine whether those assets were compromised and, if they were, who should be held accountable. Carefully reviewing a service level agreement (SLA) before you sign is critical so you know where your enterprise stands should a data loss incident occur. Knowing who has access to the source code for your database or application is equally important.
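Knowing where your data assets live starts with an inventory. As a minimal sketch (the function name and record format here are hypothetical, not from any particular product), the snippet below walks a directory tree and records each file's location, size and SHA-256 fingerprint; comparing two snapshots taken at different times flags assets that have moved or been tampered with.

```python
import hashlib
from pathlib import Path

def inventory_assets(root: str) -> list[dict]:
    """Walk `root` and record each file's location, size, and a
    SHA-256 fingerprint so later audits can detect tampering."""
    records = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        # Fingerprint the contents; a changed hash between snapshots
        # means the asset was modified or replaced.
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        records.append({
            "path": str(path),
            "size": path.stat().st_size,
            "sha256": digest,
        })
    return records
```

In a real deployment the same idea would be applied to cloud object stores and databases via their own APIs, with the inventory stored somewhere the access logs can be correlated against it.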
At least two things can be counted on as big data workload processing and analytics become strategic in enterprise business plans: increasing bandwidth requirements and the emergence of significant new security issues. Regardless of how large or small data workloads may be, IT staff needs to be able to capture and store all of that data; properly enrich, cleanse, analyze or otherwise process it; and access it securely. This refers back to our mantra here at eWEEK: It's all about control of the data.

RSA Security President Art Coviello, who has been working on IT security systems for more than three decades, has seen it all and is still learning something new each day. He told eWEEK that "as our data surfaces increase, so do our vulnerabilities." The more data we move, and the more places we move it, the more opportunity there is for foul play somewhere along the line.

With this in mind, eWEEK worked with Helsinki-based network security provider Stonesoft, which supplies software-based network security packages to midsize and large organizations, to put together this slide show on big data security best practices. Phil Lerner, VP of Technology at Stonesoft, provided this information.