10 Requirements of a New-Generation Cloud Storage System

 
 
By Chris Preimesberger  |  Posted 2014-10-14

    Single Storage Platform a Good Place to Begin

    Using a single, centralized platform enables you to capture unstructured data from your storage islands into one storage environment, sometimes called a data lake. You can add capacity when you need it, and retrieve data from the lake wherever you need it.

    Use Software-Defined Storage

    It's simple: Software-defined storage decouples storage services from the underlying hardware, giving you the ability to manage and scale your storage environment easily. Older systems have no way of providing this kind of agility.

    A CIFS/NFS File System Gateway Can Be Beneficial

    We all work with files and objects, and each storage system processes these differently. Why should you have to choose between the two? A file system gateway enables you to mix and match workflows between object and file interfaces: data written in as files can be read back out as objects via the HTTP API, and vice versa.
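As an illustrative sketch only (not SwiftStack's actual implementation), a gateway of this kind maps an object's URL path to a file-system path and back. The mount point and the `/v1/<account>/<container>/<object>` layout below are assumptions for the example:

```python
# Illustrative name mapping between an object store's HTTP namespace
# and a file-system gateway. Paths and layout are hypothetical.
from pathlib import PurePosixPath

MOUNT = PurePosixPath("/mnt/gateway")  # assumed gateway mount point

def object_to_file(account: str, container: str, obj: str) -> PurePosixPath:
    """Translate an object reference to the path the file interface sees."""
    return MOUNT / account / container / PurePosixPath(obj)

def file_to_object(path: PurePosixPath) -> tuple:
    """Translate a gateway file path back to (account, container, object)."""
    account, container, *rest = path.relative_to(MOUNT).parts
    return account, container, "/".join(rest)

p = object_to_file("acme", "videos", "2014/q3/demo.mp4")
print(p)                  # /mnt/gateway/acme/videos/2014/q3/demo.mp4
print(file_to_object(p))  # ('acme', 'videos', '2014/q3/demo.mp4')
```

The same bytes are reachable under both names; only the namespace translation differs.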

    Use Enterprise Authentication Integration

    Most companies rely on enterprise directory services such as LDAP and Active Directory. Your storage system should integrate seamlessly with these authentication systems to provide secure, authenticated access to data.

    Required: Built-In, Multi-Site Disaster Recovery

    Everyone needs to back up data, but you shouldn't have to sacrifice access or ingest latency, or limit where your data is reachable, to do it. The best approach is built-in, multi-site disaster recovery in the storage system itself. This enables you to create as many replicas as you want and distribute them across multiple geographic regions in a single cluster.
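As a minimal sketch of the idea (a simple round-robin policy, not any vendor's actual placement algorithm, with made-up region names):

```python
# Spread N replicas across geographic regions, cycling so no region
# holds a second copy until every region holds one. Illustrative only.
from itertools import cycle

def place_replicas(regions, num_replicas):
    """Assign each replica to a region in round-robin order."""
    picker = cycle(regions)
    return [next(picker) for _ in range(num_replicas)]

print(place_replicas(["us-east", "us-west", "eu-central"], 5))
# ['us-east', 'us-west', 'eu-central', 'us-east', 'us-west']
```

With placement like this, losing any single region still leaves full copies of the data elsewhere in the cluster.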

    Flexible Storage Policies a Must

    Having the flexibility to manage your data, from where it is stored geographically to who can access it, is invaluable. Flexible storage policies give you that control by consolidating storage tiers within a single cluster, leaving you free to provide exactly the storage services that users and applications need.
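A sketch of how per-container policies might look. The policy names and the `x-storage-policy` metadata key are assumptions for illustration (the key echoes a convention used by some object stores, but this is not a specific product's API):

```python
# Illustrative per-container storage policies: each policy names a
# tier and a replica count. Policy names are hypothetical.
POLICIES = {
    "gold":     {"tier": "ssd",  "replicas": 3},
    "standard": {"tier": "hdd",  "replicas": 3},
    "archive":  {"tier": "tape", "replicas": 2},
}

def policy_for(container_meta):
    """Resolve a container's policy, falling back to 'standard'."""
    name = container_meta.get("x-storage-policy", "standard")
    return POLICIES[name]

print(policy_for({"x-storage-policy": "archive"}))  # {'tier': 'tape', 'replicas': 2}
print(policy_for({}))                               # {'tier': 'hdd', 'replicas': 3}
```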

    Heterogeneous Hardware Must Be Supported

    All companies want to get the most out of their capital investments while also making their users as productive as possible. With a hardware-agnostic storage system, you can build a durable, scalable system using standard hardware from multiple vendors; you can even mix storage node density and device size.
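One way mixed device sizes can coexist is by weighting data placement in proportion to capacity, so a 6TB drive receives roughly twice the data of a 3TB drive. A minimal sketch with hypothetical device names:

```python
# Weight placement by device capacity so heterogeneous drives fill at
# roughly the same rate. Illustrative only.
def placement_shares(capacities_tb):
    """Return the fraction of new data each device should receive."""
    total = sum(capacities_tb.values())
    return {dev: cap / total for dev, cap in capacities_tb.items()}

shares = placement_shares({"node1-sda": 3.0, "node2-sdb": 6.0, "node3-sdc": 3.0})
print(shares)  # {'node1-sda': 0.25, 'node2-sdb': 0.5, 'node3-sdc': 0.25}
```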

    It Needs to Be Simple to Manage

    Managing petabytes of data is a big task, but the process should be made as simple as possible. You want easy-to-use storage management tools, rolling and no-downtime upgrades, and cluster health monitoring. Make sure that's what you have.

    TCO That Can't Be Beat

    Total cost of ownership (TCO) is a financial estimate intended to help buyers and owners determine the direct and indirect costs of a product or system. Software-defined storage systems are much more cost-effective than legacy systems because they run on newer, leaner code, faster processors and larger data pipes. They are also faster and easier to configure.
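A worked example of the definition above: direct costs are paid once, indirect costs recur each year over the ownership period. All figures are made up for illustration:

```python
# Simple TCO estimate: one-time direct costs plus recurring indirect
# costs over the ownership period. Figures are invented.
def tco(direct, indirect_per_year, years):
    return sum(direct.values()) + years * sum(indirect_per_year.values())

legacy = tco({"hardware": 500_000, "licenses": 120_000},
             {"power": 30_000, "admin": 90_000}, years=5)
sds    = tco({"hardware": 200_000, "licenses": 40_000},
             {"power": 18_000, "admin": 45_000}, years=5)
print(legacy, sds)  # 1220000 555000
```

Comparing systems this way surfaces indirect costs, such as administration time, that a purchase price alone hides.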

    Use a Private Cloud Infrastructure

    Storing data both internally and in the cloud gives you more flexibility and ultimately a better TCO, especially with software-defined storage. For disaster recovery, it's best to keep copies of the data on site and backed up somewhere else.
 

More data is being collected each day across devices, systems and networks, and the pace of that collection keeps accelerating. With all this new data, however, come new challenges in capacity, security and accessibility. Storage software and hardware both need constant updating, and disaster recovery functions must be tested periodically. Legacy storage systems are generally no longer efficient or effective enough to support data center growth and scale, so enterprises must continually review their investments in data storage. Because all data is valuable, Internet of things service providers bear the responsibility of ensuring that their customers' data is safe, secure and well-managed. When factors such as privacy, bandwidth, compliance and costs are taken into account, companies of all sizes and sectors, enterprise or consumer, have a lot to consider in storing data. This slide show, based on eWEEK reporting and industry insight from Joe Arnold, CEO of software-defined object storage specialist SwiftStack, offers 10 requirements of a new-generation, 21st-century cloud storage system.

 
 
 
 
 
 
 
 
 
 
 
