Software-Defined Storage Gaining Traction Among Businesses

More than half of organizations (52 percent) look to extend the life of existing storage assets and future-proof their IT infrastructure with software-defined storage (SDS) in 2015, according to a survey of 477 IT professionals currently using or evaluating SDS to solve data storage challenges.

More than half of respondents (52 percent) identified the ability to add storage capacity without business disruption as the primary reason for choosing storage virtualization software.

Close to half of respondents look to SDS to avoid hardware lock-in by storage manufacturers, allowing them to lower hardware costs by shopping among several competing suppliers.

Support for synchronous mirroring and metro clusters for high availability and business continuity, along with asynchronous data replication for remote-site disaster recovery, also ranks high on the list.

"It is clear that big data and the analytics information can and will make a big impact on business, and as a result companies are willing to make investments if the right solutions are available to them," George Teixeira, president and CEO of DataCore, told eWEEK. "The problem or concern is that most companies lack the capabilities to handle big data projects, or have a skill set gap especially evident in the lack of data scientists and trained Hadoop engineers. Therefore, it is being limited to very large enterprises or those that can afford to offload big data projects to outside vendors to perform the task."

More than half of the respondents (53 percent) said they currently assign less than 10 percent of total capacity to flash storage.

Just 9 percent of participants said flash makes up more than 40 percent of their storage capacity.

However, the complexity that accompanies data growth and diversity is taking a big toll: 61 percent of respondents indicated that human error was behind application and data center outages.

More than 60 percent of respondents experienced performance degradation or were unable to meet performance expectations after virtualizing server workloads.

When asked about the typical causes of performance problems, 61 percent of participants blamed slow applications, and 46 percent singled out legacy storage devices as the culprit.

While flash technology penetration has expanded, flash is still absent in 28 percent of cases, and 16 percent of survey respondents reported that it did not meet application acceleration expectations.

In addition, 21 percent reported that highly touted hyper-converged systems did not perform as required, or did not integrate well within their infrastructure.

On the other hand, SDS and storage virtualization are now deemed urgent priorities, with 63 percent of organizations making significant investments in these technologies throughout 2015.

The survey found that 81 percent also expect similar levels of spending on software-defined storage technologies that will be incorporated within server storage area networks (SANs), virtual SANs and converged storage solutions.

"In the near future, SDS will evolve to include aspects such as improved integration and transparent automation of data services," Teixeira said. "The ultimate goal of software-defined storage is to provide a single, unified set of storage services across all storage devices for maximum availability, performance and efficiency, as well as to ensure the overall health and protection of vital storage assets."