Software-Defined Storage: Getting Started
Drawn by the combined lures of automation, flexibility, increased storage capacity, and improved staff efficiency, a growing number of enterprises are pondering a switch to software-defined storage (SDS).
SDS lets adopters separate storage resources from the underlying hardware platform. The approach enables storage to become an integral part of a larger software-defined data center (SDDC) architecture in which resources can be more easily automated and orchestrated.
SDS has moved from the early adoption stage into the mainstream, with enterprises in banking, manufacturing, pharmaceuticals, healthcare, media and government rapidly transitioning to the technology. "These customers have adopted SDS for a variety of use cases, including long-term archives, backup storage, media content distribution, big data lakes and healthcare image archives," explained Jerome Lecat, CEO of Scality, a cloud and object storage technology provider.
Greg Schulz, founder of and senior advisor at storage consulting firm Server StorageIO, said enterprises of all types and sizes are now poised to make the move to SDS. "Across the board, big and small, from government sector to private sector," he said. "Likewise, across different types of applications."
Successful SDS adopters typically began by selecting a discrete use case as a starting point. "Within the enterprise, we see Tier 2 applications, such as backup and archive, as an optimal way to store mission-critical data that is large-scale and a perfect way to demonstrate the scalability, availability and cost-advantages of SDS," Lecat said. "Over time, more use cases, including big data and deep learning, can be brought online to further improve the economic advantages of SDS."
Enterprises that recently moved to a hyperconverged infrastructure (HCI) are already working with SDS, noted Sascha Giese, a senior sales engineer at IT infrastructure monitoring and management technology provider SolarWinds. "A good starting point for such organizations would be to evaluate whether HCI has benefitted your organization and, if so, consider whether to expand the SDS footprint in your data center."
Even organizations that haven't embraced HCI usually already have some type of virtualization in their environments, observed Matt Sirbu, director of data management and data center infrastructure at Softchoice, an IT infrastructure solutions provider.
"VMware and Hyper-V are really software-defined compute solutions," he said. Software-defined storage products extend virtualization benefits to the data layer, but adopters also need to closely examine the supporting infrastructure. "Any business, when they come up to their next infrastructure refresh cycle, should start to evaluate newer technologies to see what the benefits will be to their organization by leveraging software-defined across all layers, compute and storage," he said.
Jonathan Halstuch, co-founder and chief technology officer of RackTop Systems, a data management technology supplier, noted that it's important to find an SDS product that can meet both current and future storage requirements, particularly in critical areas like compliance and security. "Be discriminating and find a solution that will reduce complexity and tasks for the IT department," he advised. "Then begin to migrate workloads that are the easiest to migrate or are datasets that have special requirements that are currently being unmet, such as encryption, performance or accessibility."
The end of a refresh cycle is a logical time to begin exploring SDS. "An organization should assess their technology roadmap for the next few years and consider making the switch to an SDS solution," said Maghen Hannigan, director of converged and integrated solutions at technology products and services distributor Tech Data. "If an existing environment is in need of a new storage administrator, it may be worth considering (hiring) a new systems administrator proficient in software-defined storage."
A refresh cycle-motivated commitment to SDS can be either large or small. "It may be as simple as dropping in an SDS solution in place of legacy storage," Halstuch explained. "However, it may make more sense to rethink the current architecture, review a hybrid cloud strategy and review the current staffing profile to determine what is the best SDS solution to adopt and how it fits into the long-term vision of the organization."
One mistake organizations often make when planning an SDS transition is to view the technology as a "point product" decision. "Software-defined solutions are ideally part of a larger stack that offers a common operational model for compute, storage, network and cloud," said Lee Caswell, VP of products, storage and availability at VMware. "The software-defined solution offers a digital foundation with investment protection for any hardware, any application, and any cloud."
"In general, we see organizations regret their decisions to move to SDS either too abruptly or without proper planning," said Daniel Gilfix, marketing manager of Red Hat's storage division. "We witness the frustration of those who venture into the area without the proper skill sets, as if any storage administrator or cloud practitioner can pick up the knowledge and training overnight."
Perhaps the biggest mistake SDS newcomers make is believing that the technology is a "silver bullet" for all workloads. "It's important to look at the workload demands," Sirbu stated. "All organizations can benefit from (SDS) for a large portion of their workloads, but it really comes down to analyzing business requirements with available IT resources to come up with the optimal solution to run their operations."