March 11, 2015
The software-defined data center (SDDC) has been a major architectural milestone for the data center virtualization industry. Compute resources have been effectively virtualized to be mobile and flexible; storage and networking, however, remain far less pliable as programmatic, granular, and modular design components in virtualization. In his post, “VMware’s Strategy for Software Defined Storage,” Richard McDougall, CTO of Storage and Availability at VMware, describes the concepts that will overcome modern storage challenges and accelerate the customer’s journey to the SDDC.
Key Innovations in Software Defined Storage
McDougall sums up the goal of Software-Defined Storage (SDS) in a single sentence,
“Just as we’ve experienced with pooled compute resources, we want to be able to provision new storage as part of an application workflow, using application-aware policies for performance, cost, availability and recovery.”
One solution does not solve the needs of every application or every use case, but tools are available that address many of today’s storage challenges.
Virtual Volumes (VVOLS)
Until VVOLS, the capabilities of your storage array could only be provisioned through the SAN/NAS control plane and were inherited by the hypervisor environment at the volume level, with little visibility. With virtual volumes, you can apply your storage array’s superior performance and data protection capabilities per virtual machine, as required by the applications running on it, and do so through a single pane of glass.
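The shift can be sketched in a few lines: before VVOLS, every VM on a datastore inherited the one policy provisioned at the LUN/volume level, whereas VVols lets each VM carry a policy matched to its application. This is a hypothetical Python sketch of that data model; the class, policy names, and attributes are illustrative, not a real VMware API.

```python
# Hypothetical sketch: LUN-level policy inheritance vs. per-VM storage
# policies in the VVols model. Names and attributes are invented for
# illustration only.
from dataclasses import dataclass

@dataclass
class StoragePolicy:
    name: str
    replication: bool      # array-side data protection for this VM's objects
    max_latency_ms: int    # performance target enforced by the array

# Before VVOLS: every VM on the datastore inherits the LUN-level policy,
# whether it needs those capabilities or not.
lun_policy = StoragePolicy("gold-lun", replication=True, max_latency_ms=5)
vms_on_lun = ["web01", "db01", "test01"]
inherited = {vm: lun_policy for vm in vms_on_lun}

# With VVOLS: each VM carries its own policy, matched to its application.
per_vm = {
    "web01":  StoragePolicy("silver", replication=False, max_latency_ms=20),
    "db01":   StoragePolicy("gold",   replication=True,  max_latency_ms=5),
    "test01": StoragePolicy("bronze", replication=False, max_latency_ms=50),
}
```

Note how the test VM no longer pays for replication it never needed, while the database keeps its strict performance and protection targets.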
Virtual SAN (VSAN)
VSAN pools flash-accelerated, direct-attached storage across multiple hypervisor nodes into a single storage repository. This is also known as a hyper-converged platform: no SAN or NAS is required, enabling a scalable, low-cost, high-performance solution.
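The core idea behind the pooled, hyper-converged design is that each VM object is replicated onto distinct hypervisor nodes so a single host failure does not lose data. This toy Python sketch shows one naive way to spread replicas across nodes; the node names and the hash-based placement are illustrative assumptions, not VSAN’s actual algorithm.

```python
# Toy sketch of replica placement in a hyper-converged pool: local disks on
# several hypervisor hosts act as one repository, and each object gets
# copies on distinct hosts. Not VSAN's real placement logic.
def place_replicas(obj_name, nodes, copies=2):
    """Choose `copies` distinct nodes to hold replicas of one VM object."""
    if copies > len(nodes):
        raise ValueError("not enough nodes for the requested replica count")
    # Deterministic spread within one run: hash the object name to pick a
    # starting node, then walk the node list.
    start = hash(obj_name) % len(nodes)
    return [nodes[(start + i) % len(nodes)] for i in range(copies)]

nodes = ["esx01", "esx02", "esx03"]
placement = place_replicas("vm-disk-001", nodes, copies=2)
```

With two copies on different hosts, losing any single node still leaves one replica reachable, which is what lets direct-attached disks stand in for a shared array.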
Object Storage
Traditional relational databases work well for one-to-many data mappings. Many-to-many data relationships in computer-aided, multimedia, or highly dynamic platforms are often more complicated and unpredictable and therefore require a new reference model for short-term and persistent data structures.
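To make the contrast concrete, here is a small Python sketch: in the relational style, a many-to-many link between videos and tags needs a junction table that every lookup must cross, whereas an object/document style keeps the relationship as metadata carried directly on each object. All names here are invented for illustration.

```python
# Relational style: videos <-> tags linked through a junction table.
videos = {1: "intro.mp4", 2: "demo.mp4"}
tags = {10: "training", 20: "marketing"}
video_tags = [(1, 10), (1, 20), (2, 10)]  # junction rows

def tags_for_video(video_id):
    """Resolve a video's tags by scanning the junction table."""
    return sorted(tags[t] for v, t in video_tags if v == video_id)

# Object style: each object carries its own tag metadata directly, so the
# relationship travels with the data instead of living in a third table.
objects = {
    "intro.mp4": {"tags": ["training", "marketing"]},
    "demo.mp4":  {"tags": ["training"]},
}
```

As the number of relationships grows and changes unpredictably, the junction-table approach multiplies joins, while the object model simply adds metadata.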
Big Data
Big data platforms require a low-cost, scalable, extremely dense, high-performance, and highly available solution; that sounds almost impossible to find. It is essentially the extreme of every storage requirement, yet cheap. Platforms like Hadoop pair commodity hardware with distributed data repositories and parallel processing for the big data use case.
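The parallel-processing idea Hadoop popularized can be sketched in miniature: split the data into chunks, let each "node" map over its own chunk independently, then reduce the partial results into one answer. This single-process Python word count only illustrates the MapReduce pattern; it is not Hadoop itself.

```python
# Toy MapReduce word count: map each chunk independently (as separate
# cluster nodes would), then merge the partial counts in a reduce step.
from collections import Counter
from functools import reduce

def map_phase(chunk):
    """Each mapper counts words in its own slice of the data."""
    return Counter(chunk.split())

def reduce_phase(counts_a, counts_b):
    """Reducers merge partial counts into a combined result."""
    return counts_a + counts_b

chunks = ["big data big", "data platforms scale", "big platforms"]
partials = [map_phase(c) for c in chunks]  # independent per chunk
totals = reduce(reduce_phase, partials, Counter())
```

Because each map call touches only its own chunk, the work scales out across cheap commodity machines, which is exactly the cost trick the paragraph above describes.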
Like what you read?
Mindsight, a Chicago IT consultancy and services provider, is an extension of your team. Our culture is built on transparency and trust, and our team is made up of extraordinary people – the kinds of people you would hire. We’ve always prided ourselves on delivering the full spectrum of IT services and solutions, from design and implementation to support and management. Our highly certified engineers and process-oriented excellence have certainly been key to our success. But what really sets us apart is our straightforward and honest approach to every conversation, whether it is with a local business or a global enterprise. Our customers rely on our thought leadership, responsiveness, and dedication to solving their toughest technology challenges.