Modern Storage Software Erodes Resistant Data Silos
September 20, 2016 Lance Smith
With the record-breaking $60 billion Dell/EMC acquisition complete, both companies and their customers have more options than ever before to meet evolving storage needs. Joining forces helps the newly minted Dell Technologies combine the best of both worlds, better serving customers by blending EMC storage and support with Dell pricing and procurement.
But there is some trouble in paradise. Even when sold by the same vendor, most storage systems have been designed as secluded islands of data, meaning they aren’t terribly good at talking to each other.
In fact, this silo effect is exacerbated by different storage types, and has caused complexity and sprawl in the storage game since the dawn of computing. When an acquisition takes place, it is tough for customers to add solutions from the newly adopted sibling if they previously used storage primarily from one company – in this case, either Dell or EMC. These storage silos are resistant to simplification, as they make it difficult for companies of all kinds to deliver seamlessly integrated systems that span different storage types and the cloud.
This problem isn’t unique to big suppliers like Dell/EMC. Acquisitions are par for the course in the storage industry, with larger companies frequently snapping up emerging technologies to remain competitive. Beyond the business unit incorporation needed post-acquisition, customer integration and adoption of new products takes major planning and execution once the ink is dry on the deal. In the age of acquisitions, storage diversity also presents a significant challenge for the engineering teams working to help the customers the two companies now share integrate storage from both product lines.
Simplify With Software
In the era of self-driving cars and virtual reality, it may come as no surprise that software can finally make all this easier. Data virtualization is a software technology that separates the storage control path from the data path and abstracts logical data into a global dataspace spanning the file, block, and object protocols used by different types of enterprise storage systems.
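A toy model can make the control-path/data-path split concrete. In the sketch below (the catalog entries, names, and layout are illustrative assumptions, not any vendor's actual implementation), a control service resolves a logical name in the global dataspace to a physical location, and the client then reads the data directly from that location rather than routing it through the controller:

```python
# Toy control plane: maps logical names in a global dataspace to
# (backend, physical location) pairs. Entries here are hypothetical.
CATALOG = {
    "/global/db/orders": ("san-array-1", "lun7/orders.db"),
    "/global/logs/2016": ("s3", "archive-bucket/logs-2016.tar"),
}

def resolve(logical_path: str) -> tuple[str, str]:
    """Control path: answer 'where does this data live?' without touching the data."""
    return CATALOG[logical_path]

# Data path: the client talks to the backing store directly,
# using the location the control plane handed back.
backend, location = resolve("/global/db/orders")
print(backend, location)  # san-array-1 lun7/orders.db
```

Because clients only consult the control plane for placement, data can later be moved to a different backend by updating the catalog, without changing how applications name it.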
A storage-agnostic data virtualization solution can transparently connect storage infrastructures from any vendor, media, or protocol, allowing data to easily move between server, shared, and cloud storage for the first time. This approach makes it easy to add performance and capacity from any vendor.
But the ability to connect different types of storage is only half of the equation. Without the ability to automatically orchestrate data to the right type of storage at the right time, IT professionals would simply face a bigger headache, managing data manually across even more storage silos. With a platform that can virtualize different storage tiers across a global dataspace, customers can automatically orchestrate data placement across Dell, EMC, and even Amazon Web Services cloud storage according to policies that align data performance, protection, and price requirements with the capabilities of each system.
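Policy-driven placement of this kind can be sketched in a few lines. The tier names, characteristics, and matching logic below are illustrative assumptions, not Primary Data's actual engine; the idea is simply to pick the cheapest tier that meets a dataset's performance and protection requirements:

```python
from dataclasses import dataclass

# Hypothetical storage tiers and their characteristics (illustrative values).
TIERS = {
    "flash": {"latency_ms": 0.1, "cost_per_gb": 0.50, "durable_copies": 1},
    "san":   {"latency_ms": 2.0, "cost_per_gb": 0.20, "durable_copies": 2},
    "cloud": {"latency_ms": 50.0, "cost_per_gb": 0.02, "durable_copies": 3},
}

@dataclass
class Policy:
    """Performance and protection requirements for a dataset."""
    max_latency_ms: float
    min_copies: int

def place(policy: Policy) -> str:
    """Pick the cheapest tier that satisfies the policy's requirements."""
    candidates = [
        (spec["cost_per_gb"], name)
        for name, spec in TIERS.items()
        if spec["latency_ms"] <= policy.max_latency_ms
        and spec["durable_copies"] >= policy.min_copies
    ]
    if not candidates:
        raise ValueError("no tier satisfies this policy")
    return min(candidates)[1]

# A latency-sensitive database lands on flash; a cold archive lands in the cloud.
print(place(Policy(max_latency_ms=1.0, min_copies=1)))    # flash
print(place(Policy(max_latency_ms=100.0, min_copies=3)))  # cloud
```

In a real system the policy would also cover factors like bandwidth, encryption, and compliance, but the shape of the decision, matching data requirements against tier capabilities, stays the same.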
This approach maximizes the value of customers’ existing investments while making it easy to integrate new storage through the universal storage compatibility created by the global dataspace. This universal compatibility gives IT professionals more options than ever before, as they can easily add storage not only from leading vendors like Dell, EMC, Hewlett Packard Enterprise, Pure Storage, VMware, Amazon, and more, but also adopt whitebox storage solutions emerging from the work of Facebook’s Open Compute project.
From Storage Past To Storage Future
In the past, storage virtualization tried to deliver some of these capabilities within a single storage system or across a single storage type from a single vendor. Data virtualization marks the first time these capabilities have been implemented across vendors and across the different media and protocols of flash, SAN, NAS, and cloud storage.
All the storage diversity available today is a great asset for customers who can finally spend their budgets according to their needs rather than having to settle for a one-size-fits-most-scenarios product. With data orchestration, enterprises can ensure that their specialized systems are aligned to their data. Until now, it has been difficult to know what data is hot or cold, and even more difficult to move data that is not on the right storage. Data virtualization finally delivers the insights and abilities needed to ensure the right data is in the right place at the right time.
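Knowing what data is hot or cold usually starts with access patterns. A simple illustration of that idea (the thresholds and the recency-only heuristic are assumptions for this sketch, not any vendor's actual classifier) is to bucket data by how recently it was accessed, flagging cold data as a candidate for migration to cheaper storage:

```python
import time
from typing import Optional

def classify(last_access_epoch: float, now: Optional[float] = None,
             hot_days: int = 7, cold_days: int = 90) -> str:
    """Classify data as hot, warm, or cold by access recency.

    Thresholds are illustrative: accessed within a week -> hot,
    untouched for 90+ days -> cold, anything in between -> warm.
    """
    now = time.time() if now is None else now
    age_days = (now - last_access_epoch) / 86400
    if age_days <= hot_days:
        return "hot"
    if age_days >= cold_days:
        return "cold"
    return "warm"

now = time.time()
print(classify(now - 2 * 86400, now))    # hot: accessed two days ago
print(classify(now - 200 * 86400, now))  # cold: untouched for ~200 days
```

Production systems track richer telemetry, such as read/write mix, IOPS, and bandwidth over time, but even recency alone is enough to drive useful tiering decisions.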
In addition, cloud storage has so far been used primarily as cheap capacity storage for cold data. In time, more enterprises are expected to adopt cloud platforms to serve more workloads beyond just cold data archival. With data virtualization, it can be easy for vendors to offer and for customers to adopt and integrate these new services.
The Last Hardware Standing
With software defining so much of the hardware in our lives and adding new capabilities from our phones to our thermostats, it is about time for software to simplify the storage backbones serving all those screens. Data virtualization and orchestration are the next wave of intelligence coming to enterprise datacenters, and they will give early adopters both a competitive edge and the savings to make new investments optimized for their unique business needs, as told by the history and patterns hidden in their own data.
While just emerging on the market, storage-agnostic data orchestration across different storage types helps enterprises get the kind of flexibility their unique application stacks require. As The Next Platform has pointed out many times, enterprises have it much tougher than hyperscalers, which have to support only a few kinds of applications and perhaps only dozens of them compared to many thousands for the world’s largest enterprises. This scale of complexity is just as daunting as the scale of capacity that hyperscalers face, and it has to be attacked with clever software, not just cheap hardware.
Lance Smith is the chief executive officer at Primary Data, a supplier of data virtualization tools for the enterprise. He was previously the senior vice president and general manager of the I/O Memory Solutions division at SanDisk in the wake of the acquisition of Fusion-io, where Smith was president and chief operating officer. Prior to entering the storage business, Smith was heavily involved in the microprocessor industry, with business development and technical marketing positions at AMD, NexGen, Chips and Technologies, and Raza Microelectronics.