Software-defined storage and hybrid deployment approaches may keep disk drives and other 'obsolete' technologies around longer than you'd expect.
Storage optimization is focusing increasingly on solid-state drives (SSDs). However, it's unclear when we'll see the tipping point in favor of SSDs that I predicted, in late 2013, would arrive in 2015. The sea change is proving less rapid than I'd thought.
Nevertheless, the long-term trend in favor of SSDs is undeniable. The economics of SSDs will continue to benefit from Moore's Law improvements in storage density. In a January article for ZDNet, "The Future of Storage: 2015 and Beyond," Rupert Goodwins provided a good discussion of innovations that will continue to boost flash memory's capacity while improving its reliability.
Interestingly, ongoing investments throughout the storage industry will continue to increase hard-drive densities while lowering the cost-per-bit of stored data. Death notices for hard disk storage feel a bit premature. Yes, SSDs are near-ubiquitous in most of the new niches of the computing universe, including mobile devices, Internet of Things (IoT) endpoints, and beyond. But today's commercial SSDs, most of which rely on NAND flash memory for their lightning-fast performance, still suffer from reliability and lifecycle drawbacks that prevent the technology from achieving slam-dunk predominance.
Where hard-drive technology is concerned, what I find most fascinating are innovations such as shingled (SMR), two-dimensional (TDMR), and heat-assisted (HAMR) magnetic recording. In the face of stiffening competition from SSDs, these and other innovations will continue to deliver incremental improvements in hard-drive scalability -- which remains the technology's strong suit -- over the rest of this decade.
Depending on how these innovations stack up against concurrent SSD improvements, the tipping point could shift out indefinitely. At the very least, these advances in hard-drive technology ensure that enterprise storage administrators will continue to adopt hybrid approaches that involve shifting mixes of both technologies. In this way, the advantages of each technology offset the disadvantages of the other.
"Enterprise storage continues to move rapidly to a hybrid model, where similar techniques, architectures, and developmental models are applied both within and beyond the enterprise's traditional boundary," said Goodwins in his article, adding an on-premises versus cloud-delivered-service dimension to the hybrid discussion.
As Goodwins noted, this form of hybrid requires that IT administrators implement a hardware-agnostic approach to managing disparate storage resources in a distributed, virtualized architecture. This, in turn, depends on deployment of such key technologies as software-defined storage and storage virtualization. By pinning their storage architectures on software-defined virtualization, IT administrators can flexibly mix and match diverse storage technologies -- past, present, and future -- while mitigating the economic and technical risks of overreliance on any of them.
Software-defined storage is driving the demise of storage rip-and-replace. That's because it allows disparate storage resources to be managed centrally and independently of the underlying hardware platforms and other technology-specific details. The software-defined approach supports greater operational efficiency and location-independent data mobility across heterogeneous, distributed storage resources within a private, hybrid, or public cloud.
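The hardware-agnostic layering described above can be illustrated with a minimal sketch: an abstract storage interface, two interchangeable backends standing in for an SSD pool and an HDD pool, and a controller that places data on a tier by policy. All class and method names here are hypothetical illustrations, not any vendor's actual API, and real software-defined storage stacks are far more sophisticated (replication, migration, telemetry-driven placement).

```python
# Hypothetical sketch of software-defined tiering: names and the
# size-based placement policy are illustrative assumptions only.
from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """Abstract interface that hides device-specific details."""

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def read(self, key: str) -> bytes: ...


class InMemoryBackend(StorageBackend):
    """Stand-in for a real SSD or HDD pool (in-memory for the sketch)."""

    def __init__(self, name: str) -> None:
        self.name = name
        self._store: dict[str, bytes] = {}

    def write(self, key: str, data: bytes) -> None:
        self._store[key] = data

    def read(self, key: str) -> bytes:
        return self._store[key]


class TieringController:
    """Routes small, latency-sensitive objects to the 'fast' (SSD-like)
    tier and large objects to the 'capacity' (HDD-like) tier."""

    def __init__(self, fast: StorageBackend, capacity: StorageBackend,
                 threshold: int = 4096) -> None:
        self.fast = fast
        self.capacity = capacity
        self.threshold = threshold
        self._placement: dict[str, StorageBackend] = {}

    def write(self, key: str, data: bytes) -> None:
        tier = self.fast if len(data) < self.threshold else self.capacity
        self._placement[key] = tier
        tier.write(key, data)

    def read(self, key: str) -> bytes:
        # Callers never learn which physical tier served the request.
        return self._placement[key].read(key)


ctl = TieringController(InMemoryBackend("ssd-pool"), InMemoryBackend("hdd-pool"))
ctl.write("hot.cfg", b"small, latency-sensitive")   # lands on the SSD-like tier
ctl.write("archive.bin", b"x" * 1_000_000)          # lands on the HDD-like tier
```

Because callers only ever see the `StorageBackend` interface, either tier could later be swapped for a different drive generation, or for a cloud object store, without touching application code -- which is the point of the software-defined approach.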
In short, we're seeing a future in which legacy storage technologies may endure in real-world deployments long after they're obsolete. That's not a bad thing if it helps storage administrators recoup the full economic value of these investments.
Jim is Wikibon's Lead Analyst for Data Science, Deep Learning, and Application Development. Previously, Jim was IBM's data science evangelist. He managed IBM's thought leadership, social and influencer marketing programs targeted at developers of big data analytics, machine ...