6 Reasons Private Clouds Aren't Dead Yet
Public cloud use is growing fast, but there are still plenty of reasons to adopt a private cloud. Product development, agility, optimization of resources, and cost containment are among them.
The Cisco Global Cloud Index says that 68% of enterprise workloads will be executed in the public cloud by the end of 2020. Another 24% will be executed in private cloud infrastructure, bringing the total for cloud computing to 92%.
So why doesn't the public cloud simply become the standard, with everyone moving in that direction? It's hard enough to see why the traditional data center will still be hanging around with 8% of the total at the end of 2020. Why does it need to survive at all, and why will private cloud infrastructure be hanging on then as well? Why not realize maximum gains by moving everything into the public cloud? Aren't the largest economies of scale achieved there?
Part of the answer is that even public cloud providers understand that some customers have reasons to keep a portion of their compute load off multi-tenant public cloud servers. Some customers also run large applications where any form of latency is an issue, and they want their workloads to run on a bare metal server, unimpeded by other tenants.
As a result, service providers offer the option of private cloud servers, accessible only through a virtual private network or over a private line.
But there are other reasons private clouds are necessary or desirable, which we'll explore in the following pages. The private cloud isn't necessarily a laggard, a dinosaur waiting for its day of extinction once legacy data centers are gone. In many cases, it's a more specialized beast, designed to fulfill specific purposes that can't easily be met in the public cloud.
If a task is mission-critical to the company, the resources devoted to it are frequently of a higher order: bigger servers, top-of-rack switches, and more heavily instrumented monitoring than the general-purpose public cloud offers.
Remember, service provider AWS is building a cloud for the CIA, rather than putting the CIA in the public cloud.
The CIA needs a private cloud. Here's a look at why you may too.
In 2013, the CIA awarded a $600 million contract to AWS to construct a private cloud for it. It would be located on government premises and built to CIA specifications, but it would be operated by AWS technicians. The CIA knows that there are prying eyes about and needs to keep its computing physically and operationally safe. Renewed competition between Russia and the US, terrorist organizations, and the constant prevalence of malevolent hackers dictated that the agency turn to a private cloud.
But are its reasons so different from the ones confronted by enterprise IT, caught in a highly competitive environment and under threat from outside hackers? All the reasons the CIA decided it needed a private cloud -- primarily as a safe place to analyze masses of data -- could apply to your IT organization and company as well.
Public cloud providers have built resilient architectures that survive most -- but not all -- failure scenarios. On Sept. 18, AWS's popular DynamoDB NoSQL database system, on which several services depend, was knocked offline for five hours inside US East-1 in Ashburn, Va. Other services that depended on it were slowed or brought to a halt as well. Amazon experienced closer to a three-day outage over the Easter weekend in 2011, and Microsoft stumbled into an outage with Azure as it tried to enter the 2012 leap year. We've yet to learn every cloud failure scenario. If your business can't tolerate any time offline, you may need a private cloud.
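To put such outages in context, downtime translates directly into an availability percentage -- the figure a business-continuity plan has to live with. Here's a back-of-the-envelope sketch in plain Python; the outage durations are taken from the incidents described above, while the math itself is generic and not tied to any provider's SLA:

```python
# Back-of-the-envelope availability math: translate hours of downtime
# in a year into an annual availability percentage.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours, ignoring leap years

def availability(downtime_hours):
    """Annual availability (%) given total hours of downtime in a year."""
    return 100.0 * (1 - downtime_hours / HOURS_PER_YEAR)

# A single five-hour outage, like the DynamoDB incident:
print(round(availability(5), 3))    # 99.943
# A three-day (72-hour) outage, like the 2011 Easter-weekend event:
print(round(availability(72), 3))   # 99.178
```

Even one five-hour incident pulls a service below the "four nines" (99.99%) threshold for the year, which is why businesses that can't tolerate time offline weigh a private cloud they control against a shared platform whose failure scenarios are still being discovered.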
You've adjusted to the digital economy, and your DevOps team is capable of iteratively producing software for new products and continuously updating it. But too much of the critical competitive data involved keeps flowing off premises into the cloud and back again. One of the best reasons to build a private cloud is to improve time to market for new products while keeping the gains those products represent under wraps.
OpenCAPI, the open Coherent Accelerator Processor Interface standard, will enable new types of servers to process data at high speed through GPUs and other accelerators that can pull data in from devices attached to the PCI Express bus. We've entered a phase in which servers are being redesigned for cloud use, and those designs will surely find their way into the public cloud. But a private cloud builder who sees technology he or she wants to use can implement it right away and reap the benefits.
No matter how many workloads move to the public cloud, some won't go -- mainly because of sensitive data, or because they're working fine as they are, perhaps on an IBM mainframe. A private cloud can be geared to work with that reality. Instead of investing in expensive consulting skills to refactor such an application and move it into the cloud, optimize the communications link between the legacy app and a private cloud, making use of existing in-house skills.
A legacy data center can't simply stay the same and allow the company to remain competitive. Compute, storage, and networking need to be managed as virtualized resources. Data on their operation needs to be collected until machine learning can analyze the data center's optimum operating combinations. Policies applied by a policy engine need to govern more of the environment, making deployments predictable. In short, a private cloud looks a lot like a software-defined data center, with all the benefits of automation that follow. If you need a legacy data center, convert as much of it as possible into a private cloud, as OpenStack consultant Mirantis and Intel together advocate (PDF).