Cloud data storage...public, private and hybrid, what's so hard to understand?
Well, the word cloud is the problem. Most people assume cloud storage means public cloud storage because that was the first form of cloud data storage available, with AWS the first to offer it back in 2006. If you go back that far, there weren't many private cloud data storage offerings. There were EMC Centera and Caringo CAStor, and not much else for on-premises private cloud data storage.
Fast forward to 2016, and the ranks of public cloud storage providers have grown with the addition of Google Cloud Storage and Microsoft Azure Blob, while the builders of private cloud storage now include Amplidata (HGST), Caringo, Cleversafe (IBM), Ceph (Red Hat), Cloudian, DDN, EMC, Hitachi, Scality and SwiftStack. Some of these object-based storage vendors also support using their software to build public storage clouds as well as private storage clouds.
Not many private cloud storage vendors support tiering data from a private storage cloud to a public storage cloud. Cloudian can tier data to AWS S3, AWS Glacier and Google Coldline, and is also reported to be working on tiering data to Microsoft Azure Blob. When you tier data from a private storage cloud to a public storage cloud, you have what is somewhat incorrectly referred to as a hybrid storage cloud. Tiering is the better term, because hybrid already means the use of two types of storage media in a single enclosure, as in a hybrid disk drive that combines traditional spinning platters with solid-state storage to improve disk drive performance.
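To make the tiering idea concrete, here is a minimal sketch of an age-based tiering policy of the kind such products implement. The function name, tier labels and the 90-day threshold are illustrative assumptions, not any vendor's actual API:

```python
from datetime import datetime, timedelta

def tier_for(last_accessed: datetime, now: datetime, age_days: int = 90) -> str:
    """Hypothetical policy: objects untouched for more than `age_days`
    move from the on-premises private tier to a public cold tier
    (e.g. an S3/Glacier-class target); everything else stays private."""
    if now - last_accessed > timedelta(days=age_days):
        return "public-cold"
    return "private-hot"

now = datetime(2016, 6, 1)
print(tier_for(datetime(2016, 5, 20), now))  # private-hot
print(tier_for(datetime(2016, 1, 10), now))  # public-cold
```

A real tiering engine adds a lot on top of this (stubs or metadata left behind on the private tier so reads are transparent, scheduled scans, per-bucket policies), but the decision at its core is this simple age test.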
Private cloud storage is right for all of the reasons Cloudian found in its survey. Security, cost, compliance, governance and control over data storage are all important to most SMB and enterprise customers. And there are types of data that SMB and enterprise customers will never keep in a public storage cloud.
The "800-pound gorilla" in the room is unstructured data. Most SMB and enterprise customers automatically assume that public cloud storage is an appropriate place for unstructured data, but it is a mistake to put all of your unstructured data in a public storage cloud. You will pay to keep this data there forever. You will pay more to touch it, and if you want it all back, you will pay for that too.
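That three-part cost model (pay to keep it, pay to touch it, pay to take it back) is easy to put in rough numbers. The per-GB rates below are illustrative assumptions for a worked example, not quoted prices from any provider:

```python
# Illustrative-only rates (assumptions, not any provider's price list):
STORAGE_PER_GB_MONTH = 0.023  # pay to keep it there, every month
READ_PER_GB = 0.01            # pay to touch it (retrieval/request charges)
EGRESS_PER_GB = 0.09          # pay to get it all back (data transfer out)

def monthly_cost(stored_gb: float, read_gb: float) -> float:
    """Recurring monthly bill: storage plus whatever you read back."""
    return stored_gb * STORAGE_PER_GB_MONTH + read_gb * READ_PER_GB

def repatriation_cost(stored_gb: float) -> float:
    """One-time bill to pull the whole data set back on premises."""
    return stored_gb * EGRESS_PER_GB

data_gb = 100_000  # 100 TB of unstructured data
print(f"store + read, per month: ${monthly_cost(data_gb, 5_000):,.2f}")
print(f"pull it all back, once:  ${repatriation_cost(data_gb):,.2f}")
```

At these assumed rates, 100 TB runs about $2,350 a month with modest read activity, and roughly $9,000 in egress alone if you later decide to repatriate it, which is the "you will pay for that too" part of the argument.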
Private cloud storage is like a Swiss Army knife. What you can do with it is virtually unlimited aside from using it as primary data storage for transactional systems or database applications, but even that will likely change when solid state storage like SSDs start appearing in private storage cloud architectures.
The biggest problem facing vendors of private cloud, object-based storage software is that they have too few customers. They have been so busy selling the "scale-out" capabilities of private cloud storage that they have lost sight of the sub-petabyte market, which is many times larger than the market for petabyte-plus private cloud storage. What they should be talking about is storage simplification and the ability of private cloud storage to scale down to meet the needs of customers across a range of use cases.