
George Crump, President, Storage Switzerland

December 18, 2009

3 Min Read

When data center managers evaluate cloud storage solutions, the primary focus is on the monthly cost of cloud storage capacity. One of the hidden charges is the transfer, or bandwidth, cost: essentially the cost for the amount of data you access over the cloud. We have seen cases where this cost can be as much as or more than the cost of the capacity itself!

If you move data to a cloud storage service and rarely or never access it again, these costs are not really a problem. But if you migrate data that you are going to need frequently to a cloud service, and that service charges you for additional access to that data, these costs can get out of hand quickly. How do you keep them under control?
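To make the exposure concrete, here is a back-of-the-envelope sketch in Python. The per-GB rates are illustrative assumptions, not any particular provider's published pricing; the point is simply that reading a full data set back once a month can already match the capacity bill.

# Back-of-the-envelope monthly cost estimate. The per-GB rates below are
# illustrative assumptions, not any particular provider's published pricing.
CAPACITY_RATE_PER_GB = 0.15   # assumed $/GB stored per month
TRANSFER_RATE_PER_GB = 0.15   # assumed $/GB read back out of the cloud

def monthly_cost(stored_gb, downloaded_gb):
    """Return (capacity_cost, transfer_cost) in dollars for one month."""
    return stored_gb * CAPACITY_RATE_PER_GB, downloaded_gb * TRANSFER_RATE_PER_GB

# 5 TB stored, and the same 5 TB read back over the course of the month:
capacity, transfer = monthly_cost(5000, 5000)
print(f"capacity ${capacity:,.2f}  transfer ${transfer:,.2f}")
# Transfer already matches the capacity bill; read the data twice and it exceeds it.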

The first step is to be aware that cloud storage transfer costs exist and to figure out how to address them. Many of the major cloud storage providers charge for access or bandwidth utilization, but not all of them do. Review the various suppliers on the market and see whether the ones without a transfer cost, or at least with a very low one, can meet your needs.
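As a rough way to frame that comparison, the sketch below pits two hypothetical pricing models against each other: "Provider A" with an egress fee and "Provider B" with a higher capacity rate but no transfer charge. Both providers and all rates are assumptions used only to show where the break-even falls.

# Compare two hypothetical pricing models: "Provider A" charges for egress,
# "Provider B" charges more per stored GB but nothing for transfer.
# Both providers and all rates are assumptions, used only to show the break-even.
def total_cost(stored_gb, downloaded_gb, capacity_rate, transfer_rate):
    return stored_gb * capacity_rate + downloaded_gb * transfer_rate

stored_gb = 5000
for downloaded_gb in (0, 1000, 2500, 5000):
    a = total_cost(stored_gb, downloaded_gb, capacity_rate=0.15, transfer_rate=0.15)
    b = total_cost(stored_gb, downloaded_gb, capacity_rate=0.20, transfer_rate=0.00)
    print(f"{downloaded_gb:>6} GB out  A=${a:>8,.2f}  B=${b:>8,.2f}")
# A wins while monthly downloads stay small; B wins once access grows.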

The second step is to understand how you are going to use cloud storage. As we discuss in our article "What is Cloud Storage," there are many uses for the cloud, and how you use it may affect the relative importance of data transfer costs. For example, if you are using the cloud for backups, transfer costs could be a significant issue, since you may be doing a fair number of restores from the cloud. But if you are using the cloud for archiving, where you upload data once and forget about it, transfer costs may be less of an issue.
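A simple model of that difference, using an assumed egress rate and assumed restore fractions rather than measured figures, might look like this:

# Rough model of how the usage pattern drives transfer charges. The egress
# rate and the restore fractions are assumptions, not measured figures.
TRANSFER_RATE_PER_GB = 0.15   # assumed $/GB egress

def monthly_transfer_cost(protected_gb, restore_fraction):
    """restore_fraction: share of the protected data pulled back per month."""
    return protected_gb * restore_fraction * TRANSFER_RATE_PER_GB

protected_gb = 10000  # 10 TB kept in the cloud
print(f"backup, 10% restored per month:  ${monthly_transfer_cost(protected_gb, 0.10):,.2f}")
print(f"archive, ~0.1% retrieved:        ${monthly_transfer_cost(protected_gb, 0.001):,.2f}")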

The third step is to make sure that the data you are moving to the cloud has reached a certain level of inactivity. This can be done with storage analysis tools or with file virtualization tools like those from F5 and AutoVirt. In the file virtualization case, these solutions provide a transparent data movement layer, so the cloud can be managed as another tier of storage without disrupting users or applications.
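For a sense of what those tools are automating, here is a minimal sketch that walks a directory tree and flags files whose last access time is older than a cutoff. The 180-day threshold is an assumption, and access times can be unreliable on volumes mounted with noatime; purpose-built analysis and file virtualization products do this far more thoroughly.

# Minimal sketch of finding "cold" data: walk a directory tree and flag files
# whose last access time is older than a cutoff.
import os
import sys
import time

CUTOFF_DAYS = 180  # assumed inactivity threshold

def cold_files(root, cutoff_days=CUTOFF_DAYS):
    cutoff = time.time() - cutoff_days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # unreadable or vanished file; skip it
            if st.st_atime < cutoff:
                yield path, st.st_size

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    total_bytes = 0
    for path, size in cold_files(root):
        total_bytes += size
        print(path)
    print(f"cold data eligible for the cloud tier: {total_bytes / 1024**3:.1f} GB")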

The final step is to make sure that the data is optimized. Ideally this means an engine that can optimize the information before it is migrated to the cloud, or that can store it in an optimized but native format. Storwize can compress data and keep it compressed throughout the data lifecycle, as can Ocarina Networks, which adds deduplication to compression along with a data migration capability. Reducing the data in an environment where you are charged by the GB makes sense. When you factor in the potential reduction in bandwidth costs, these technologies become a very compelling part of an overall cloud strategy.
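As a crude stand-in for what those optimization engines do inline, this sketch gzips a file before upload and estimates the savings on both capacity and transfer charges, again using assumed per-GB rates; real products apply compression and deduplication transparently rather than file by file.

# Crude stand-in for an inline optimization engine: gzip a file before upload
# and estimate the monthly capacity savings and the savings per full retrieval.
# The per-GB rates are assumptions for illustration only.
import gzip
import os
import shutil
import sys

CAPACITY_RATE_PER_GB = 0.15   # assumed $/GB stored per month
TRANSFER_RATE_PER_GB = 0.15   # assumed $/GB egress

def compress(src, dst=None):
    """Gzip src to dst and return (original_bytes, compressed_bytes)."""
    dst = dst or src + ".gz"
    with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
        shutil.copyfileobj(fin, fout)
    return os.path.getsize(src), os.path.getsize(dst)

if __name__ == "__main__":
    raw, packed = compress(sys.argv[1])
    saved_gb = (raw - packed) / 1024**3
    print(f"compression ratio {raw / packed:.1f}:1")
    print(f"saves ${saved_gb * CAPACITY_RATE_PER_GB:.2f} per month stored and "
          f"${saved_gb * TRANSFER_RATE_PER_GB:.2f} per full retrieval")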

As is the case with any other tier of storage, getting full use out of the cloud means understanding its strengths and weaknesses. In the cloud you are dealing with theoretically lower overall costs, but also with slower access and potentially expensive access charges. Making sure you put the right data in the cloud at the right time is essential to driving maximum value from it.

Track us on Twitter: http://twitter.com/storageswiss

Subscribe to our RSS feed.

George Crump is lead analyst of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Find Storage Switzerland's disclosure statement here.

About the Author(s)

George Crump

President, Storage Switzerland

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. With 25 years of experience designing storage solutions for datacenters across the US, he has seen the birth of such technologies as RAID, NAS, and SAN. Prior to founding Storage Switzerland, he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration, and product selection. George is responsible for the storage blog on InformationWeek's website and is a regular contributor to publications such as Byte and Switch, SearchStorage, eWeek, SearchServerVirtualization, and SearchDataBackup.
