When data center managers evaluate cloud storage solutions, the primary focus is usually the monthly cost of cloud storage capacity. One of the hidden charges is the transfer, or bandwidth, cost: essentially a fee for the amount of data you access over the cloud. We have seen cases where this cost is as much as, or more than, the cost of the capacity itself. If you move data to a cloud storage service and rarely or never access it again, these costs are not really a problem. But if you migrate data that you will need frequently to a service that charges for access to that data, these costs can get out of hand quickly. How do you keep them under control?
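The split between capacity and transfer charges can be sketched with some simple arithmetic. The rates below are hypothetical placeholders, not any specific provider's pricing; the point is only that egress volume can dominate the bill.

```python
def monthly_cloud_cost(capacity_gb, egress_gb,
                       storage_rate=0.10, transfer_rate=0.12):
    """Estimate a monthly cloud storage bill.

    storage_rate is $/GB/month for stored capacity and transfer_rate
    is $/GB for data read back out of the cloud. Both rates are
    assumptions for illustration; check your provider's price sheet.
    """
    capacity_cost = capacity_gb * storage_rate
    transfer_cost = egress_gb * transfer_rate
    return round(capacity_cost, 2), round(transfer_cost, 2)

# 1 TB stored, and the full data set read back once during the month:
cap, xfer = monthly_cloud_cost(1000, 1000)
# at these assumed rates, the transfer charge exceeds the capacity charge
```

With frequent access, the transfer line item grows linearly with reads while the capacity charge stays flat, which is exactly the trap described above.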
The first step is to be aware that cloud storage transfer costs exist and to figure out how to address them. Many of the major cloud storage providers charge for access or bandwidth utilization, but not all of them do. Review the suppliers on the market and see whether one with no transfer cost, or at least a very low one, can meet your needs.
The second step is to understand how you are going to use cloud storage. As we discuss in our article "What is Cloud Storage", there are many uses for the cloud, and how you use it may affect the relative importance of data transfer costs. For example, if you are using the cloud for backups, transfer costs could be a significant issue, since you may be doing a fair number of restores from the cloud. If you are using it for archiving, where you upload data once and rarely touch it again, transfer costs may be much less of an issue.
The third step is to make sure that the data you move to the cloud has reached a certain level of inactivity. This can be done with storage analysis tools or with file virtualization tools like those from F5 and AutoVirt. In the file virtualization case, these solutions provide a movement layer that makes managing another tier of storage transparent to users and applications.
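A crude stand-in for the storage analysis tools mentioned above is to scan a file tree for files that have not changed in a long time. This sketch uses last-modified time as a cheap proxy for inactivity; real analysis tools track access history far more reliably than filesystem timestamps, which many systems update lazily or not at all.

```python
import os
import time

def inactive_files(root, days=180):
    """Yield paths under root whose last-modified time is older
    than `days` days -- candidates for a cloud archive tier.

    This is an illustrative proxy for inactivity, not a substitute
    for a proper storage analysis or file virtualization product.
    """
    cutoff = time.time() - days * 86400
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    yield path
            except OSError:
                pass  # file vanished or is unreadable; skip it
```

Files that pass a test like this are the low-risk candidates: they are unlikely to be read back, so the transfer charges discussed earlier should stay near zero.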
The final step is to make sure the data is optimized. Ideally this is an engine that can optimize the information before it is migrated to the cloud, or store it in an optimized but still native format. Storwize can compress data and keep it compressed throughout its lifecycle, as can Ocarina Networks, which adds deduplication to compression as well as a data migration capability. Reducing the data footprint in an environment where you are charged by the GB makes obvious sense, and when you factor in the reduced bandwidth consumption, these technologies become a very compelling part of an overall cloud strategy.
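Because the reduced data set is both what you store and what you transfer, optimization cuts both charges at once. This sketch, using the same hypothetical rates as before, shows how a reduction ratio flows through to both line items:

```python
def optimized_costs(raw_gb, reduction_ratio, egress_fraction,
                    storage_rate=0.10, transfer_rate=0.12):
    """Monthly (capacity, transfer) charges after data reduction.

    reduction_ratio: e.g. 3.0 means compression/dedup shrinks the
        data to one third of its raw size.
    egress_fraction: share of the stored (reduced) data read back
        per month.
    Rates are illustrative assumptions, not real provider pricing.
    """
    stored_gb = raw_gb / reduction_ratio
    egress_gb = stored_gb * egress_fraction
    return round(stored_gb * storage_rate, 2), round(egress_gb * transfer_rate, 2)

# 3 TB raw, 3:1 reduction, half the stored set restored in a month:
cap, xfer = optimized_costs(3000, 3.0, 0.5)
```

A 3:1 reduction cuts both the capacity bill and the transfer bill to a third of their unoptimized values, which is why per-GB pricing makes these technologies so compelling.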
As with any other tier of storage, getting full use out of the cloud means understanding its strengths and weaknesses. The cloud offers theoretically lower overall costs, but also slower and potentially more expensive access. Putting the right data in the cloud at the right time is essential to driving maximum value from it.