Microsoft Azure Named Top Cloud Performer
Google ranked second, Amazon Web Services sixth in Compuware's CloudSleuth monitoring tests; methodology is likely to be debated.
Microsoft's Azure leads the pack in cloud performance after 12 months of testing, according to CloudSleuth, one of the few monitoring services that's assembled a year's worth of data on cloud providers.
Google's App Engine ranked number two. GoGrid, an infrastructure-as-a-service provider operating from a Virginia data center, was third. OpSource, now owned by NTT of Japan, was fourth, and Rackspace fifth. Amazon's EC2 cloud ranked sixth. The top three rankings are the same as those reported by CloudSleuth in late winter.
CloudSleuth is a performance monitoring service that publishes the response times it collects daily from 25 different cloud providers. Every 15 minutes, 30 testing sites request downloads from a standard application, with each cloud site hosting an identical copy, and the response times are recorded. The application mimics a common Internet interaction: the download of a catalogue-type page, followed by a second download, triggered by the completion of the first, of a page with a large image on it.
The tests, however, are not a precise measure of performance inside the cloud. They are just as likely to reflect the switch and router latencies of the network between the cloud data center and the closest Internet connection point. In that sense, the tests show the level of performance visible to end users, and the statistics are indicators of the relative overall performance your workload would see if placed in one of the targeted clouds.
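To see why such an end-to-end number blends network and in-cloud time, here is a minimal Python sketch (not CloudSleuth's or Gomez's code) that crudely splits a single fetch into time-to-first-byte, dominated by network latency plus server start time, and body-transfer time; the URL is a placeholder.

```python
import time

import requests


def profile_fetch(url):
    """Split one end-to-end fetch into time-to-first-byte (mostly network
    latency plus server start time) and body-transfer time (bandwidth plus
    remaining server work). A crude split, but it shows how a single
    response-time figure mixes network and in-cloud components."""
    start = time.monotonic()
    resp = requests.get(url, stream=True, timeout=30)  # returns once headers arrive
    ttfb = time.monotonic() - start
    body = resp.content                                # force the full download
    total = time.monotonic() - start
    return {"ttfb_s": ttfb, "transfer_s": total - ttfb,
            "total_s": total, "bytes": len(body)}


if __name__ == "__main__":
    # Placeholder URL; any publicly reachable page works for the illustration.
    print(profile_fetch("https://example.com/"))
```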
The tests are launched from 30 nodes scattered around the world on network backbones and operated by Compuware's Gomez network-performance monitoring unit. Compuware owns CloudSleuth and uses it to showcase the capabilities of Gomez, its more extensive, end-user-oriented performance monitoring network, which underlies CloudSleuth.
Every 15 minutes, the nodes function as "headless" users requesting the catalogue-type page download from the application. The first page carries 40 small objects, mimicking the processing required for a catalogue page, while the second is a single large object. The measured performance figure is the combined time for both downloads to complete, the sum of the two steps.
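CloudSleuth's actual harness runs on Compuware's Gomez platform and is not public, but the measurement as described can be sketched in a few lines of Python. The URLs below (CATALOG_PAGE, IMAGE_PAGE) are placeholders standing in for the identical sample application deployed to each cloud, and a single GET stands in for rendering each page.

```python
import time

import requests

# Hypothetical URLs standing in for the sample application deployed to each
# cloud; one GET per page stands in for the catalogue page (with its roughly
# 40 small objects) and the single large image.
CATALOG_PAGE = "https://sample-app.example/catalog"
IMAGE_PAGE = "https://sample-app.example/large-image"


def timed_get(url):
    """Fetch a URL and return the elapsed wall-clock time in seconds."""
    start = time.monotonic()
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return time.monotonic() - start


def run_transaction():
    """Mimic the two-step test: the second download starts only after the
    first completes, and the reported figure is the sum of the two."""
    step1 = timed_get(CATALOG_PAGE)
    step2 = timed_get(IMAGE_PAGE)
    return step1 + step2


if __name__ == "__main__":
    # A monitoring node would repeat this every 15 minutes and report the
    # result for aggregation across all 30 test locations.
    print(f"combined response time: {run_transaction():.3f} s")
```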
Only fractions of a second separate the leading providers, although such margins can count as a key competitive advantage for e-commerce applications. The two-page download from Microsoft's 700,000-square-foot Azure center outside Chicago took 6.072 seconds. Runner-up Google App Engine took 6.445 seconds; GoGrid, 6.723 seconds; OpSource, 6.980 seconds; Rackspace, 7.192 seconds; and Amazon EC2, 7.204 seconds.
Note that only about one-hundredth of a second (7.204 versus 7.192 seconds) separates Rackspace in the number five spot from Amazon at six, although a more significant 1.1 seconds separates number one Microsoft from Amazon.
Also, not all cloud sites owned by the same service provider performed the same in the tests, again raising the question of whether it's the data center, the network, or the distribution of testing locations that performs differently.
Amazon had four of its five EC2 sites included in the results. Its fifth, in Japan, was launched in February and was not available for the full 12 months of testing, which began Aug. 1, 2010, and ended July 31, 2011. Its top-performing site in the tests was its U.S. East data center in northern Virginia, with the sixth-place mark of 7.204 seconds. U.S. West came in ninth at 8.111 seconds. EC2's Dublin, Ireland, site placed 13th at 11.724 seconds, and its Singapore site was 17th at 20.957 seconds, last in the group for which Compuware released statistics.
Likewise, GoGrid's second data center in northern California was 11th at 8.770 seconds, and Microsoft's second Azure data center in Singapore came in 15th at 16.103 seconds. Not only do the distances and network connections to the data centers vary in the testing, but the construction of each data center also likely varies with when it was built, hence the differing results for the same vendor.
The other vendors represented by the 17 cloud sites and their places were: seventh, Teklinks in Alabama at 7.521 seconds; eighth, BitRefinery in Colorado at 7.653 seconds; 10th, Terremark in Miami at 8.617 seconds; 12th, CloudSigma in Switzerland at 11.071 seconds; 14th, IIJ GIO in Japan at 15.794 seconds; and 16th, IT Clouds in Australia at 19.181 seconds.
CloudSleuth's methodology is likely to be probed and criticized by the cloud providers, particularly those that fared less well in the tests. One avenue of criticism is that the location of the 30 test nodes on network backbones is arbitrary. Such a limited number of sites could not be positioned equidistant from each cloud data center. The sites were selected to "reflect where Internet traffic originates," said Ryan Bateman, product marketing manager for CloudSleuth, in an interview. In other words, there are more sites in Europe and North America than Africa and Latin America, and more test sites on the two U.S. coasts than in the Midwest.
Nevertheless, CloudSleuth provides one of the few measures of comparative cloud performance as seen by end users, said Lloyd Bloom, senior product manager at Compuware, in an interview. "We thought it was important to get these results into the public domain. It's important that you consider performance when you consider a cloud vendor," said Bloom.