
Google App Engine Caching Catches Up With Amazon

Google App Engine now matches Amazon Web Services with an application caching service, but not, apparently, on price.

Google App Engine gained ground as a developer platform Wednesday when Google added a dedicated, as opposed to shared, caching service that can significantly speed up many applications.

The move puts App Engine on a more equal footing with Microsoft's Windows Azure and Amazon Web Services, which launched its ElastiCache service in August 2011. Platform-as-a-service (PaaS) offerings Heroku and Engine Yard also offer caching services.

The App Engine service is based on open source Memcached (pronounced "mem-cash-dee"), a caching system that automatically saves frequently used data to an in-memory cache and, when necessary, evicts aging data as it falls into disuse. Brad Fitzpatrick created it in 2003 to support the operation of the interactive consumer site LiveJournal; it was later rewritten in C and became a popular open source module.

Caching systems are frequently used with transaction processing and high-activity Web applications. They store data as objects in server memory (or on add-on solid-state modules), retrieving it at high speed without a round trip to a disk-based data store. Facebook, Zynga, YouTube, Twitter, and Reddit are all examples of applications that rely on Memcached.
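The usage pattern described above is commonly called cache-aside: check the cache first, and fall back to the database only on a miss. A minimal sketch in Python follows; the dict-backed client and the `slow_database_lookup` function are stand-ins invented for illustration, not part of Memcached or App Engine — a real deployment would use an actual memcached client library.

```python
class FakeMemcacheClient:
    """Tiny dict-backed stand-in for a memcached client (illustration only)."""

    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)

    def set(self, key, value):
        self._store[key] = value


def slow_database_lookup(user_id):
    # Placeholder for a query against a disk-based data store.
    return {"id": user_id, "name": "user-%d" % user_id}


def get_user(cache, user_id):
    key = "user:%d" % user_id
    user = cache.get(key)                     # fast path: served from memory
    if user is None:
        user = slow_database_lookup(user_id)  # slow path: hit the database
        cache.set(key, user)                  # populate the cache for next time
    return user


cache = FakeMemcacheClient()
first = get_user(cache, 42)   # cache miss: falls through to the database
second = get_user(cache, 42)  # cache hit: served from memory
```

When the cache fills up, Memcached's eviction of least-recently-used entries keeps the hot data resident, which is why the pattern works well for read-heavy sites like those named above.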

[ Want to learn more about Amazon's ElastiCache? See Amazon Goes After Cloud App Speed Bumps. ]

Google App Engine's caching service is available through its API and is still in preview form. It nevertheless allows an App Engine developer to designate an amount of cache reserved for his or her application alone, rather than relying on the platform's shared caching service. Google announced the addition of the service in App Engine release 1.8.2 in a blog post Wednesday.

Google will charge 12 cents per GB for up to 20 GB of dedicated cache; the alternative shared Memcache service remains free. Google's pricing appears to be higher than Amazon's ElastiCache pricing, although an exact comparison is difficult because the amount of CPU power bundled with each caching service may differ: Amazon lists the CPU assigned to each cache node, while Google's "preview" announcement did not.

Amazon ElastiCache's on-demand, standard extra-large cache node is priced at 62 cents per hour for 14.6 GB of memory, or about 4.25 cents per GB. At 12 cents per GB, Google would charge $1.80 for a slightly larger 15 GB cache.
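The arithmetic behind that comparison can be checked in a few lines. This sketch assumes both rates are hourly, as Amazon's on-demand node pricing is; the variable names are illustrative, not from either vendor's documentation.

```python
# Per-GB rate comparison using the figures quoted in the article.
GOOGLE_RATE_PER_GB = 0.12       # dollars per GB, Google dedicated Memcache
AMAZON_NODE_PRICE = 0.62        # dollars per extra-large ElastiCache node
AMAZON_NODE_MEMORY_GB = 14.6    # memory in that node

# Amazon's effective per-GB rate: 0.62 / 14.6 ~= 4.25 cents.
amazon_rate_per_gb = round(AMAZON_NODE_PRICE / AMAZON_NODE_MEMORY_GB, 4)

# What a comparable 15 GB cache would cost at Google's quoted rate.
google_15gb_cost = round(15 * GOOGLE_RATE_PER_GB, 2)

# Google's quoted rate relative to Amazon's effective rate.
price_ratio = round(GOOGLE_RATE_PER_GB / amazon_rate_per_gb, 1)
```

By these figures Google's quoted per-GB rate is nearly three times Amazon's effective rate, though, as noted above, the unlisted CPU allocation makes the comparison inexact.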

The Google blog post said users could move between the shared and dedicated cache services without changing their applications.
