Amazon Pumps Out New Database and Machine Learning Services
Amazon's announcements at AWS re:Invent continue to extend AWS offerings far beyond basic infrastructure services.
Amazon Web Services, the leading provider of cloud-based IT services, this week announced a slew of new products and features, which it hopes will build on that lead and perhaps keep business customers from defecting to rivals like Microsoft Azure and Google Cloud Platform.
Many of the new services fall into a realm known generally as machine learning, where Amazon faces strong competition from Microsoft and Google.
The news came out of the company’s annual AWS re:Invent conference, which drew an estimated 40,000 AWS customers, partners, competitors, and media.
Most businesses already use some Amazon cloud computing services, which started rolling out in 2006 with the company’s Simple Storage Service or S3. Since then, thousands of startups -- and an increasing number of larger companies like Capital One, Netflix, and General Electric -- have turned to AWS services to build and run their software instead of shelling out money for their own servers and storage.
Keynote speakers this year included executives from Goldman Sachs, PG&E, Expedia, and the National Football League, all touting how they use AWS services.
Amazon wants the many companies that already rely on its basic computing, storage, and networking capabilities to add more complex AWS services to the mix. And those were the types of services it rolled out this week.
As Amazon has moved up the stack to add these capabilities, it faces well-funded and determined competitors: not just the aforementioned Microsoft and Google, but also IBM, Oracle, HPE, and Cisco. These companies rightly see AWS as an existential threat to their core businesses.
But back to the show. Several attendees cited SageMaker, a new tool that AWS executives say will ease the creation of machine learning models, as a high point of the event. AWS executives claim SageMaker can automate the painstaking setup of these models so everyday software developers, as opposed to data scientists, can create them.
“I really think SageMaker will make machine learning more accessible,” AWS CEO Andy Jassy told reporters at the event.
Anton Kropp, senior engineer at Curalate, a Seattle company that focuses on social media analytics, said SageMaker could save engineers time on the drudgery of setting up models so they can focus on more interesting things, like building the smart software that uses those models.
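For developers, that accessibility shows up in SageMaker's API surface: a training job is launched with a single call. The sketch below follows the parameter shape of boto3, the AWS SDK for Python; every name, ARN, and S3 path is hypothetical, and the helper only assembles the request dictionary (what would be passed to `sagemaker_client.create_training_job`) so it runs without AWS credentials.

```python
# Sketch of the parameters behind a SageMaker training job. All names,
# ARNs, and S3 paths are hypothetical; the function only builds the
# request that would be passed to boto3's
# sagemaker_client.create_training_job(**params), so no credentials
# are needed to run it.

def training_job_params(job_name, image, role_arn, bucket):
    return {
        "TrainingJobName": job_name,
        "AlgorithmSpecification": {
            "TrainingImage": image,          # e.g. a built-in algorithm image
            "TrainingInputMode": "File",
        },
        "RoleArn": role_arn,                 # IAM role SageMaker assumes
        "InputDataConfig": [{
            "ChannelName": "train",
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": f"s3://{bucket}/train/",
            }},
        }],
        "OutputDataConfig": {"S3OutputPath": f"s3://{bucket}/output/"},
        "ResourceConfig": {
            "InstanceType": "ml.m4.xlarge",  # managed training hardware
            "InstanceCount": 1,
            "VolumeSizeInGB": 10,
        },
        "StoppingCondition": {"MaxRuntimeInSeconds": 3600},
    }

params = training_job_params(
    "demo-job", "hypothetical-image:latest",
    "arn:aws:iam::123456789012:role/DemoRole", "demo-bucket")
print(params["TrainingJobName"])
```

The drudgery Kropp describes is everything behind these parameters: SageMaker provisions and tears down the training hardware itself, so the developer never touches the servers directly.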
Machine learning, a subset of artificial intelligence, is technology that pores through reams of data, learning as it goes and refining its outputs so it can provide better, more accurate results.
Microsoft, Google, IBM, and others are also on a tear to bring machine learning to the masses. Amazon Alexa, Google Home, and Microsoft Cortana virtual personal assistants all rely on this technology to learn more about what their users want so they can return better results.
Other product enhancements announced at the show included on-demand backup for Amazon’s DynamoDB database. Fernando Medina Corey, a Seattle-based software engineer, was excited about this capability. “We have lots of data in DynamoDB and if something gets corrupted now it can take weeks to fix it.” With the new feature, an administrator will be able to quickly roll back the database to a point before the corruption occurred.
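That rollback workflow maps to two DynamoDB API calls, CreateBackup and RestoreTableFromBackup. A minimal sketch in Python following boto3's parameter shapes; the table and backup names are hypothetical, and the helpers only build the request dictionaries so the snippet runs without AWS credentials.

```python
# Sketch of DynamoDB on-demand backup and restore requests. Table and
# backup names are hypothetical; the helpers only build the parameters
# that would be passed to boto3's dynamodb_client.create_backup(**...)
# and dynamodb_client.restore_table_from_backup(**...).

def backup_request(table_name, backup_name):
    # Taken ahead of a risky change, e.g. a schema migration.
    return {"TableName": table_name, "BackupName": backup_name}

def restore_request(backup_arn, target_table):
    # Restores the backup into a fresh table under a new name.
    return {"BackupArn": backup_arn, "TargetTableName": target_table}

req = backup_request("orders", "orders-pre-migration")
print(req["BackupName"])
```

In a real rollback, an operator would pass the first dictionary to `create_backup` before a risky change, then feed the `BackupArn` from its response into the second call to restore the data into a new table.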
In addition, a new DynamoDB Global Tables feature means that data tables can be automatically replicated across two or more AWS geographic regions, so that users worldwide see the same data without anyone having to manage the data replication process. Google’s Cloud Spanner and Microsoft’s Azure Cosmos DB offer similar features.
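Setting up a global table comes down to a single call naming the replica regions. Another hedged sketch, again only assembling the boto3-style request; the table name and regions are illustrative.

```python
# Sketch of a DynamoDB Global Tables request. The table name and
# regions are illustrative; the helper builds the parameters that
# would go to boto3's dynamodb_client.create_global_table(**params).
# (The table must already exist, under the same name, in each region.)

def global_table_params(table_name, regions):
    return {
        "GlobalTableName": table_name,
        "ReplicationGroup": [{"RegionName": r} for r in regions],
    }

params = global_table_params("sessions", ["us-east-1", "eu-west-1"])
print(len(params["ReplicationGroup"]))  # → 2
```

Once the call succeeds, DynamoDB propagates writes between the regional replicas itself, which is the "without anyone having to manage the data replication process" part of the announcement.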
The dizzying array of announcements was impressive, but it also poses something of a conundrum for current and would-be AWS customers: how do they take advantage of the best new features without relying too heavily on one provider for key IT capabilities?
Most cloud users, if pressed, acknowledge that this is a looming issue.
John Nichols, director of enterprise architecture for PG&E, said in an interview that over the next two years the utility company will focus on evaluating and deploying AWS technologies to cut costs while updating infrastructure.
He cited Amazon’s new DeepLens video camera, which incorporates AWS’ AI smarts and works with SageMaker, as being of special interest for inspection and augmented reality (AR) applications in the field.
But, he also noted, PG&E wants to set up a multi-cloud strategy.
PG&E might, in that respect, mimic what Netflix has done, which is to use AWS for much of its infrastructure but also a second cloud provider -- Google in Netflix’s case -- to store data. Or it might end up using a cloud service broker that would sit between PG&E and various cloud providers. Such a broker could route workloads to the most cost-efficient cloud as needed.
“As an architect, I have to think about how to get to the cloud and also how to get out of the cloud. The thing is if you characterize your workloads well, and if they are stable, they might be cheaper to run on premises,” he said.
The beauty of doing a cloud transition, he added, is that you learn which of your workloads are suited to run in the cloud and which are not.
Barb Darrow has covered technology for more than 20 years, most recently as senior writer for Fortune Magazine.