Will AI and Machine Learning Break Cloud Architectures?

AI and machine learning are powered by lots of data, so much so that one futurist thinks today's cloud architectures aren't enough.

Lisa Morgan, Freelance Writer

June 10, 2019


Businesses must continually evolve their data storage strategies to keep pace with emerging data usage requirements. Most of today's enterprises now have a hybrid cloud architecture. However, as they move more data and workloads to the cloud, they may be in for a surprise when it comes to production-level artificial intelligence and machine learning. Specifically, storing the requisite data may become too expensive to be practical.

In fact, futurist Tom Koulopoulos thinks AI and machine learning will drive the next wave of data storage innovations out of necessity. That's the premise for his latest book, The Bottomless Cloud.

"One of the Achilles' heels of AI and machine learning is their voracious appetite for data," said Koulopoulos. "The problem is that learning has a certain cost associated with it, and in the case of AI and machine learning it's how much it costs to capture, transfer, and store data."

For example, file systems and storage software provider Tuxera estimates that one autonomous car generates between 11 TB and 192 TB of data per day.

"According to Google, a Waymo generates 2 TB of information in a single day. I imagine it will be more like 20 TB per day when they're fully autonomous," said Koulopoulos. "Even at 2 TB per day, over a single year, you'd end up with $3 million-plus in storage costs in the Amazon, Google, or Microsoft cloud. $3 million [worth of storage costs] for a $30,000 Tesla doesn't make a lot of sense."
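As a back-of-the-envelope sketch, the raw storage component of that bill can be estimated by assuming the data accumulates and is never deleted, priced at roughly $0.023 per GB-month (an assumed figure in the neighborhood of S3 Standard list pricing; real bills also include capture, transfer, requests, and egress, which Koulopoulos's "capture, transfer, and store" framing suggests make up much of the difference):

```python
# Rough cumulative storage-cost estimate for one vehicle's data.
# Assumptions (not from the article): flat $0.023/GB-month, data never deleted.
GB_PER_DAY = 2_000          # 2 TB/day, per the Waymo figure quoted above
PRICE_PER_GB_MONTH = 0.023  # approximate object-storage list price (assumption)
DAYS = 365

price_per_gb_day = PRICE_PER_GB_MONTH / 30
# On day d the vehicle has d days of data sitting in storage.
total_cost = sum(GB_PER_DAY * day * price_per_gb_day for day in range(1, DAYS + 1))
end_of_year_tb = GB_PER_DAY * DAYS / 1000

print(f"Data accumulated after one year: {end_of_year_tb:.0f} TB")   # 730 TB
print(f"Raw storage cost over that year: ${total_cost:,.0f}")        # ~$102,419
```

At the 20 TB/day figure Koulopoulos projects for fully autonomous operation, the same arithmetic scales tenfold to roughly $1 million per vehicle per year for raw storage alone, before transfer and egress charges.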

Who will sow the seeds of economic change?

Koulopoulos expects startups to lead the next wave of storage innovations rather than the incumbents. For example, cloud startup Wasabi, one of his clients, promises an 80% reduction in storage costs, up to 6 times the speed of Amazon S3, free egress, and 11 9s (99.999999999%) of data durability.
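For readers unfamiliar with the "11 9s" shorthand, it translates to an expected annual object-loss probability of about one in 100 billion. A quick sketch of what that means in practice (the object count below is illustrative, not a figure from the article):

```python
# What "11 9s" (99.999999999%) annual durability implies in expectation.
durability = 0.99999999999
annual_loss_prob = 1 - durability   # roughly 1e-11 per object per year
objects_stored = 10_000_000         # illustrative: ten million stored objects

expected_losses_per_year = objects_stored * annual_loss_prob
print(f"Expected objects lost per year: {expected_losses_per_year:.4f}")
print(f"On average, one loss every {1 / expected_losses_per_year:,.0f} years")
```

In other words, a customer storing ten million objects would expect to lose a single object roughly once every 10,000 years, on average.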

Although the AWS and Microsoft Azure teams were invited to share their views on the future of storage for this story, both declined to participate.

Naturally, Amazon, Google and Microsoft are well aware of the startup activity in the space. 

"AI is kind of a dead man walking if we don't figure this out," said Koulopoulos. "You can demonstrate driverless cars, you can have a fleet of a few hundred of them maybe, and if you're Google you have a lot of storage to work with, so the cost is not a big issue when demonstrating the viability of the technology. But at scale, it just is not tenable, it's not sustainable. We have to change the economic model of storage in order to create some of the early proof points for AI."

Amazon, Google, and Microsoft aren't in any imminent danger, nor does Koulopoulos expect them to drop out of the data storage ecosystem. However, when new innovations enable new economic models, business leaders tend to choose the economically favorable option.

"In the next five to 10 years, I think it really comes down to revisiting the basic business model of cloud data storage," said Koulopoulos. "Wasabi caught my attention because they're doing the same thing Amazon, Google and Microsoft are doing and they're doing it at 10% of the price. It's not the data storage that costs, it's the business model being used."

In the short term, existing technologies will be optimized while fundamentally different data storage technologies will emerge. Collectively and individually, these solutions will enable better outcomes that can be achieved faster and more cost-effectively.

"I see an ecosystem developing with a lot of smaller players that are going to create a lot of disruption, a lot of commotion and eventually the Googles, Amazons and Microsofts will follow suit to some degree," said Koulopoulos. "I think they will have to change their models, but they won't change them willingly for some time because they don't have to."

Test your own theory

The best way to get a sense of what's next is to experiment with emerging technologies so their value can be compared and contrasted with what's already in place.

"A lot of companies are talking about AI, but they're experiments. Once you move past those experiments, you realize the importance of data storage as a cost and you start to shift your thinking a little bit and build applications that may not be built on Amazon's S3 but built on one of these other ecosystem players' models," said Koulopoulos. "You need to experiment and test and see what the value is."


Greater storage capacity at lower cost creates opportunities to do things that were not economically feasible before, such as making backups of backups or unleashing AI and machine learning on even larger corpora of data.

"If I look at any cool business model in any industry from healthcare to education to retail, ultimately, it's the data business model and data is limitless," said Koulopoulos. "If we figure out the economics of storing that data, then we truly have infinite abundance which results in infinite innovation and infinite business models. It's huge value for us going forward, and I'm convinced that we'll be able to solve some huge problems that we haven't been able to solve [before]."

About the Author(s)

Lisa Morgan

Freelance Writer

Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to various publications and sites ranging from SD Times to the Economist Intelligence Unit. Frequent areas of coverage include big data, mobility, enterprise software, the cloud, software development, and emerging cultural issues affecting the C-suite.
