Will Edge Computing Replace the Cloud?

New technology breakthroughs plus the IoT are driving new interest in edge computing.

Andrew Froehlich, President & Lead Network Architect, West Gate Networks

May 23, 2017


Although we're only beginning to scratch the surface when it comes to the capabilities of cloud applications, many are already contemplating the successor to cloud architectures. One such technology being discussed is called edge computing. This is a concept where computation that's currently centralized inside the cloud is moved back to the edge in a distributed manner. Doing this provides key benefits that, proponents argue, make it superior to traditional cloud architectures. But will it eventually replace the cloud in the enterprise? That's our topic of discussion today.

The notion of edge computing is nothing new. In fact, prior to cloud computing, we relied on our PCs to provide the lion's share of our computational needs, running applications locally. But the cloud has changed all that. Now, our devices simply function as front-end terminals that interact with applications and data operating in the cloud. This model ended up being more efficient and cost-effective than distributed computing models. Additionally, devices could be made smaller and cheaper because they no longer needed the processing and storage capabilities of old.

But times are changing. We can now manufacture tiny yet powerful system-on-chip (SoC) devices at very low cost. Additionally, the Internet of Things (IoT) movement is creating an environment where a centralized computing model may not be ideal. As millions of connected devices are added and begin collecting data to be processed centrally in the cloud, it becomes clear that we are going to bump into issues such as network congestion and latency. That's why proponents of edge computing are so interested in it. By pushing computation duties back to the edge, you alleviate both problems -- and ultimately improve overall application performance.
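To make the traffic argument concrete, here is a minimal sketch of the pattern in Python, assuming a hypothetical temperature-sensor gateway: rather than streaming every raw sample to the cloud, the edge device buffers a window of readings and sends only a compact summary upstream. The names (EdgeGateway, flush_to_cloud, the window size) are illustrative assumptions, not a reference to any particular IoT platform.

```python
# Hypothetical edge gateway: buffer raw sensor samples locally and ship
# only small summaries to the cloud, cutting uplink traffic and avoiding
# a network round trip per reading. All names are illustrative only.
import json
import statistics
from typing import Dict, List


class EdgeGateway:
    def __init__(self, window_size: int = 100):
        self.window_size = window_size
        self.buffer: List[float] = []

    def ingest(self, reading: float) -> None:
        """Accept one raw sample; flush a summary once the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush_to_cloud()

    def summarize(self) -> Dict[str, float]:
        """Reduce the buffered window to a handful of statistics."""
        return {
            "count": len(self.buffer),
            "mean": statistics.fmean(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
        }

    def flush_to_cloud(self) -> None:
        """Send one small payload instead of window_size raw samples."""
        payload = json.dumps(self.summarize())
        # In a real deployment this would be an HTTPS or MQTT publish.
        print(f"uplink payload ({len(payload)} bytes): {payload}")
        self.buffer.clear()


if __name__ == "__main__":
    gateway = EdgeGateway(window_size=5)
    for sample in [21.3, 21.4, 21.6, 21.5, 21.7, 21.8, 21.9, 22.0, 21.8, 21.7]:
        gateway.ingest(sample)
```

With a window of 100 samples, the gateway sends one summary where a purely cloud-centric design would have sent 100 individual readings, which is where the congestion and latency savings come from.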

Edge computing does seem to make complete sense for a use case such as IoT. But before you start rearchitecting your enterprise infrastructure from a cloud-centric to an edge-centric model, let's take a step back and look at the bigger picture. Yes, edge computing can indeed provide performance benefits such as real-time computation and reduced dependence on network connectivity into the cloud. But at the same time, you give up a great deal of what makes cloud computing so appealing, including simpler overall management of applications and data, since they are centralized. Additionally, distributed applications are far more complex to create, deploy and support. Any cost savings gained from improved application performance may be negated.

In the end, I don't see edge computing replacing the centralized computing model that the cloud provides. While it's true that there are network-related performance issues that need to be addressed, moving back to a fully distributed model is a bit extreme. Instead, I agree with those who envision a hybrid model in which cloud providers begin deploying small-scale data centers in strategic geographic locations, moving data processing closer to the user while remaining centrally managed. This would essentially provide the same benefits as edge computing without relinquishing control.

Additionally, even in situations where edge computing does make sense today -- as is the case with IoT -- we still need the cloud. While it makes sense to perform portions of IoT analysis at the edge, the data will still likely need to be collected and stored centrally. A cloud architecture is the obvious way to accomplish that goal. Ultimately, the cloud would remain an integral part of any edge computing environment.
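As a rough illustration of that split, the sketch below keeps the latency-sensitive decision at the edge while periodically handing compact records to central cloud storage. It is a toy under stated assumptions: HybridEdgeNode, cloud_store, and the threshold logic are hypothetical names standing in for whatever alerting rules and cloud sink (object store, time-series database, etc.) a real deployment would use.

```python
# Hypothetical hybrid edge node: react to readings locally in real time,
# but still batch summaries up to a central cloud store. All names are
# illustrative assumptions, not a specific product's API.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class HybridEdgeNode:
    alert_threshold: float
    cloud_store: Callable[[Dict], None]      # injected cloud sink
    pending: List[float] = field(default_factory=list)

    def process(self, reading: float) -> None:
        # Real-time path: decide locally, with no round trip to the cloud.
        if reading > self.alert_threshold:
            print(f"EDGE ALERT: reading {reading} exceeds {self.alert_threshold}")
        self.pending.append(reading)

    def sync(self) -> None:
        # Batch path: periodically push a compact record to central storage.
        if not self.pending:
            return
        self.cloud_store({"samples": len(self.pending), "max": max(self.pending)})
        self.pending.clear()


if __name__ == "__main__":
    node = HybridEdgeNode(alert_threshold=75.0,
                          cloud_store=lambda record: print("cloud stored:", record))
    for reading in [70.2, 71.0, 76.5, 69.8]:
        node.process(reading)
    node.sync()  # in practice, run on a timer or whenever connectivity allows
```

The design choice mirrors the point above: the edge handles what cannot tolerate a round trip, and the cloud remains the system of record for whatever is collected.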

We have the privilege of witnessing a unique time when we will soon see everything from automobiles to coffee makers being connected to simplify and ultimately improve our lives. And while much of this may indeed leverage the concept of edge computing from a data processing standpoint, it's not the be-all and end-all technology that some make it out to be. Yes, there will be situations where edge computing makes complete sense. But there will be far more instances where the traditional centralized cloud model is still the right choice.

About the Author

Andrew Froehlich

President & Lead Network Architect, West Gate Networks

Andrew has well over a decade of enterprise networking experience under his belt, gained through his consulting practice, which specializes in enterprise network architectures and data center build-outs, and through prior roles at organizations such as State Farm Insurance, United Airlines and the University of Chicago Medical Center. Having lived and worked in Southeast Asia for nearly three years, Andrew possesses a unique international business and technology perspective. When he's not consulting, Andrew enjoys writing technical blogs and is the author of two Cisco certification study guides published by Sybex.
