If you look back at the history of computing, you'll notice a cycle: computing becomes more centralized, then more distributed, then centralized again, and so on.
Over the past 10 to 15 years, the centralized model, in the form of cloud computing, was the dominant trend. But now the trend seems to be swinging back toward a distributed model again as edge computing becomes more prevalent.
In edge computing, data processing happens at the edge of the network rather than in a centralized hub. This means that devices at the edge of the network need to have processing and storage capabilities. In practical terms, edge computing takes a lot of different forms. Remote offices that have their own servers and storage on-site are a form of edge computing. Drones, autonomous vehicles, and mobile devices are also examples of where you can apply edge computing. And as these types of devices become more commonplace, the edge computing market is experiencing exponential growth.
According to Grand View Research, the edge computing market was worth $3.5 billion in 2019, and it's growing rapidly. In fact, the firm expects to see a 37.4% compound annual growth rate through 2027, when the market could reach $43.4 billion.
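Those figures are easy to sanity-check, since a compound annual growth rate is just repeated multiplication. A minimal sketch (function and variable names are illustrative, not from the report):

```python
# Compound the Grand View Research 2019 base ($3.5B) at a 37.4% CAGR.
def project_market(base_billions: float, cagr: float, years: int) -> float:
    """Apply constant annual compounding: base * (1 + cagr) ** years."""
    return base_billions * (1 + cagr) ** years

# 2019 -> 2027 spans 8 compounding periods.
print(f"${project_market(3.5, 0.374, 8):.1f}B")  # roughly $44.5B
```

That lands close to, but slightly above, the firm's $43.4 billion figure; the small gap suggests the forecast compounds from a slightly different base period than a simple 2019 start.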
So why is edge computing becoming so popular now?
Edge computing has a couple of big advantages over centralized models like cloud computing. First, if you process data close to where you use it, you can reduce latency. In other words, your devices get faster if you don't have to wait for them to transmit data to the cloud, for the cloud to process it, and for the cloud to send data back to the device again.
Second, if you process some data at the edge, you don't have to transmit as much data to the cloud, which can reduce costs related to data transmission.
This slideshow highlights 10 trends that are making these benefits particularly attractive to enterprises right now and are accelerating edge computing.
1. The Internet of Things (IoT)
You really can't discuss edge computing without discussing the Internet of Things (IoT). These days it's common for everything from home appliances to watches to vehicles to children's toys and even clothing to have an Internet connection. And in the industrial space, Internet-connected sensors and equipment have become the norm. Best estimates from several research firms say that there are currently more than 20 billion endpoints connected to the Internet of Things, and that number could double in the next five years.
All those things generate a lot of data. IDC forecasts the IoT will generate around 79.4 zettabytes (ZB) of data by 2025. Transmitting that much data puts a strain on networks, and it can be expensive. Giving IoT devices the ability to do some processing at the edge of the network reduces the amount of data that they need to transmit, reducing costs and freeing up bandwidth for other things.
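The bandwidth saving is easy to picture. Here is a minimal, hypothetical sketch of edge-side aggregation, in which a device summarizes a window of raw sensor readings locally and uploads only the summary (all names are illustrative, not any particular IoT platform's API):

```python
from statistics import mean

def summarize_window(readings: list) -> dict:
    """Reduce a window of raw readings to one compact summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

# e.g. one minute of temperature samples collected at the edge
raw = [21.3, 21.4, 21.2, 21.5, 21.3, 21.4]
summary = summarize_window(raw)
# Six readings collapse into a single four-field record before upload,
# shrinking the payload the device must transmit to the cloud.
```

The same pattern scales up: a device sampling many times per second can report once per minute and still give the cloud an accurate operational picture.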
2. The COVID-19 Pandemic
It might seem strange to list a worldwide pandemic as a "trend," but the big news of 2020 has definitely accelerated the edge computing trend. Because so many people around the world are now working from home, enterprises have essentially gained thousands of small remote offices. The data processing and storage that happens in all those remote offices qualifies as edge computing. Until someone discovers an effective vaccine or cure for COVID-19, expect this type of edge computing to continue unabated. And even when things go back to "normal," some enterprises expect to continue enabling a large remote workforce, which means continued edge computing.
3. Digital Transformation
Digital transformation has been one of the hottest buzzwords in enterprise technology over the past decade. While the phrase means different things to different organizations, in general, it involves changing an enterprise's processes, operations, and culture to focus on digital ways of doing business.
For many organizations, digital transformation has meant embracing the Industrial Internet of Things (IIoT), smart factories, ecommerce and mobility -- all things that have increased their reliance on edge computing.
4. 5G Headaches
For years, telecommunications companies have been promising that 5G service is just around the corner. When it finally becomes widespread, 5G could transmit data 10 times faster than 4G services. However, technological and regulatory hurdles have repeatedly delayed its rollout. In the meantime, businesses have begun deploying an increasing number of devices that will eventually take advantage of 5G capabilities. For now, however, many are using edge computing to reduce the amount of data they need to transmit over the existing, albeit much slower, 4G networks. Until 5G becomes much more prevalent (and probably even after), edge computing will continue to bridge the gap.
5. Analytics and AI
Enterprises have received the message that analytics, particularly analytics powered by artificial intelligence (AI) and machine learning (ML), can provide insights that will transform their businesses. According to IDC, the analytics market is growing at a 13.2% CAGR and could be worth $274.3 billion by 2022.
In the early days of the analytics trend, organizations were primarily applying their analytics tools to their business data. However, as time passes, they are beginning to expand their analytics to more operational data. In some cases, those analytics are happening at the edge -- in smart factories, retail outlets, remote offices, and other locations -- increasing the trend toward edge computing.
6. Digital Twins
A digital twin is a computerized representation of something that exists in the real world. A product development prototype is a digital twin. So are weather models and virtual reality environments based on the real world.
Thanks to the growth of the Industrial IoT, the majority of enterprises today have created digital twins that reflect the status of their operations. For example, a manufacturer may have a digital twin for each piece of equipment on its factory floor so that it can monitor the status of its production lines. When an enterprise has thousands of sensors transmitting data that gets used to update a digital twin, it requires a whole lot of bandwidth. Many are using edge computing to do some of the necessary processing on-site so that they can transmit a smaller subset of data to the digital twin.
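One common way to shrink that stream is to compute deltas at the edge and forward only the fields that have changed since the twin's last known state. A minimal sketch, with hypothetical names rather than any particular twin platform's API:

```python
def delta_update(twin_state: dict, local_state: dict) -> dict:
    """Return only the fields that differ from the twin's last known state."""
    return {k: v for k, v in local_state.items() if twin_state.get(k) != v}

# Last state the digital twin knows about, and the current on-site reading.
twin = {"rpm": 1500, "temp_c": 71.0, "status": "running"}
local = {"rpm": 1500, "temp_c": 74.5, "status": "running"}

payload = delta_update(twin, local)  # only {"temp_c": 74.5} needs to be sent
twin.update(payload)                 # the twin converges on the device's state
```

When most sensor values are stable between updates, the delta payload is a small fraction of the full state, which is exactly the bandwidth saving described above.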
7. Augmented and Virtual Reality
Augmented reality (AR) and virtual reality (VR) create immersive, highly realistic environments for users. But the graphics necessary to sustain these environments require an awful lot of processing power, particularly in the form of graphics processing units (GPUs). While it is possible to do this processing in the cloud, the latency caused by transmitting the data back and forth results in a lag that makes the AR or VR environment feel less real. For this reason, AR and VR environments almost always rely on devices with edge computing capabilities, such as high-powered headsets with built-in GPUs.
8. Privacy Regulations
The European Union's General Data Protection Regulation (GDPR) imposes strict limits on how personally identifiable information about EU residents can be transferred outside the region. Several other countries have similar mandates that limit where data can be stored. To comply, enterprises have turned to distributed cloud computing models (where data physically resides on servers in disparate countries but is part of a unified cloud) and edge computing. By storing and processing data at the edge of the network, organizations can follow relevant laws while still doing business globally.
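As a rough illustration of that pattern, an edge gateway might choose a storage target based on each record's origin. Everything in this sketch is hypothetical: the region tags, the abbreviated country set, and the routing rule are illustrative, not a real compliance rule set.

```python
EU_COUNTRIES = {"DE", "FR", "IE", "NL"}  # sample subset, not the full EU list

def storage_target(record: dict) -> str:
    """Keep records of EU origin on in-region edge storage."""
    if record["origin_country"] in EU_COUNTRIES:
        return f"edge-{record['origin_country'].lower()}"
    return "central-cloud"

print(storage_target({"origin_country": "DE"}))  # edge-de
print(storage_target({"origin_country": "US"}))  # central-cloud
```

Real residency rules are far more nuanced than a country lookup, but the architectural point stands: the routing decision happens at the edge, before data ever leaves its region.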
9. Smart Cities and Autonomous Things
It might still seem like science fiction, but in the not-too-distant future, autonomous vehicles, smart cities, and AI-powered drones and robots will be part of everyday life. Already, city governments are beginning to build sensors into infrastructure, cars with self-driving capabilities are on the roads, and commercial drones fly overhead. As these things become more commonplace, so will the amount of data processing and storage handled at the edge of the network. As in many other applications, transferring data to the cloud and back again simply takes too long for most of these scenarios, so expect smart infrastructure and smart things to have a growing amount of processing power.
10. Natural Disasters
Fires. Floods. Hurricanes. Global pandemic. 2020 has pointedly demonstrated that enterprises need to have plans in place to deal with these kinds of catastrophes. And climate scientists tell us that these kinds of events will likely become more commonplace in the future.
Disaster planning professionals have long advised organizations to store data in more than one location. While edge computing alone isn't a business continuity plan, storing data both at the edge and in the cloud can be a valid strategy for dealing with disasters. And it's just one more trend accelerating the adoption of edge computing.

Cynthia Harvey is a freelance writer and editor based in the Detroit area. She has been covering the technology industry for more than fifteen years.