Enterprise interest in the Internet of Things and artificial intelligence is converging, with AI helping IoT do its job.

Guest Commentary

June 22, 2018

4 Min Read

Artificial intelligence and the Internet of Things are two of tech’s most popular buzzwords. Put them together, and you have a potent combination for handling the mind-boggling amounts of data flooding enterprises from all directions.

Worldwide spending on IoT is expected to reach $1.4 trillion by 2021, according to IDC, as organizations invest in IoT-enabling hardware, software, services and connectivity. IoT is seen as the future of just about everything, from smart-city advances like traffic congestion relief and intelligent street lighting, to better energy management, to industrial robotics and asset tracking, to monitoring of medical equipment and patient condition (not to mention the array of home consumer applications).

All of these devices and sensors (an oft-quoted Gartner prediction puts the number of connected things at 20.4 billion by 2020) produce nearly unimaginable volumes of data. Companies, governments, and other organizations need to be able to collect, parse, and analyze all that data to detect patterns that can drive business decisions.

The more data sources you have, of course, the tougher it is to derive meaningful insights from them. The only viable path to a coherent picture is automated, machine-scale analysis: in other words, AI.

Simple, right? Problem solved? Far from it.

When we talk about managing IoT devices on the edge, we really mean “control planes”: not the devices themselves, but the things that control the things. An example in your home might be the Philips Hue Bridge, which directs your smart light bulbs from your phone or tablet. In a large enterprise setting, it means the cloud infrastructure elements that manage the compute, networking, and storage associated with the devices or sensors. There can be thousands of them.

As IoT deployments proliferate, it becomes exceedingly difficult, inefficient, and expensive to manage them remotely from a centralized cloud or data center without also having a system to gather and analyze the data closer to the source. IoT devices need to be able to collect and process huge amounts of data in near real time, with minimal latency. And companies can rack up huge connectivity bills if they send entire raw streams of sensor data back to the main data center instead of only the most useful information.
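
To make this concrete, here is a minimal sketch in Python of that edge-side reduction: a gateway summarizes a raw sensor stream locally and forwards a compact digest, plus only the readings that cross an alert threshold, to the core. The sensor source, the threshold, and the send_to_core() uplink are hypothetical placeholders.

```python
import random
import statistics

ALERT_THRESHOLD = 80.0  # hypothetical alarm level for this sensor


def read_sensor_batch(n: int = 1000) -> list:
    """Stand-in for a raw sensor stream; a real gateway reads hardware."""
    return [random.gauss(60.0, 12.0) for _ in range(n)]


def send_to_core(payload: dict) -> None:
    """Placeholder for the uplink to the central data center or cloud."""
    print("uplink:", payload)


def process_at_edge() -> None:
    batch = read_sensor_batch()
    anomalies = [round(x, 2) for x in batch if x > ALERT_THRESHOLD]
    # Ship a summary and the outliers, not the entire raw stream.
    send_to_core({
        "count": len(batch),
        "mean": round(statistics.mean(batch), 2),
        "max": round(max(batch), 2),
        "anomalies": anomalies,
    })


if __name__ == "__main__":
    process_at_edge()
```

A thousand raw readings collapse into a payload of a few dozen bytes, which is exactly the kind of connectivity saving described above.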

Thus, a fundamental priority for enterprises today must be developing AI and machine learning (ML) frameworks that can be deployed at the edge, sifting through data there and returning only the most relevant results to the core, where they can be combined with other information from across the organization for global insights.
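
As one illustration of that edge-to-core pattern, the sketch below scores telemetry with a small anomaly-detection model at the edge and forwards only the flagged records to the core. The model choice (scikit-learn's IsolationForest), the two features, and the forward_to_core() hook are all assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest


def forward_to_core(records: np.ndarray) -> None:
    """Placeholder for shipping only the relevant records upstream."""
    print(f"forwarding {len(records)} anomalous records to the core")


rng = np.random.default_rng(42)

# Fit on a window of presumed-normal telemetry (temperature, vibration).
normal = rng.normal(loc=[60.0, 0.5], scale=[5.0, 0.1], size=(500, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# At inference time, keep only what the model flags as anomalous (-1).
incoming = rng.normal(loc=[60.0, 0.5], scale=[8.0, 0.3], size=(200, 2))
flags = model.predict(incoming)
forward_to_core(incoming[flags == -1])
```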

There is no way to do this without some form of AI. The complexity level is simply too high.

The tightening bond between AI and IoT is driven by a sea change in corporate computing. In the past, it was all about how well machines could compute; even in the recent cloud era, it has been about how well clouds can handle elastic compute. Now, it’s all about the data: Where does it make sense to store it, how should it be moved around, and what actions should be taken based on what it tells us?

All of this is prompting a rethink of the architectures and tools required to make AI models easily accessible and reusable at the intelligent edge.

As enterprises extend more and more to the edge, containers are supplanting virtual machines as the go-to technology. The latter are just too heavyweight.

Kubernetes has emerged as the clear winner in container orchestration, and AI is one of its fastest-growing use cases, because it delivers the operational efficiency and agility this edge-to-core dance demands.
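
For a taste of what that looks like in practice, here is a minimal sketch using the official Kubernetes Python client to roll out a containerized inference service as a Deployment. The image name, namespace, replica count, and resource limits are placeholders, not recommendations.

```python
from kubernetes import client, config


def deploy_edge_inference(name: str = "edge-inference",
                          image: str = "registry.example.com/edge-infer:latest",
                          replicas: int = 2) -> None:
    # Reads ~/.kube/config; inside a pod, use config.load_incluster_config().
    config.load_kube_config()

    container = client.V1Container(
        name=name,
        image=image,
        resources=client.V1ResourceRequirements(
            limits={"cpu": "500m", "memory": "512Mi"}),
    )
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1DeploymentSpec(
            replicas=replicas,
            selector=client.V1LabelSelector(match_labels={"app": name}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": name}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment)


if __name__ == "__main__":
    deploy_edge_inference()
```

The same deployment definition can be applied unchanged to a central cluster or a small edge cluster, which is precisely the portability that makes Kubernetes attractive here.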

Kubeflow, a machine learning stack built for Kubernetes, reduces the challenges of building production-ready AI systems: the manual coding needed to stitch together components from different vendors and hand-rolled solutions, and the difficulty of moving ML models between environments without major re-architecture.
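
Here is a sketch of what a Kubeflow pipeline definition looks like, using the kfp SDK's v1-style ContainerOp API: a preprocessing step feeding a training step. The container images and commands are placeholders.

```python
import kfp
from kfp import dsl


@dsl.pipeline(name="edge-to-core",
              description="Preprocess near the edge, then train at the core")
def edge_to_core_pipeline():
    preprocess = dsl.ContainerOp(
        name="preprocess",
        image="registry.example.com/preprocess:latest",  # placeholder image
        command=["python", "preprocess.py"],
    )
    train = dsl.ContainerOp(
        name="train",
        image="registry.example.com/train:latest",  # placeholder image
        command=["python", "train.py"],
    )
    train.after(preprocess)  # enforce ordering between the two steps


if __name__ == "__main__":
    # Compile to a spec that can be uploaded to a Kubeflow Pipelines instance.
    kfp.compiler.Compiler().compile(edge_to_core_pipeline, "edge_to_core.yaml")
```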

Acumos is a new Linux Foundation project to foster a federated platform for managing AI and ML applications and sharing AI models. With a visual workflow for designing AI and ML applications, as well as a marketplace for freely sharing AI solutions and data models, Acumos holds strong promise for relaunching AI applications as containers.
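
For flavor, here is a rough sketch of model onboarding with Acumos's Python client. The scoring function is a toy stand-in, the names are illustrative, and the exact client calls should be treated as assumptions to verify against the Acumos documentation.

```python
from acumos.modeling import Model
from acumos.session import AcumosSession


def classify(temperature: float, vibration: float) -> int:
    """Toy scoring function standing in for a real trained model."""
    return 1 if temperature > 80.0 or vibration > 1.0 else 0


# Acumos infers the model's I/O signature from the type hints above.
model = Model(classify=classify)

session = AcumosSession()
# Package the model locally; pushing to a marketplace instance would use
# session.push() against that instance's onboarding endpoint.
session.dump(model, "edge-anomaly-classifier", ".")
```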

Thanks to these AI-supporting technologies, organizations will be able to focus on data science in an increasingly IoT-dependent enterprise without hitting infrastructure walls. It’s fair to say that success at this endeavor will become a key competitive differentiator for companies across industries.

Together, AI and IoT form a crucial combination, one that will shape corporate data strategies for years to come.

Stephan Fabel is Cloud Architect and Product Manager at Canonical.

