To see the problems in building the so-called Internet of Things, look at the trackside switches in what the railroad industry calls "dark territory." These switches matter -- if one is in the wrong position, a train could veer onto a spur at normal track speed and derail. But these areas are called dark territory because they're lightly used stretches of track in remote areas, with no automated signals and often no power lines or cellular links. Train operators must visually check that each switch is in the right position.
Union Pacific CIO Lynden Tennison would love to have a monitor that does nothing more than tell dispatchers and engineers which position a switch out in dark territory is in. For such a simple task, "it seems like it ought to be a $100 device, just to me and you living in the consumer tech world," Tennison says. But the sensor would need a power source and a communication link, and it would need to be hardened and weather-resistant. His goal: to get the cost to buy and implement each sensor down below $10,000.
Tennison gets a lot of sales calls from analytics software vendors, each promising to help him sort out the data that the Internet of Things can generate for Union Pacific, the largest railroad company in the US. But Tennison's bigger problem is still having to do too much manual data collection. "I keep telling them that if you'll solve my sensor problem and get me a lot of cheap sensors out there that can collect a lot more information for me, I'll buy your analytics engine," he says.
That's the state of the game when it comes to the Internet of Things -- progress, but also frustrating barriers.
Companies in a variety of industries -- transportation, energy, heavy equipment, consumer goods, healthcare, hospitality, insurance -- are getting measurable results by analyzing data collected from all manner of machines, equipment, devices, appliances, and other networked "things."
Union Pacific says it reduced the number of train derailments caused by failed bearings by 75% by doing near-real-time analysis of data collected by sensors along its tracks, and now it's pouring millions of R&D dollars into new techniques, such as accelerometers on trains that feel for bumps that suggest a bad track. GE Power & Water says it helped Dubai Aluminum improve the fuel efficiency of its gas turbines by 1.5% while increasing output 3.4%, by analyzing sensor-collected operating data. Oil and gas company ConocoPhillips thinks it can save about $250 million a year in drilling costs by doing real-time measurement and analysis along the drill line to fine-tune factors such as speed and pressure on the drill bit. FedEx expects to save $9 million a year using sensors on its trucks that let it schedule dock assignments more efficiently.
Companies are moving more cautiously on the customer-facing Internet of Things, but they're making progress as well. John Deere can do remote, wireless diagnostics of some tractors and combines, for example. Guests at Disney World can wear MagicBands equipped with RFID chips that, when placed next to a reader, connect to their accounts and let them make purchases, access rides, and open their hotel rooms.
But companies are also hitting roadblocks. Union Pacific's Tennison says this whole area of "sensor-based, network-based diagnostic and predictive analytics" will be the biggest technology opportunity in his industry for the next 10 or 15 years. "Having said that, it's not moving as fast as I would like," he says.
Whirlpool CIO Michael Heim says "our toe is in the water on connected devices," as the company figures out the kind of connections customers really want in their homes, and what they'll pay for. Heim does see huge potential, and not just the cliché scenario of your refrigerator knowing all its contents and emailing you when the milk's running low. If customers let Whirlpool track appliance usage remotely, that would be a boon to product development, providing a window into what features people really use. What if the fridge told you when temperatures are varying, suggesting a pending failure, or your icemaker lost water pressure, suggesting a busted pipe might be spraying water all over your kitchen? What if your washer could be diagnosed remotely, since many appliances already generate electronic error codes? Even further out, what if people with elderly parents could monitor appliances remotely -- if Dad opened the fridge four times, used the stove, and ran some laundry, he's probably OK.
While Whirlpool product teams are working on all the foundations for this kind of connectivity, Heim says, "those are more futuristic than you think."
Here are the main IoT challenges companies are wrestling with.
The data isn't good enough.
One of the myths about the Internet of Things is that companies have all the data they need, but their real challenge is making sense of it. In reality, the cost of collecting some kinds of data remains too high, the quality of the data isn't always good enough, and it remains difficult to integrate multiple data sources.
Let's start with getting enough data. The cost of a sensor includes not just the device, but also the installation, maintenance, connectivity, and power. And even in tightly controlled environments such as a factory, a lot of legacy equipment wasn't built for Internet connectivity, making security and integration problematic.
"We've come a long way, and we're leveraging the heck out of what we do have out there," Tennison says. "I'm just saying to myself, 'If I had 10 times or 20 times as many collection points as I do today, how much better could we get?' That seems to me right now the biggest problem."
Data quality is a problem that GE Power & Water CIO Jim Fowler is putting in front of his $28 billion-a-year unit's CEO and other company leaders. The monitoring and alerting systems GE is developing for maintenance of its gas and wind turbines, for example, draw on many types of data, including customers' operational data and their inventories of replacement parts.
The data collected today is good enough to improve operations -- GE says wind-power company First Wind, for example, improved energy output 3% from existing turbines by monitoring weather and operating conditions and changing the blade pitch on its turbines for better efficiency. But the data GE is using isn't as timely, complete, or accurate as it wants (timeliness is the biggest challenge). So Fowler has taken two main steps to improve data quality.
One, a year ago he assembled a data science team that includes a data quality group, which is looking for ways to automate areas where employees collect critical data manually. Two, he created a data quality portal that shows the top 10 problems the group is trying to improve. The unit's CEO sees a report monthly.
As CIO, Fowler thinks he should "own every piece of data in our environment," but he also believes business unit leaders must see and understand the data-collection problems that cause bad data, so they can have a hand in fixing those quality problems.
Some data just isn't available. While we hear a lot about early implementations of "smart meters" or Nest-type home automation thermostats that provide real-time insight into power use, there still isn't a lot of energy demand data coming into power generators. This winter was brutally cold, and some power plants saw surges -- and resulting outages -- unlike any they've seen before. If the plants had had accurate data on that rising demand, they could have run at higher capacity. "That linkage is one of the next big areas to look at," Fowler says.
Networks aren't ubiquitous.
Cellular networks cover a lot more ground than they did even a few years ago, and 4G networks are expanding, but once you move beyond big metropolitan areas, you still can't count on cellular.
ConocoPhillips spread its own radio towers across parts of Texas to transmit sensor data in order to optimize gas and oil well production. But newer techniques and sensors could generate 100 times more data than those wells gather and transmit today -- more than the network can handle for real-time analysis. "Wired pipe," for example, lets a driller put fiber optic cable miles down the well to collect sound, pressure, and seismic data in a constant stream. "Then you're talking about gigabytes of data flowing off this all the time," says Richard Barclay, ConocoPhillips' manager of infrastructure and operations.
Does the company need to pipe all that data back to some command center for analysis? If so, how quickly? Or can it crunch some of that data at the wellhead to guide urgent decisions about drill pressure and speed, and send less-urgent data back for historical analysis? Barclay says his team is examining what amount of data, measured at which intervals, people really need to make a decision.
The future Internet of Things model often will combine on-machine processing for urgent needs and batch-data uploads for less timely analysis. Bill Ruh, VP of GE Software, describes this as "real-time, big data processing at the machine. We don't have anything like that today."
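A minimal Python sketch can make that split concrete. The thresholds, field names, and callbacks below are illustrative assumptions, not any real GE or ConocoPhillips interface: urgent readings trigger an immediate local action at the machine, while routine samples are summarized and batched for slower upstream analysis.

```python
import statistics
from collections import deque

# Hypothetical values for illustration -- not real drilling parameters.
PRESSURE_ALERT_PSI = 9000       # act immediately above this
UPLOAD_BATCH_SIZE = 60          # samples summarized per upstream batch

class EdgeProcessor:
    """Sketch of on-machine processing for urgent needs plus
    batch uploads for less timely, historical analysis."""

    def __init__(self, send_alert, upload_batch):
        self.send_alert = send_alert        # low-latency local action
        self.upload_batch = upload_batch    # slower link to the data center
        self.buffer = deque()

    def ingest(self, pressure_psi):
        # Urgent path: decide at the wellhead, no network round trip.
        if pressure_psi > PRESSURE_ALERT_PSI:
            self.send_alert(f"pressure spike: {pressure_psi} psi")

        # Non-urgent path: accumulate, then ship a compact summary
        # upstream instead of every raw sample.
        self.buffer.append(pressure_psi)
        if len(self.buffer) >= UPLOAD_BATCH_SIZE:
            samples = list(self.buffer)
            self.buffer.clear()
            self.upload_batch({
                "count": len(samples),
                "mean": statistics.mean(samples),
                "max": max(samples),
            })
```

The design choice the sketch illustrates is exactly the trade-off Barclay's team is weighing: only the summary (and any alert) crosses the constrained network, while raw samples stay at the edge.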
For companies tying the Internet of Things into their products, the concern is less that there isn't a network available. Instead, the worry is that the network becomes key to the customer experience, yet it's something the product maker doesn't control. "You're dealing with almost a massive outsourcing of your brand, in terms of the eyes of the customer," says Alex Brisbourne, president of Kore, a company that provides wireless machine-to-machine connectivity.
Integration is tougher than analysis.
Connected devices and machine-to-machine communication no doubt generate a lot of data. But analyzing that data for useful insight, surprisingly, isn't among the top barriers to the Internet of Things. "Analytics is the least of our problems," says Union Pacific's Tennison.
Data analysis does take expertise -- data-savvy people who understand the business problems their companies are trying to solve and the opportunities they're trying to seize. Companies often must change their business processes to let employees respond to and use the insights the data analyses present.
But the hard part isn't crunching the data; it's connecting all the systems needed to paint a complete data picture, says Richard Soley, executive director of the Industrial Internet Consortium. A group of big-name companies -- led by AT&T, Cisco, GE, IBM, and Intel -- created the consortium to spur the kind of integration, authentication, and security needed to make such connections easier. Soley describes how hospitals use sensors clipped to a patient's finger to gauge oxygen levels, but those sensors generate false alarms.
Hospitals also use respiratory sensors that tell if the patient's chest is going up and down. The two measure the same "system" for a person -- taking in oxygen -- but are rarely integrated because there is no data standard or platform for doing so. "This literally kills people," Soley says. "... It's a lack of Internet thinking in industrial systems."
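The integration Soley is calling for can be sketched in a few lines of Python. The thresholds and signal names here are illustrative assumptions, not clinical guidance; the point is that cross-checking two independent measurements of the same system filters the false alarms that either sensor raises alone.

```python
# Illustrative thresholds only -- hypothetical, not medical guidance.
SPO2_ALARM_PERCENT = 90     # blood-oxygen saturation floor
RESP_ALARM_BPM = 8          # breaths-per-minute floor

def should_alarm(spo2_percent, resp_rate_bpm):
    """Alarm only when both independent measurements of the same
    system -- taking in oxygen -- point to a real problem."""
    low_oxygen = spo2_percent < SPO2_ALARM_PERCENT
    low_breathing = resp_rate_bpm < RESP_ALARM_BPM
    # A lone low-SpO2 reading with normal breathing is often just a
    # loose finger clip; requiring agreement suppresses that alarm.
    return low_oxygen and low_breathing
```

What makes this hard in practice isn't the logic, which is trivial, but getting the two device vendors' data onto a common platform at all -- the missing standard Soley describes.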
Companies spend 90% of their IoT budgets on those kinds of integrations, leaving insufficient money to drive the operational changes that actually produce the returns, says Ton Steenman, Intel's IoT business leader. "It should be 10% of your money on stitching things together," he says.
More sensor innovation needed.
A typical IoT scenario starts with some kind of sensor collecting data on a device. Look for a wave of innovation and experimentation ahead in what those sensors measure, look like, and cost.
That innovation has begun. In the medical field, a new generation of wearable monitoring technology lets hospital personnel track patients' health after they're discharged in order to avoid readmissions (and associated financial penalties from the health insurers). Auto insurance companies such as Progressive and Allstate offer customers networked devices that analyze and report on their driving habits with the promise of cutting their rates if the results are positive.
Apple last fall rolled out iBeacon, a technology that uses Bluetooth Low Energy to let businesses engage with consumers' iPhones over short distances. For example, a retailer could offer a product coupon when a person puts a phone near a designated iBeacon sensor. "It allows brands to compete for shoppers in the store," says Erik McMillan, CEO of Shelfbucks, a startup that's marketing an iBeacon management platform. Shelfbucks offers a similar platform for near-field communication, but McMillan predicts that retailers will prefer iBeacon's superior reliability. "NFC is such a finicky technology," says McMillan, citing problems with metal interference and with having to put the phone a precise distance from the reader. "Within two years, NFC will disappear," he says.
Not only is Union Pacific experimenting with accelerometers on trains that feel for bumps, it's also running cameras over tracks (at lower speeds than a freight train, for now) and using algorithms to analyze the captured images for cracks and other flaws in railroad ties. The image analysis is catching more than 90% of what human inspectors would, CIO Tennison says. UP is also testing video analysis at higher speeds, via cameras posted along tracks or in railyards, whose footage exposes problems, such as a bent ladder, on trains as they roll along their regular routes.
GE's Ruh calls video "the most underutilized sensor in industrial markets, one of three game-changing classes of sensors we're going to see." Combining video with autonomous drones opens up even more possibilities for industrial monitoring in remote and hazardous areas.
The second big area for innovation: more-refined and more-affordable environmental sensors for measuring the things companies already measure, such as vibration, temperature, and pressure, Ruh says.
The third area of innovation Ruh identifies is the "software-defined sensor," a combination of multiple sensors plus computing power that sits out on a network and "calculates rather than measures."
Companies must start thinking about what and how many sensors they'll need to add to their products to satisfy customer needs. Ruh notes that today's airline jet engines have a "double-digit" number of sensors, more than twice the previous generation, and that the next generation will have a triple-digit number. But it's not just high-tech products that will need sensors. Tennison is talking with makers of composite-material railroad ties about whether it makes sense to embed a sensor that can tell a reader on a passing train when a tie has deteriorated.
Status quo security doesn't cut it.
The IT industry has almost two decades of experience securing datacenters and devices against Internet threats. But that experience could prove a liability if IT professionals don't recognize the differences in the IoT world of operational technology (OT).
"The biggest fallacy is that traditional IT security solves operational technology problems," says Ruh. For example, compared with securing a datacenter, there's much more of a physical dimension to machine-to-machine security -- frequent hands-on maintenance and repair. Taking a turbine down for a security reboot is a huge problem. "What we're finding is that the IT world still doesn't get it," Ruh says. "... When I look at the future of security, it's going to have to be a foundationally different world in OT security than IT security."
For example, Ruh envisions a power plant turbine having cameras and sensors that watch and understand who a person is and what he's doing. Is it a known engineer doing something within his responsibility? Is someone doing something that will cause a failure, due to malice or mistake? Machines need to have smarter tools to question a person making changes or send alerts about changes -- essentially, to protect themselves, Ruh says.
Segregating networks is another security best practice, one that borrows from auto manufacturing, Fowler says. In a car, the network for antilock braking systems is entirely separate from the network for entertainment systems. Diagnostic and monitoring networks are separate from operational ones where possible.
Fowler notes that 30-year-old machine-controller systems weren't written with Internet connectivity in mind. A lot of GE's controller-related R&D is focused on how to rebuild controllers so they can do things like machine-to-machine authentication and certificate-based security.
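As an illustration of what such a rebuilt controller might do, here is a sketch of certificate-based mutual authentication using Python's standard `ssl` module. The private-CA arrangement and file paths are assumptions for the example, not GE's actual design; a real controller would load its keys from a hardware-protected store.

```python
import ssl

def make_controller_context(ca_file=None, cert_file=None, key_file=None):
    """Build a TLS context in which machines authenticate each other
    by certificate -- the capability 30-year-old controllers lack."""
    # The controller acts as a TLS server toward other machines.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    if ca_file:
        # Trust only certificates issued by the plant's private CA.
        ctx.load_verify_locations(cafile=ca_file)
    if cert_file:
        # Present this machine's own certificate to its peers.
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    # Refuse any peer that cannot present a valid certificate.
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx
```

With `CERT_REQUIRED` set on both ends, each machine proves its identity before any command or sensor reading is accepted, which is the machine-to-machine authentication the retrofitted controllers would need.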
Likewise, the Internet will need new standards for machine-to-machine communication. "We're bending the Internet to lots of uses that it wasn't designed to do," says Soley, of the Industrial Internet Consortium.
The Industrial Internet Consortium has working groups developing three main deliverables.
Cost savings today, revenue tomorrow.
Consumer-facing IoT projects are advancing in fits and starts. Google's $3.2 billion acquisition of Nest, a maker of Internet-connected thermostats and smoke detectors, provides one giant proof point of the commercial interest, but there are few other "wow" examples. We've seen a flurry of new Internet-connected wearable devices, but one notable setback is the FuelBand fitness tracking device, which Nike said last month it would stop developing, though it will continue developing fitness tracking software. On the flip side, Facebook just acquired Moves, an app that does fitness tracking via a smartphone.
The clearest IoT successes today are from industrial projects that save companies money, rather than from projects that drive new revenue. Intel's Steenman sees the most interest in areas such as smarter manufacturing and buildings, monitoring and optimization of power and water supplies, and city government uses such as managing and rerouting vehicle traffic.
But even with these industrial projects, companies shouldn't underestimate the cultural change they'll need to manage as machines start telling veteran machine operators, train drivers, nurses, and mechanics what's wrong in their environment and what to do about it.