General Electric's moniker for the Internet of things is "the industrial Internet." For example, it has a center in Atlanta that remotely monitors about 1,800 GE gas turbines used in electricity power plants worldwide. Sensors on the turbines relay performance data so that GE can anticipate maintenance needs and try to avoid breakdowns.
Software is essential because GE isn't really selling a gas turbine; it's selling the ability to generate power.
GE is increasingly selling service contracts that are less about making repairs and more about guaranteeing performance. It has a pipeline of $147 billion worth of performance-based service contracts that will generate $45 billion in revenue this year, CEO Jeffrey Immelt said in the company's last annual report.
As at UP, however, what's just as interesting is what GE can't do now but expects to do in the coming years.
Until about three years ago, GE had to be selective about the data it collected from those gas turbines. It could keep only about three months' worth--about 50 TB--because its database and analysis tools didn't scale beyond that size. The problem, if you're in the energy business: you couldn't trend the current heat wave against the last few years' heat waves. What's more, the software could analyze data from only the turbine itself, not data points from the steam turbine and heat-recovery equipment around it, which also might hold clues to a pending breakdown.
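To make the retention problem concrete, here's a minimal sketch of the kind of comparison a multi-year archive enables. The machine names, temperatures, and output figures are invented for illustration; this is not GE data or GE's software, just the shape of the question.

```python
from statistics import mean

# Hypothetical multi-year archive of (daily peak ambient temp in F,
# turbine output in MW) during summer heat waves -- illustrative numbers only.
archive = {
    2010: [(102, 140), (104, 138), (101, 141)],
    2011: [(99, 143), (103, 139), (105, 137)],
    2012: [(106, 135), (107, 134), (104, 136)],  # current heat wave
}

def avg_output_on_hot_days(readings, threshold=100):
    """Average MW output on days at or above the temperature threshold."""
    hot = [mw for temp, mw in readings if temp >= threshold]
    return mean(hot) if hot else None

# With only ~3 months of data, only the 2012 row would exist. A multi-year
# archive lets you ask how output under this heat wave compares to past ones.
for year, readings in sorted(archive.items()):
    print(year, round(avg_output_on_hot_days(readings), 1))
```

The point isn't the arithmetic; it's that the loop over prior years is impossible when the database can hold only the most recent quarter.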
Over the past few years, GE has scaled its software so it can collect and use data spanning many years and more equipment. It also acquired analytics vendors such as SmartSignal that let it handle more complicated data, such as the effects of the machines connected to the gas turbine.
The next big obstacle GE's software faces is speed--which it will need to predict the future. Today's software can send alerts when it spots a potential problem, but it isn't nearly fast enough to do "what if" analysis of machines. "People call us up and say, 'Can we overdrive our equipment for the next two hours?'" says GE software CTO Rich Carpenter. Today, those answers are given based on decades of experience and engineering knowledge.
What GE wants to offer is the ability to ask, "Has any machine in our entire system ever had X, Y, and Z factors, and what happened four hours later?" GE's systems today would take about 30 days to answer that question--if they could even answer it. GE's working to combine its data management and analytics software with Hadoop-based data processing to deliver an answer in 30 seconds. GE Software has just tested a prototype architecture that delivers that kind of speed, says Erik Udstuen, business leader for GE's software and services business.
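The query GE describes--find every machine that ever showed a given combination of factors, then look at what that machine logged a few hours later--can be sketched as a simple scan over an event log. This toy version is an assumption about the query's shape, not GE's implementation; at GE's scale, this is the scan that a Hadoop-style system would parallelize to get from 30 days down to 30 seconds.

```python
from datetime import datetime, timedelta

# Toy event log: (machine_id, timestamp, set of factors observed).
# Machine IDs and factor names are invented for illustration.
history = [
    ("T-0007", datetime(2011, 6, 1, 8, 0), frozenset({"X", "Y", "Z"})),
    ("T-0007", datetime(2011, 6, 1, 12, 0), frozenset({"vibration_alarm"})),
    ("T-0412", datetime(2012, 2, 9, 3, 0), frozenset({"X", "Y"})),
]

def what_happened_later(records, factors, lag=timedelta(hours=4)):
    """For every machine that ever showed all the given factors at once,
    report what that same machine logged within `lag` afterward."""
    factors = frozenset(factors)
    results = []
    for machine, ts, present in records:
        if factors <= present:  # all queried factors were observed
            later = [f for m, t, fs in records
                     if m == machine and ts < t <= ts + lag
                     for f in fs]
            results.append((machine, ts, later))
    return results

# "Has any machine ever had X, Y, and Z, and what happened four hours later?"
print(what_happened_later(history, {"X", "Y", "Z"}))
```

A brute-force scan like this is linear in the size of the archive, which is exactly why years of fleet-wide sensor data make the naive approach take weeks and why the work gets distributed.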
"It's being able to predict the future by having a very clear line of sight to the past," says Jim Walsh, global general manager for GE software and services.
Yet GE sees some of the same limits that Tennison sees at UP: sensors are pricey and networks to collect the data can be spotty. It's one thing to cover the 1,800 gas turbines inside power plants in sensors and collect the related data. It's much harder to do that for tens of thousands of wind turbines spread across often remote expanses of the U.S. and China.
This kind of reality check on the Internet of things is essential. Faith in emerging technology can reach the point where people start to assume all data is gettable, and all of it crunchable enough to turn questions into answers, like the magical computers that spit out answers in spy movies.
Union Pacific points to what's possible. Yet at the same time, its goal of driving growth through greater use of analytics, sensors, and networked machines shows how much work still lies ahead.