Machine-To-Machine Analytics: Next Big Data Challenge?

Data storage is the tricky issue right now, but M2M data will soon test enterprises even further, says Hitachi exec.

Jeff Bertolucci, Contributor

October 1, 2012

4 Min Read

For many enterprises, today's top big data challenge is finding innovative and affordable ways to store growing volumes of real-time information, such as human-generated data arriving via the Internet. But that's just the beginning. In the coming years, machine-to-machine (M2M) data sharing and analytics will pose an even greater challenge for organizations, according to Asim Zaheer, senior VP of Hitachi Data Systems (HDS).

In a phone interview with InformationWeek, Zaheer discussed HDS's view of the expanding big data landscape, and how the company's enterprise clients are tackling logistical issues related to data management and storage. HDS, the data storage subsidiary of global conglomerate Hitachi, has $4 billion in annual revenue and is headquartered in Santa Clara, Calif.

"We've been talking to the companies we work with--big enterprises, everything from banks to healthcare providers--about big data, and whether it's in their plan," said Zaheer.

A year ago, most of these companies said big data was a very important part of their business strategy, primarily because of the industry buzz around the topic, he said. "It was something they needed to learn about," said Zaheer. "They thought they had to get educated because there might be a train passing by that they need to get on."

Many enterprises were influenced by news reports of retailers using big data technologies, such as the Hadoop management framework and the analytics engines that run on top of it, to mine real-time data for actionable information. Retailers, for instance, use this data to study the behaviors and buying patterns of their customers.
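The kind of analysis the article alludes to typically follows Hadoop's map-and-reduce pattern: emit a key-value pair per record, then aggregate by key. A minimal sketch in plain Python (the purchase records and field names are illustrative, not from the article or any retailer's actual schema):

```python
from itertools import groupby

# Hypothetical purchase records of the sort a retailer might collect.
purchases = [
    {"customer": "a1", "category": "electronics"},
    {"customer": "a1", "category": "groceries"},
    {"customer": "b2", "category": "groceries"},
    {"customer": "c3", "category": "groceries"},
]

# Map step: emit a (category, 1) pair for each purchase record.
mapped = [(p["category"], 1) for p in purchases]

# Shuffle/reduce step: group pairs by key and sum the counts,
# revealing which categories customers buy from most.
mapped.sort(key=lambda kv: kv[0])
counts = {key: sum(v for _, v in group)
          for key, group in groupby(mapped, key=lambda kv: kv[0])}

print(counts)  # {'electronics': 1, 'groceries': 3}
```

In a real Hadoop deployment the map and reduce steps run in parallel across a cluster, but the logic per record is exactly this simple.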

Fast-forward a year, however, and these same companies have adjusted their focus. They're concentrating less on advanced analytics and more on simply finding ways to store the vast amounts of Internet-driven data they're collecting.


It's big data crisis control, in a sense. Organizations first must learn to store huge amounts of human-generated data culled from websites, mobile apps, social media sites, and so on. Next comes the second step: performing analytics on the data.

A slow global economy is hampering both developments.

"Many companies are realizing the value and power of data. But they're grappling with the reality of budget constraints," Zaheer said.

An even bigger challenge is the coming explosion of machine-to-machine (M2M) data. Wireless sensors, for instance, are becoming commonplace in a variety of consumer and industrial devices, including vending machines, healthcare products, home security systems, and parking meters.

Sensors are increasingly ubiquitous in the transportation industry, too. High-speed trains in Japan, for instance, have sensors that check for seismic activity, environmental changes, unexpected rail traffic, and other anomalies.

The 2011 Tohoku earthquake and tsunami off the coast of Japan provided a good example of how sensors, working in conjunction with M2M data sharing and analytics software, can help prevent additional problems during a natural disaster.

The Japanese trains have sensors that continuously monitor the track. "The sensors picked up that there was unusual seismic activity, and the transportation management systems shut down all trains," said Zaheer.

He added, "There were no reports of any accidents or disasters related to train movement. That's because sensors picked up on (the activity) and recorded it to an application, which then acted automatically and shut down all the trains."
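The pattern Zaheer describes, sensor readings feeding an application that acts automatically once a threshold is crossed, can be sketched roughly as follows. The threshold value, function names, and train IDs are purely illustrative assumptions, not details of Hitachi's or Japan Railways' actual systems:

```python
# Illustrative magnitude cutoff for "unusual seismic activity";
# a real system would use calibrated, location-specific criteria.
SEISMIC_THRESHOLD = 4.0

def anomaly_detected(readings):
    """Return True if any sensor reading exceeds the threshold."""
    return any(r > SEISMIC_THRESHOLD for r in readings)

def transportation_management(readings, trains):
    """Automatically stop all trains when sensors report an anomaly."""
    state = "stopped" if anomaly_detected(readings) else "running"
    return {train: state for train in trains}

# One anomalous reading is enough to halt every train on the line.
status = transportation_management([1.2, 4.7, 0.9], ["T1", "T2"])
print(status)  # {'T1': 'stopped', 'T2': 'stopped'}
```

The key M2M characteristic is that no human sits in the loop: the sensors record to an application, and the application acts on every train at once.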

M2M applications will play a major role in big data's future. "Machines talking to machines, sharing information, correlating things going on in one area versus another, and coming up with new conclusions and presenting them to humans to act upon," Zaheer said.

However, this vision of the future poses tactical and financial challenges for enterprises.

"The amount of content they're grappling with, and the need to retain it, is a challenge," said Zaheer. "How do I store all of this stuff? Also, how do I find what I need, when I need it?"

He added, "It's not simply a matter of throwing it all into a storage system. It's also being able to access what you need from anywhere in the world."


About the Author

Jeff Bertolucci


Jeff Bertolucci is a technology journalist in Los Angeles who writes mostly for Kiplinger's Personal Finance, The Saturday Evening Post, and InformationWeek.
