Touchscreen Ultrabooks might be cutting edge today, but Intel CTO Justin Rattner says they're nothing compared to the next generation of smart sensors and wearable technology just around the corner.
Intel CTO Justin Rattner was trying out a new line when he sat down with InformationWeek last week during CES. "Someone asked me, 'Has anything impressed you?'" he said of the show. "I started to give my regular answer, which is, 'Oh, I can't possibly see everything at CES.' But then it popped into my head and I said, 'No, if it's here, it's already old news.'"
Rattner was mostly joking, but his comment set the tone for the half-hour conversation that followed. Nearby, thousands of companies, including Intel, were promoting today's definition of the cutting edge. Rattner's attention, though, was fixed on a more distant horizon. Touch-enabled Ultrabooks and smartphone apps might get the headlines today, he suggested, but in coming years, technology will achieve true contextual awareness, learning to manage not only our social preferences and schedules but perhaps even our DNA.
Touch-enabled devices have been making waves for the last few years. When asked what comes next, Rattner replied, "Going forward, sensing is going to be a big deal ... It's what's going to take us from this generation, largely inspired by the iPhone, to the next generation of devices that finally transcend this, that really embody this notion of pervasive access to information."
He described the process in terms of not only "hard sensors" that track physical attributes such as light, heat, pressure and motion, but also "soft sensors" such as a user's calendar, social network activity and Web browsing habits. "What context awareness does is collect all of that, some of which is up-to-the-minute on the physical sensors and some of which is accumulated incrementally over a long expanse of time through these soft senses, to create devices that really anticipate your needs," he said.
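The fusion Rattner describes can be sketched in a few lines of code. This is a hypothetical illustration, not any real Intel API: the sensor fields, thresholds, and rules below are all invented assumptions about how up-to-the-minute "hard" readings might be blended with slowly accumulated "soft" data to anticipate a user's needs.

```python
from dataclasses import dataclass

# Hypothetical sketch of hard/soft sensor fusion for context awareness.
# All names, fields, and thresholds are illustrative assumptions.

@dataclass
class HardSensors:
    ambient_light: float   # lux; an up-to-the-minute physical reading
    motion: float          # 0.0 (still) to 1.0 (running)

@dataclass
class SoftSensors:
    next_event: str        # drawn from the user's calendar
    minutes_until: int     # time remaining before that event

def infer_context(hard: HardSensors, soft: SoftSensors) -> str:
    """Blend instantaneous physical readings with accumulated
    personal data to anticipate what the user needs next."""
    if soft.minutes_until <= 15 and hard.motion > 0.5:
        return f"Hurrying to '{soft.next_event}'; surface directions"
    if hard.ambient_light < 10 and hard.motion < 0.1:
        return "Likely asleep; hold notifications"
    return f"Upcoming: '{soft.next_event}' in {soft.minutes_until} min"

print(infer_context(HardSensors(ambient_light=400.0, motion=0.8),
                    SoftSensors(next_event="design review", minutes_until=10)))
```

A real system would of course learn these rules from long-term behavior rather than hard-code them; the point of the sketch is only the shape of the inputs, fast physical signals on one side and slowly built personal context on the other.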
Rattner said such tools will "be like your best friend" because they will "know where you are in space and time, and understand the relationships you have with other people and other things." The challenge for programmers and engineers, he concluded, is "with that [sensing] knowledge, what will these future applications look like?"
If a proliferation of sensors will define the next major inflection point in personal technology, big data challenges will have to be overcome to make all the information actionable. "You're discovering the knowledge that's buried in this mountain of data," Rattner remarked.
He said cities provide a "perfect laboratory for looking at that stuff," explaining that Intel is currently involved in collaborative research with Imperial College London and University College London to develop the analytic tools necessary to understand what the data can reveal about the British capital.
"There's an opportunity at city scale to crowdsource a lot," Rattner said. To illustrate, he explained that in the event of a public health emergency, such as a hazardous materials spill, information aggregated from millions of sensors could help first responders act more quickly and effectively. "It may turn out that most of those sensors are things we carry around, always looking, listening, sniffing," he said.
As another example, he described a project developed by an Intel lab in Berkeley, Calif. Researchers attached particulate sensors to all the street-sweeping vehicles in nearby San Francisco, allowing them to map with high precision which neighborhoods were most affected by particular airborne pollutants. "They didn't know what to do with the data," Rattner said of city officials. He explained that they'd relied on a single particulate sensor as the basis for all previous decision making, and that Intel's report, which used a color-coded system to represent the density of various atmospheric elements, wasn't particularly user-friendly. The officials' difficulty in harnessing the information, he suggested, underscores the big data challenge.
"I think when we did that, we didn't really connect the data with the analytics," he remarked. "It was, 'Here's the data. What are you going to do with it?' Now, we deliver in a form people can understand and relate to and that makes recommendations."
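The difference Rattner draws, between handing over raw readings and delivering color-coded recommendations, can be sketched as a simple classification step. Everything here is an illustrative assumption: the neighborhood readings are made up, and the PM2.5 bands are loosely modeled on common air-quality scales, not taken from the Intel study.

```python
# Hypothetical sketch of the color-coding step: mapping per-neighborhood
# particulate readings onto a traffic-light scale so the output "makes
# recommendations" instead of just reporting numbers.

PM25_BANDS = [
    (12.0, "green", "air quality good"),
    (35.4, "yellow", "sensitive groups should limit exposure"),
    (float("inf"), "red", "advise reduced outdoor activity"),
]

def classify(pm25: float) -> tuple[str, str]:
    """Map a PM2.5 reading (micrograms per cubic meter) to a color
    band and an actionable recommendation."""
    for limit, color, advice in PM25_BANDS:
        if pm25 <= limit:
            return color, advice
    raise ValueError("unreachable: final band covers all values")

# Invented sample readings for three San Francisco neighborhoods.
readings = {"Mission": 9.1, "SoMa": 22.7, "Bayview": 41.3}
for neighborhood, value in readings.items():
    color, advice = classify(value)
    print(f"{neighborhood}: {color} ({advice})")
```

The raw numbers on the left are the "here's the data" version; the color and advice on the right are the form "people can understand and relate to."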