Fresh from demonstrating that its artificial intelligence software can defeat a skilled human player of Go, Google is taking steps to embed more intelligence in its mobile hardware.
To allow machine learning to run locally on mobile devices, Google plans to incorporate silicon and software from chip maker Movidius and will contribute neural network technology to the company.
Google and Movidius have worked together on Project Tango, an initiative to enable real-time three-dimensional location mapping on mobile devices, for better augmented and virtual reality applications. Lenovo earlier this month announced that it intends to release a Project Tango-enabled smartphone later this year.
Machine learning, artificial intelligence, neural networks, and related technology have become areas of intense interest for Google and its peers because these disciplines help mobile devices deal with sensory and navigation applications in the real world. When Larry Page became CEO of Alphabet last year through the reorganization of Google's corporate structure, he gave some indication of the value of machine learning to Google by noting, "Recent launches like Google Photos and Google Now using machine learning are amazing progress."
To continue such progress, Google plans to use the Movidius MA2450, a visual processing unit (VPU) designed to perform neural network calculations at low power, in future mobile devices. A VPU is similar in concept to a graphics processing unit (GPU), silicon tuned to make graphics computation efficient. The distinction is that the Movidius processor is designed specifically for computer vision applications, such as image recognition and text translation.
Blaise Agüera y Arcas, head of Google's machine intelligence group, contends that the availability of mobile machine intelligence will allow technology to enhance the way people interact with the world and will allow new categories of products to be built. "By working with Movidius, we're able to expand this technology beyond the data center and out into the real world, giving people the benefits of machine intelligence on their personal devices," he said in a statement.
Beyond new product possibilities, VPU hardware should make relevant applications more responsive while draining less power. By running machine learning code locally on mobile devices, Google's computer vision applications can avoid the delay and power drain that arise when data is transmitted to a remote server for processing and then returned.
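The latency argument above can be sketched in a few lines. This is an illustrative back-of-the-envelope model, not Google or Movidius code, and every timing figure in it is an assumption: even if a mobile VPU is slower per inference than a data-center GPU, it can still respond sooner once network transfer time is counted.

```python
# Illustrative sketch of the on-device vs. server round-trip latency
# tradeoff described in the article. All numbers are hypothetical.

def round_trip_ms(upload_ms: float, server_infer_ms: float,
                  download_ms: float) -> float:
    """Total latency when an image is sent to a remote server for inference."""
    return upload_ms + server_infer_ms + download_ms

def on_device_ms(local_infer_ms: float) -> float:
    """Total latency when the neural network runs locally on the VPU."""
    return local_infer_ms

# Assumed figures: the remote GPU is faster per inference (15 ms vs. 60 ms),
# but mobile network transfer dominates the round trip.
remote = round_trip_ms(upload_ms=80, server_infer_ms=15, download_ms=40)
local = on_device_ms(local_infer_ms=60)

print(f"server round trip: {remote} ms, on-device: {local} ms")
```

Under these assumed numbers the on-device path finishes in 60 ms against 135 ms for the round trip, and it also avoids the radio usage that dominates power drain on mobile hardware.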
Remi El-Ouazzane, CEO of Movidius, said that embedding neural network computing directly in silicon enables extreme power efficiency.
The two companies said that they will provide further details as their collaboration progresses.