Google wants to hear from developers who can create mobile apps that rely on precise awareness of users' surroundings.
On Thursday, the company's Advanced Technology and Projects (ATAP) group, not to be confused with its X Lab, invited developers to submit proposals for applications that take advantage of Project Tango, an experimental Android phone. The phone is built with custom hardware and software that track the device's motion in real time and generate a 3D model of its surroundings.
Project Tango hardware can take a quarter million 3D measurements every second. Developers access the resulting position, orientation, and depth data through a software development kit (SDK) with APIs for Android apps written in Java or C/C++, or through the Unity game engine (which can also build for Android and other platforms).
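The SDK itself targets Java, C/C++, and Unity, but the basic shape of the data loop it exposes -- poll the device for a fused position-and-orientation estimate, then build on that trajectory -- can be sketched in a language-agnostic way. Every name below is hypothetical, a mock standing in for the real Tango API:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # The two streams Tango-style motion-tracking APIs expose alongside
    # depth data: 3D position (meters) and an orientation quaternion.
    position: tuple      # (x, y, z)
    orientation: tuple   # (x, y, z, w)

class FakeTangoService:
    """Hypothetical stand-in for the device service; NOT the real SDK."""
    def __init__(self):
        self._t = 0.0

    def read_pose(self) -> Pose:
        # A real device would fuse camera, depth, and inertial readings
        # here; this mock just walks forward along the x axis.
        self._t += 0.1
        return Pose((self._t, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))

def track(service, steps):
    """Collect a short trajectory, as a game or mapping app might."""
    return [service.read_pose().position for _ in range(steps)]

path = track(FakeTangoService(), 3)
```

The point of the sketch is the contract, not the implementation: an app consumes a steady stream of poses and treats them as ground truth for where the phone is in space.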
Current smartphones can manage limited tracking of position and orientation, but lack the full range of sensors and precision to run the kinds of applications Google envisions. More significantly, they aren't designed to place the device within a 3D representation of the local environment.
Project Tango phones include a vision processing system, a depth sensor, and a motion tracking camera, along with the gyroscopes and orientation sensors found in other smartphones. They can be thought of as something like a mobile version of Microsoft's Kinect system.
The gaming applications are obvious, which explains why the SDK has been designed to work with Unity, one of the leading professional game development frameworks. With Project Tango phones, developers could create apps that, for example, track player movements accurately enough to determine whether a virtual laser blast from one player hits another player or an obstacle.
But the potential applications go far beyond entertainment. Google suggests its devices will be able to: capture the dimensions of a person's home, to help simplify furniture shopping; enable directions that extend into buildings rather than stopping at street-level doors; help the visually impaired navigate; and locate products precisely on a shelf in a large store.
What's more, the ability to easily integrate depth information into images has the potential to simplify image editing. A camera app with access to image depth data should be able to identify objects in photos and separate them for manipulation far more easily than if it had to rely on edge detection and color data algorithms. Such an app, in theory, could support one-touch removal of objects from an image, rather than requiring the manual tracing of object outlines for excision.
Given Google's longstanding interest in indoor mapping and its recently published patent application for crowdsourcing indoor locations, Project Tango suggests Google aims to take Street View, or something like it, beyond the street and into businesses and homes. Indoor mapping data, incidentally, should prove particularly useful for Google's nascent robot business -- reliable navigation in confined spaces, with obstacles that may move, remains a challenge for robots.
It remains to be seen how Google, not to mention legal authorities, will extend privacy protections to all the indoor data that gets generated and squirreled away on Google servers.
Google says it intends to distribute 200 Project Tango devices by March 14.