Anyone interested in buying Google's forthcoming computerized eyewear can post or tweet in hope of receiving an invitation.
Google has launched a website to allow anyone to apply for an invitation to join the company's Glass Explorer program, an opportunity previously offered only to developers attending last summer's Google I/O developer conference.
The Glass Explorer program provides participants with the option to buy Google Glass Explorer Edition, the company's forthcoming Internet-connected eyeglasses, for $1,500 plus tax.
A select group of developers in the Glass Explorer program recently had the opportunity to participate in Glass Foundry events in New York and San Francisco, at which details of the Project Glass Mirror API were revealed under a strict non-disclosure agreement. The API provides a way for third-party applications to exchange data with the cloud service Google is developing to communicate with its networked glasses.
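The Mirror API's actual details were under NDA at the time, so the endpoint and payload shape below are illustrative assumptions only, not Google's published interface. A cloud-to-device API of this kind might let an application push a "card" of content to the glasses roughly like this hypothetical sketch:

```python
import json

# Hypothetical sketch: the Mirror API was under NDA when this was written,
# so the endpoint URL and field names here are illustrative assumptions.
MIRROR_ENDPOINT = "https://www.googleapis.com/mirror/v1/timeline"  # assumed


def build_timeline_card(text, oauth_token):
    """Assemble the pieces of an HTTP POST a third-party app might send
    to Google's cloud service to display a text card on Glass."""
    headers = {
        "Authorization": "Bearer " + oauth_token,  # assumed OAuth 2.0 auth
        "Content-Type": "application/json",
    }
    body = json.dumps({"text": text})  # assumed card payload shape
    return MIRROR_ENDPOINT, headers, body


# Example: prepare (but do not send) a card push.
url, headers, body = build_timeline_card("Meeting moved to 3pm", "ya29.token")
```

The notable design point is that third-party code talks to Google's cloud service, not to the glasses directly; the service relays content to the device.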
Google last summer staged an impressive demonstration that involved skydivers wearing its glasses to transmit a live video feed of their jump. But it has not previously released many details about how Project Glass might be useful to the average person.
So it has put this question to would-be Glass Explorers, inviting them to publish posts of 50 words or fewer on Google+ or Twitter explaining what they would do if they had Glass, starting with the hashtag #ifihadglass. Applicants, who must be 18 or older and live in the U.S., can include up to five pictures and a video of 15 seconds or less.
To inspire potential applicants, Google has posted a new video and a webpage demonstrating some of the capabilities of Google Glass. Assuming the video represents captured imagery rather than post-production visual effects, Glass has been designed to listen for a spoken prompt, "Okay Glass," and then present a menu of options -- take a picture, record a video, hang out with, and get directions to -- that represent valid spoken commands.
Those commands invite further refinement using Google's search query auto-completion technology. For example, saying, "Okay, Glass, hang out with ..." shows a list of the user's Google+ contacts on the display screen to indicate names that can be used to complete the command (to start a Google+ hangout).
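The completion behavior described above amounts to a prefix match over the user's contact list. A minimal sketch of that idea follows; the contact names are placeholders, not anything shown in Google's video:

```python
def complete_command(spoken_prefix, contacts):
    """Return the contact names matching what the user has said so far,
    mimicking the auto-completion shown in the Glass demo video."""
    prefix = spoken_prefix.strip().lower()
    if not prefix:
        # Nothing said yet after "hang out with ...": offer every contact.
        return list(contacts)
    return [name for name in contacts if name.lower().startswith(prefix)]


# Placeholder contact list for illustration.
contacts = ["Alice Park", "Alan Wu", "Bea Chen"]
complete_command("al", contacts)  # matches "Alice Park" and "Alan Wu"
```

In practice Google's implementation would rank suggestions with its search auto-completion technology rather than a flat prefix scan, but the user-facing effect is the same: partial speech narrows a visible list of valid completions.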
The video also shows how Glass can be used to submit spoken queries to Google search. "Okay Glass, Google photos of tiger heads," for example, returns a series of images of tigers' heads from Google Image Search. Glass can also access Google Translate to convert spoken words into text in another language that's displayed on the wearer's screen. The video also confirms that Glass includes support for Google Now, the company's predictive assistant technology. And it can be used to send speech-to-text messages.
Google will be accepting 50-word social media applications until February 27, at which point it will send Glass Explorer program invitations to an undisclosed subset of applicants. Those receiving invitations have the option, but not the obligation, to purchase Google Glass Explorer Edition. Invitees will be able to write code for Glass using the Mirror API, just like the developers who previously signed up for the Glass Explorer program.
Invitees who opt to buy will have to travel to special Glass distribution events to be held in New York, Los Angeles or San Francisco to pick up their glasses. Google said it will notify those accepted in mid-March. The company has not committed to a specific date to make Project Glass publicly available, though it has said it plans to deliver Explorer Edition hardware in "early 2013."