Over the next few years, making mobile services more convenient will depend on better use of context in delivering mobile experiences. Your business will need to predict what your customers want when they launch a mobile application or website. Delta Airlines, for example, knows how close a passenger is to departure time and delivers relevant content, such as a frequent flier's real-time status on the upgrade list for her next flight. Today, that kind of context is uncommon. Tomorrow, it will be table stakes.
Application developers writing mobile apps will have to start thinking about "mobile context" -- which we define as everything your customer has told you and all you can understand about what the customer is currently experiencing. Context is just one of the big, new challenges that application developers will face. Here are several more of the most important challenges we see.
1. Mobile Context.
Your customer's mobile context consists of:
Preferences: The history and personal decisions the customer has shared with you or with social networks.
Situation: The current location, of course, but other relevant factors could include altitude, environmental conditions and even the speed at which the customer is traveling.
Attitude: The feelings or emotions implied by the customer's actions and logistics.
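One way to make these three components concrete is as a simple data structure. This is only an illustrative sketch; the class and field names below are assumptions, not an established schema.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Situation:
    # Where the customer is and what they are physically experiencing.
    latitude: float
    longitude: float
    altitude_m: Optional[float] = None
    speed_kmh: Optional[float] = None

@dataclass
class MobileContext:
    # Preferences: history and decisions the customer has shared.
    preferences: Dict = field(default_factory=dict)
    # Situation: current location and related physical factors.
    situation: Optional[Situation] = None
    # Attitude: a feeling implied by recent actions, e.g. "hurried".
    attitude: Optional[str] = None

# Example: a gold-status flier walking briskly through the airport.
ctx = MobileContext(
    preferences={"seat": "aisle", "status": "gold"},
    situation=Situation(latitude=33.64, longitude=-84.43, speed_kmh=4.5),
    attitude="hurried",
)
```

Keeping the three components separate makes it obvious which parts of the context are slow-moving (preferences) and which must be refreshed constantly (situation).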
Delivering a good contextual experience will require aggregating information from many sources. It could come from the devices customers are carrying, the local context of devices and sensors around them (e.g. a geofence that knows which airport gate they're at), an extended network of things they care about (e.g. the maintenance status of the incoming airplane they are about to take for their next flight, and the probability it will leave on time) and the historical context of their preferences. Gathering this data is a major challenge because it will be stored on multiple systems of record to which your app will need to connect.
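The aggregation itself can be sketched as a merge over partial records from each system of record. The fetcher functions below are hypothetical stand-ins for real backend calls, and the conflict rule (later sources win) is one assumption among several possible policies.

```python
from typing import Callable, Dict, List

# Hypothetical systems of record; each returns a partial view of the
# customer's context. In production these would be network calls.
def from_history(customer_id: str) -> Dict:
    return {"seat_preference": "aisle"}

def from_device(customer_id: str) -> Dict:
    return {"location": "concourse B", "speed_kmh": 4.0}

def from_geofence(customer_id: str) -> Dict:
    return {"airport": "ATL", "gate": "B12"}

def aggregate_context(customer_id: str,
                      sources: List[Callable[[str], Dict]]) -> Dict:
    """Merge partial context records; later sources win on conflicts."""
    context: Dict = {}
    for fetch in sources:
        try:
            context.update(fetch(customer_id))
        except Exception:
            # One source being down shouldn't sink the whole experience.
            continue
    return context

merged = aggregate_context("cust-42", [from_history, from_device, from_geofence])
```

Swallowing a failed source is deliberate: a contextual experience should degrade gracefully rather than error out when one backend is unreachable.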
2. Device Proliferation.
Another challenge facing mobile developers is device proliferation. It might seem like today's mobile app development process is pretty well defined: Build your app, make sure it looks pretty on a 4-inch smartphone and a 10-inch tablet, then submit it to an app store. It's not quite that easy now, and it'll be much tougher in the near future. A wide range of new device sizes and changes to the nature of the apps themselves will increase the need for flexibility, especially on the client. We're already seeing 5-inch phablets, 7-inch tablets, and Windows 8 devices of 20 inches or more. Collectively, these new devices will significantly expand the potential for collecting contextual data about your customers, and they will demand layouts and interactions that flex across that entire range.
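One small illustration of the flexibility problem: even the crude step of picking a layout class from a device's screen size already spans four categories. The breakpoints below are illustrative assumptions, not a platform standard.

```python
def layout_for(diagonal_inches: float) -> str:
    """Pick a layout class from the device's diagonal screen size.

    Breakpoints are illustrative; real apps would also weigh
    resolution, orientation and input method.
    """
    if diagonal_inches < 5:
        return "phone"
    if diagonal_inches < 7:
        return "phablet"
    if diagonal_inches < 11:
        return "tablet"
    return "large-display"
```

A 4-inch smartphone, a 5-inch phablet, a 10-inch tablet and a 20-inch Windows 8 device each land in a different bucket, which is exactly why a two-layout strategy no longer holds up.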
3. Voice, Prioritized Over Touch.
Mobile developers are clamoring for API access to Apple's Siri and Google Now. There are a lot of scenarios where you would want to build voice input into your app today. For a running or fitness app, the phone is likely to be strapped to a sweaty arm, and looking at the screen while running can be a fast track into a lamp post. The same is true while driving. If you're hustling through an airport with luggage to catch a flight, voice beats touch. Modern applications will let people use their devices while keeping their eyes and hands off them.
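Even once a platform hands you a transcript, your app still has to map it to an action. A minimal sketch of that last step, assuming a hypothetical fitness app whose intent names and keyword patterns are invented for illustration (a real app would lean on a platform speech service rather than hand-rolled rules):

```python
import re

# Illustrative voice-intent patterns for a hypothetical fitness app.
INTENTS = [
    (re.compile(r"\b(pause|stop)\b"), "pause_workout"),
    (re.compile(r"\b(resume|continue)\b"), "resume_workout"),
    (re.compile(r"\bhow far\b"), "report_distance"),
]

def interpret(transcript: str) -> str:
    """Map a speech-to-text transcript to an app intent."""
    text = transcript.lower()
    for pattern, intent in INTENTS:
        if pattern.search(text):
            return intent
    return "unknown"
```

The point of the sketch is the division of labor: speech recognition belongs to the platform, but deciding what a phrase means inside your app is a design problem developers will own.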
4. Heads-Up Interfaces.
Expect to see heads-up displays such as Google Glass go mainstream in the next five years as Moore's law pushes processors to the point where such gadgets can be made powerful, lightweight and perhaps even stylish. Augmented-reality apps that don't work well on a phone or tablet could be transformative when ported to a device like Google Glass. A compelling example would be an app that provides real-time information about the people you are talking to but whose names you've forgotten.
But heads-up displays will create a whole new slate of problems for developers. We'll have to adapt to peripheral cues such as reminders and alerts that don't block the user's vision. We'll also need to integrate tactile and aural feedback, such as voice prompts and vibration alerts that signal users when they need to take action.
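Choosing among those channels is itself a design decision. A minimal sketch of one possible policy, where the channel names and rules are assumptions for illustration rather than any platform's API:

```python
from typing import List

def choose_cues(urgency: str, eyes_busy: bool) -> List[str]:
    """Pick non-blocking feedback channels for a heads-up display.

    Policy (illustrative): never block central vision; escalate to
    haptic and aural channels as urgency or visual load rises.
    """
    cues = ["peripheral_glow"]  # always safe: stays out of central vision
    if eyes_busy or urgency == "high":
        cues.append("haptic_pulse")
    if urgency == "high":
        cues.append("voice_prompt")
    return cues
```

The invariant worth noticing is that nothing in the list ever occupies the center of the wearer's field of view; urgency only adds tactile and aural layers on top.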