If we don't control the technology we depend on, someone else will -- and we might not like the consequences.
When people fail to control their actions, the law might step in. When people fail to control their technology, code might intercede.
It's happening already. Google is well on its way to developing self-driving cars. No doubt automated cars will save lives, fuel, and time because, let's face it, we're terrible drivers on the whole. Most of the time you might be a very good driver, but statistically speaking, your average lifetime risk of dying in a car accident comes to something like 1 in 84. Even if you drive flawlessly, someone else won't, and you or someone you know could suffer for it. Expect Google's cars to do better, if we're willing to surrender control.
It looks as if we will. Technology has become so complicated and powerful that many people prefer ease of use or the promise of security, real or not, over control. Apple has won a huge following by limiting control of its mobile devices for the sake of convenience and consumer protection. It might be your iPhone, but Apple decides what software you can run on it.
(Developers in good standing can, of course, run anything they want on their own iPhones, but they cannot distribute that software through Apple's channels without approval; everyone else cannot run unapproved software at all, unless they go rogue and jailbreak their phones.)
Often, technological control is exercised to protect us from ourselves: Many websites use programming code to coerce users into selecting strong passwords, because we can't be trusted to choose wisely. But technological control can also be a threat to business interests: Google last year removed AdBlock Plus from Google Play, despite the wishes of Android users.
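That kind of code-enforced password policy can be sketched in a few lines. This is a generic illustration of the technique, not any particular site's rules; the length threshold and required character classes are assumptions for the example:

```python
import re

def is_strong(password: str, min_length: int = 8) -> bool:
    """Return True if the password meets a basic strength policy:
    a minimum length, plus at least one lowercase letter, one
    uppercase letter, one digit, and one symbol."""
    if len(password) < min_length:
        return False
    required_patterns = [
        r"[a-z]",         # lowercase letter
        r"[A-Z]",         # uppercase letter
        r"[0-9]",         # digit
        r"[^a-zA-Z0-9]",  # symbol (anything non-alphanumeric)
    ]
    return all(re.search(p, password) for p in required_patterns)

print(is_strong("password"))       # False: no uppercase, digit, or symbol
print(is_strong("C0de=Law!2014"))  # True: satisfies all four checks
```

The point is that the check runs whether the user likes it or not; the site's code, not the user's judgment, decides what counts as an acceptable password.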
How much control should we surrender? Do we need Google, for example, to disable our phones for us? A recent Google patent application, "System and Method for Controlling Mobile Device Operation," contemplates that scenario. It describes research to help in "correcting occasional human error" when phones are left in a state that's not situationally appropriate. The patent application describes the possibility of "controlling and/or recommending changes to mobile devices located in opera houses, movie theatres, or other similar locales in which silence is encouraged (e.g., placing the mobile devices in mute or vibrate to minimize distractions)."
Code-driven manners correction could be a godsend -- or an intrusion, if implemented as an enforced requirement rather than as a request. We'd welcome the muting of phone-happy moviegoers, but we'd hate to miss a call from a family member in the emergency room. Crafting an acceptable use policy for technology isn't simple, as a matter of code or law.
The European Commission is considering whether to require cars to be fitted with speed-limiting devices. Would you welcome a smart car (or should it be called a cautious car?) that applies the brakes at just over 115 kph (71 mph)? What if your insurance company required it? Do we drive our machines or do they drive us?
More and more, it will probably be the latter, except where guns are concerned. Do as you will with your Glock. The Second Amendment guarantees the right to bear arms in the US. What would a right to unfettered computing be like? It would be an open-source world.
Reality falls short of that. We're moving from a time when reading had a language requirement to a time when reading has both a language and a device requirement. Sorry, your iBooks purchase won't open on your Kindle. Where technology is concerned, code is law. Devices have no choice but to obey, unless prevented from doing so by errors. People tend to accept the limits of their devices.
Hackers are the exception. That's why hackers -- skilled programmers, for the most part -- are regarded with suspicion. They can rewrite the law of code to their own advantage, bypassing barriers that would block the less technically savvy and creating new avenues of action and opportunity. Frankly, we should all aspire to be hackers, in the best sense of the word.
Of course, code isn't really law. It functions as a kind of law, but within the operating system rather than the legal system. Code might be an attempt to implement the law or to enforce contractual rules. But what the law allows and what code allows might not be the same things.
To complicate matters, laws are seldom crafted with the care of code. When they govern technology, they tend to fall short, to be overbroad, to be so vague as to criminalize everything, or to empower zealous prosecutors too much. The Computer Fraud and Abuse Act comes to mind here.
The ability and the right to control the technology that already governs us matters more than ever because our devices see so much more of our lives than they once did. When the NSA is able to track people using Google advertising cookies, when dozens of ad companies know more about us than our friends do, it's clear that we've lost control.
If we're going to turn control over to our machines, we need to make sure we have a say in crafting the terms of our surrender. But really we should strive to keep as much control as we can manage, as long as we're not endangering others.
Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ...