Strategic CIO // Executive Insights & Innovation
Commentary
1/16/2014 09:36 AM
Thomas Claburn

The Second Machine Age: Meet Your Computer Overlords

A maddeningly reasonable and readable new book by Erik Brynjolfsson and Andrew McAfee argues that while a robot may take your job, the economy will prosper.

When Jeopardy champion Ken Jennings wrote his final answer, knowing that he and fellow human contestant Brad Rutter had been bested by an IBM computer called Watson, he included a personal note: "I for one welcome our new computer overlords."

In their compelling new book The Second Machine Age, Erik Brynjolfsson, director of the MIT Center for Digital Business, and Andrew McAfee, the center's principal research scientist, cite Jennings's concession, among many other anecdotes, as evidence of the accelerating and transformative intellectual capacity of machines.

Just as the industrial age altered the market for physical labor, the computer age is remaking the market for mental labor, the authors argue.

Shortly after his defeat, Jennings offered a self-effacing and surprisingly sanguine account of his contest with Watson in a Slate article titled "My Puny Human Brain."

"Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last."

Brynjolfsson and McAfee strike a more cautious tone (Jennings's characterization of his defeat as a "happy ending" comes across as Stockholm syndrome), but they convey more or less the same message: The robots are coming, and they're going to take jobs as well as create them. But we'll manage.

"After spending time working with leading technologists and watching one bastion of human uniqueness after another fall before the inexorable onslaught of innovation, it's becoming harder and harder to have confidence that any given task will be indefinitely resistant to automation," they write.

The tl;dr version for those seeking to delay obsolescence: In the near term, humans will have an edge in careers that require creativity (e.g. creating innovative software or elegant prose), the recognition of broad patterns (e.g. fashion trends), complex forms of communication (e.g. interrogations), and tasks that depend on mobility in unpredictable environments (e.g. an electrician's work). But beyond that, don't underestimate what computers will be able to do in a few decades.

The Second Machine Age is maddeningly reasonable and readable. It's neither Luddite polemic nor libertarian techno-utopianism. For those who follow technology closely, much of the foundational history upon which the authors build their argument will be familiar, though there's value in revisiting the accomplishments of tech luminaries and businesses in a cohesive framework. For those unfamiliar with names like Rodney Brooks, Gordon Moore, and Hans Moravec, prepare to be alarmed and reassured at the same time.

In a phone interview, Brynjolfsson explained that the goal of the book is to bridge the divide between the wild optimism exhibited by technologists and the pessimism of many current economists, for the benefit of policymakers.

At the same time, McAfee stresses that The Second Machine Age isn't a cookbook for enlightened bureaucracy. "The solutions to job and wage problems are not going to come from Washington, but from the activities of entrepreneurs and innovators," he said.

Befitting a book begotten by MIT academics, The Second Machine Age is a paean to entrepreneurship, education, and growth-oriented regulation. That's evident from the first pages when the authors cite the work of anthropologist Ian Morris to argue that the surge in human social development made possible by the steam engine and the industrial era "made a mockery of all the drama of the world's earlier history," as Morris put it.

Intriguing though this hockey-stick graph may be as a reflection of the economic significance of the industrial revolution, it's dangerously reductive for those with concerns that stretch into civic, political, and cultural realms.

To their credit, Brynjolfsson and McAfee grapple with some of the thornier issues accompanying the computer age, like the growing gap between the rich and poor, the potential for a largely idle population, and the tendency of technology to create winner-takes-all markets.

In one example, they point to the way that TurboTax software has enriched its creators but endangered the jobs and incomes of tens of thousands of tax preparers. Creative destruction wrought by the march of technology becomes problematic when it shifts wealth from many to few.

"In the words of Marc Andreessen, software is eating the world," said Brynjolfsson. "You get a lot of winner take-all-markets, which is great for [the makers of TurboTax] and consumers not so much for people of average skills [who have been made redundant]."

And therein lies Silicon Valley's pathological blind spot: The celebration of entrepreneurship argues for a star-system. Like Hollywood dreams, it's a narrative about talent and success that focuses on the few winners rather than the multitudes of losers. If only we were all the geniuses we see in the mirror.

The authors acknowledge that when income is distributed according to a power law -- in which a small number reap a disproportionate share -- it not only increases income inequality but also disrupts our institutions.

Look no further than the anti-gentrification sentiment Google has confronted in the San Francisco Bay Area to see that economic arguments about the overall social benefit of tech-driven wealth creation don't resonate with everyone. And moving the masses to accept their new computer overlords is likely to mean sharing the opportunity if not sharing the wealth. If only the example set by the Bill & Melinda Gates Foundation were more widely emulated.

Perhaps the most fascinating aspect of The Second Machine Age is how the authors propose to deal with the situation. There among the pro-business economics are glimpses of socialism, or, to use a less loaded term, social concern. As they put it, "...we are going to need more novel and radical ideas -- more 'out-of-the-box' thinking -- to deal with the consequences of technological progress." Consider that a measure of the disruptive potential of technology.

At a time when Congress can barely be cajoled into agreeing on a budget, when efforts to improve healthcare divide us rather than unite us, and when antiquated legislation hamstrings innovation, calling for radical ideas seems impossibly ambitious.

The authors strike a more optimistic note. "The surprise to me is how many things we can do from a policy perspective and how much common ground there is about what the smart things to do are," said McAfee. "When we talk to business leaders with a range of backgrounds, we hear a lot of commonality on immigration, entrepreneurship, and education."

In the book, Brynjolfsson and McAfee float the idea of revisiting Nixon's Family Assistance Plan, a basic income program that would assure a minimum standard of living in an era of technologically magnified unemployment.

But they're quick to stress that their preferred approach involves providing support in conjunction with an incentive to work. Through what's known as a negative income tax -- in which those below a certain threshold receive scaled payments -- and lower labor taxes, they argue, low earners could be helped and simultaneously encouraged to work. They would also like to see consideration of a value-added tax (VAT) and Pigovian (deterrence-oriented) taxes in lieu of income taxes.
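The mechanics of a negative income tax can be sketched in a few lines. This is a minimal illustration, not the authors' proposal; the threshold and phase-out rate below are made-up figures chosen only to show how the subsidy shrinks as earnings rise, so that working always increases total income.

```python
def negative_income_tax(earned_income, threshold=20_000, phase_out_rate=0.5):
    """Illustrative negative income tax (hypothetical parameters).

    Earners below `threshold` receive a subsidy equal to
    `phase_out_rate` times their shortfall from the threshold.
    The subsidy tapers off as earnings rise, preserving the
    incentive to work that the authors emphasize.
    """
    shortfall = max(0, threshold - earned_income)
    subsidy = phase_out_rate * shortfall
    return earned_income + subsidy

# Someone earning nothing receives 10,000; someone earning 10,000
# ends up with 15,000 -- more than if they hadn't worked at all.
print(negative_income_tax(0))       # 10000.0
print(negative_income_tax(10_000))  # 15000.0
print(negative_income_tax(25_000))  # 25000.0 (above threshold, no subsidy)
```

The key design point, and the contrast with a flat basic income, is that every additional dollar earned still raises total income, because the subsidy phases out at a rate below 100%.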

The authors also call for more support of crowdsourcing, to encourage the development of services like Airbnb, Lyft, and TaskRabbit. And in the spirit of Google, circa 2006, they throw out a few non-endorsed ideas to see what sticks: a national mutual fund for distributing capital ownership to all citizens as a hedge against wealth concentration; taxes, contests, and other incentives to make machines augment humans rather than replace them; using non-profits to perform socially beneficial tasks, as determined by democratic vote; human-employment offsets, a reward for companies that employ enough people; vouchers for basic necessities; and a revival of Depression-era 'workfare' programs for the public benefit.

The principal shortcoming of The Second Machine Age is that it's a suspended narrative. Like the recently released The Hobbit: The Desolation of Smaug, it just ends, without a clear conclusion. While it may be scientifically appropriate to withhold judgment in the absence of data -- how else, really, can a non-fiction book about the future end? -- it's emotionally unsatisfying. We are talking, after all, about the end of the world as we know it. What happens to the humans? Do they survive the robot revolution? And does that crazy scientist ever get together with that free-spirited android? Inquiring minds want to know.

But in the absence of closure, The Second Machine Age leaves us with the next best thing: questions about the kind of future we want. It's interactive media of a sort. "Technology is not destiny," Brynjolfsson and McAfee conclude. "We shape our destiny."

Let's get to work on that, while we still can.

Thomas Claburn is editor-at-large for InformationWeek. He has been writing about business and technology since 1996 for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. He's the author of a science fiction novel, Reflecting Fires, and his mobile game Blocfall Free is available for iOS, Android, and Kindle Fire.


Comments
ErnieSchell, User Rank: Strategist
1/21/2014 | 8:28:13 PM
Even high-level jobs can be automated
I have worked as a systems consultant for multichannel marketers for 25 years, and take pride in my ability to do a "strategic" Needs Analysis and RFP - but I have thought long and hard about how well the lion's share of what I do could be automated and have concluded that it might even improve the process. If you can program a couple of hundred business rules into the bot, and all the RFPs I've done over the years (240+), I'm sure it could do a reasonable job replacing me... in the year 2025 or so, which is not that far away. On the other hand, a human would have to invest the proverbial 10K hours (i.e., five years) to achieve the same thing, probably with less bankable results. Yes, we will all bow soon to our computer overlords, only they will be our hand-maidens instead. But let's make sure they are not like the Sorcerer's Apprentice from Fantasia (and drown us all)!
TerryB, User Rank: Ninja
1/17/2014 | 1:13:35 PM
Re: Seems like I remember...
Wouldn't have mattered; Watson got the final question right.

But you bring up an excellent point:  How would Watson do playing Texas Hold'em? I'd love to see that experiment. Anyone who ever watched World Series of Poker knows having the best hand (in probability theory before betting complete or after conclusion of hand) rarely wins.

But what a Poker Face Watson would have, no tells. And no hoodies or sunglasses either...
Mohamed S. Ali, User Rank: Apprentice
1/17/2014 | 12:33:50 PM
Who Owns the Future?
Another good read on the subject is Jaron Lanier's "Who Owns the Future?". His thoughts are a bit more radical, but perhaps less socialist. He cites the example of Instagram (12 employees when it sold to FB), which represents photography to consumers today, in much the same way that Kodak (150K+ employees at its peak) did a generation ago. The contention he makes is that Instagram is worth $1B to Facebook not because of their 12 employees, but because of the content that is being generated by their millions of users. The same can apply to Facebook's valuation itself: it's the users that make Facebook valuable, not the software, and content generators on these platforms are not being compensated for the value that they are providing -- effectively, the value provided by millions and millions of these users is being taken off their books and onto the books of "siren servers" (Google, Facebook etc.). If this value is transferred back to the generators of content, then that minimizes the winner-take-all model that's currently at play.
Thomas Claburn, User Rank: Author
1/17/2014 | 12:04:10 PM
Re: Silicon Valley's pathological blind spot...
>"on-board human computing"--nice (though a bit unsettling) turn of phrase.

When the term "computer" first came into use in the late Renaissance, it was a job description rather than an object: It referred to people who did mathematical computations.
RobPreston, User Rank: Author
1/17/2014 | 9:22:08 AM
Re: Silicon Valley's pathological blind spot...
"on-board human computing"--nice (though a bit unsettling) turn of phrase.
Kristin Burnham, User Rank: Author
1/17/2014 | 8:46:17 AM
Re: Seems like I remember...
That's a good point. Will robots learn human emotion? And how does that change things?
cbabcock, User Rank: Strategist
1/16/2014 | 6:29:13 PM
A debate is what's needed
A learned and astute review, Tom Claburn. In the debate between technologists and economists, or technologists and, more generally, the humanists, the technologists so far are winning. Even the presumption that there's a debate going on is suspect. Western civilization in its many forms has always been a technology-driven society, possibly a little more so with each passing year. A technology judged to be superior in one part of its culture is swiftly adopted in the other parts, with little debate on its long-term effect. It's led to great advances -- genome codification, space travel, nuclear energy, a well-connected society. And it's not too hard to imagine some of the drawbacks: nuclear bombs, a collapsing distance between domestic life and the next battlefront, communication with more vituperation than understanding. When the advances start taking over -- dominating the body politic with unintended consequences -- that would be a good time for a genuine debate to get underway, and that time may not be far off.
ChrisMurphy, User Rank: Author
1/16/2014 | 5:36:43 PM
Re: Silicon Valley's pathological blind spot...
I wouldn't want a robot nurse for acute care, but what if I could have a robot that let me stay living independently in my home for many more years, by helping me get up, collecting a few vital signs so my kids weren't perpetually panicked, and the like? I could see some sort of home healthcare robot in the not so distant future.  
Thomas Claburn, User Rank: Author
1/16/2014 | 4:47:39 PM
Re: Silicon Valley's pathological blind spot...
>What will all the truck drivers do for employment when driverless trucks carry cargo around the country?

I think driverless vehicles will take longer than expected because there's no sign of a good way to handle the unexpected. Programming is great for getting vehicles to follow a known path. It's not so great when there's been an accident and three lanes that were supposed to be there no longer are, or when a flag-waving maintenance worker tries to warn drivers away from a chemical spill. So then you have trucks phoning home to ask for remote guidance, and that requires infrastructure and people monitoring, and very soon, on-board human computing looks cost effective again.
Lorna Garey, User Rank: Author
1/16/2014 | 4:29:20 PM
The Needs Side of the Equation
One obvious answer seems to be to require less income. That seems feasible in an age where we can have 3D printers to make basic items, and if we generally work at home and don't need so many suits and dress shoes, and if we have self-driving fleets and perhaps European-style rail so we don't need so many cars. Computer-aided advances in food production could let us do local agriculture and reduce costs.

I recently heard my twenty-something niece use the term "bourgie" in a negative tone about a club where there was a lot of conspicuous consumption. If you don't know it, Google it. Maybe the rise of AI will coincide with human life being simpler and less focused on McMansions, more on community.