Whether or not you were aware of it at the time, the world changed in 2011 when Ken Jennings lost to IBM’s Watson on Jeopardy. As a part humorous, part chilling conclusion to the historic show, Jennings wrote the following beneath his final answer:
I, for one, welcome our new computer overlords.
Jennings recognized something investigated in Race Against the Machine, a book by Erik Brynjolfsson and Andrew P. McAfee of the Massachusetts Institute of Technology. The book argues that “we’re living in an era where computers dominate every aspect of our life, and unless we catch up and start working with the machine rather than against it, the new economic machine is going to spit us out.”
We can draw connections to the patterns we’re seeing in income disparity and in job displacement driven by technology and automation. The theory in Race Against the Machine also helps explain humans’ role in the new economy. At only 80 pages, it is a must-read for anyone still on the fence about whether to pursue a liberal arts degree or one in engineering.
Most people with a liberal arts degree are struggling to find jobs, but tech companies are scrambling to find talent. Hiring smart people is still one of the most difficult problems in a startup. The money and power have already shifted, and will continue to shift disproportionately, to people who either (a) know how to program or (b) figure out how to work well with those who already do. There’s an enormous ecosystem of value-creation potential out there if you can contribute to it.
Is there anyone who is immune to technological unemployment?
In 2004, in The New Division of Labor, Frank Levy and Richard Murnane argued that truck drivers are nearly immune to technological unemployment:
The truck driver is processing a constant stream of [visual, aural, and tactile] information from his environment… To program this behavior we could begin with a video camera and other sensors to capture the sensory input. But executing a left turn against oncoming traffic involves so many factors that it is hard to imagine discovering the set of rules that can replicate a driver’s behavior… Articulating [human] knowledge and embedding it in software for all but highly structured situations are at present enormously difficult tasks… Computers cannot easily substitute for humans in [jobs like truck driving].
At one point, it was easy to write off the idea of computers as drivers, and the 2004 DARPA Grand Challenge supported that skepticism. The “winning” vehicle took hours to cover just 8 miles of the 150-mile course, and then stopped working.
Then something surprising happened less than six years later, not decades or centuries later: Google modified a Toyota Prius to be fully autonomous. Today, those Priuses have driven hundreds of thousands of miles on American roads with no human guidance. Only one accident has ever been reported, when a human driver rear-ended the Google-modified vehicle.
Why did this happen so quickly? Six years is all it took to make the seemingly quantum leap to fully autonomous driving that Levy and Murnane had called so computationally difficult to solve. It’s not just that computers are getting cheaper and faster. Martin Grötschel showed that while processors became about 1,000 times faster over the span he studied, in line with Moore’s Law, algorithms for the same benchmark problem became 43,000 times faster.
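It’s worth noting that the two gains compound: hardware and algorithmic improvements multiply rather than add. A quick back-of-the-envelope check, using Grötschel’s rounded figures for that one benchmark problem:

```python
# Grötschel's rounded figures for one benchmark optimization problem:
hardware_speedup = 1_000    # faster processors, per Moore's Law
algorithm_speedup = 43_000  # better algorithms over the same span

# The gains multiply, not add.
combined = hardware_speedup * algorithm_speedup
print(f"{combined:,}x")  # prints "43,000,000x"
```

A forty-three-million-fold speedup in fifteen years is the kind of curve that turns “hard to imagine” into “shipping product” in six.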
This sweeping change isn’t limited to futuristic gas-sippers from Mountain View. On the legal side, Blackstone Discovery analyzed 1.5 million documents for less than $100,000. The one human assisting the machine did the work of 500 lawyers, and the machine-human pair did a much better job: lawyers have been shown to achieve only 60% accuracy in this mundane, headache-inducing work.
In China, Foxconn already has 10,000 robots and expects to buy 300,000 more next year. Over the next three years the company plans to purchase 1,000,000 robots to replace a portion of its existing Chinese workforce.
Even in sales, companies are turning to software instead of people to close deals. Since June 2009, when the recession ended, corporate spending on software is up 26%, but payrolls have remained flat. When I ordered tickets to Florida, I bought them on Hipmunk without any interaction with a human. I took a train whose ticket I ordered through a computer, and when I got to the airport I printed my tickets at a kiosk. I was even lucky enough to proceed through security with minimal interaction with the TSA.
Very few professions are resistant to automation, and those that are tend to involve physical coordination and sensory perception. This is Moravec’s Paradox: low-level sensorimotor skills turn out to require far more computation than high-level reasoning.
How much theory is relevant?
Is Race Against the Machine arguing that computer science is what everyone should be studying now? Not necessarily. The book is optimistic but points out that the educational system is doing a terrible job of teaching people how to be creative and how to work with machines. It’s important to be able to work creatively with technology, as Steve Jobs once said:
When I went to Pixar, I became aware of a great divide. Tech companies don’t understand creativity. They don’t appreciate intuitive thinking, like the ability for an A&R guy at a music label to listen to a hundred artists and have a feeling for which five might be successful.
Granted, you’re not a computer scientist just because you can build web applications. Maybe you’re just a web app developer, but I think that’s good enough. A web developer today can get an entry-level job in California with $70,000 or more in salary, plus benefits, equity, significant learning experience, and plenty of room for career mobility. I was 20 years old in 2007 when I was offered $70,000/year to join a company as the first employee.
Can you get away with being a web application developer and skip accreditation altogether? Computer science curricula are still heavy on theory, but much of the theory and lower-level material taught in CS is not necessary for a startup — when was the last time you needed to know how multiplexers work at the gate level? Everything you learn sits somewhere on a totem pole of tech abstraction, so the current debate seems to be over how low on that pole you should go to learn what’s applicable to a tech role.
Creativity and intuition
What can’t computers do? Intuition and creativity. That’s where humans can help. We will never win the race by running faster than the machine, but we can still help it with the skills it can’t yet replicate. When you think about it, it’s a match made in heaven: the power of the machine harnessed with the creativity of the individual. Undoubtedly there is no limit to the markets that remain to be captured.
I realize there’s some irony here: our willingness to work with the machines is also what drives automation, and with it potential unemployment for those who don’t adapt. One Redditor asked whether he was a scumbag because he automated his work and ended up with 95% of the entire bonus pool, since everyone else performed poorly by comparison. There’s nothing to stop the other employees from learning Ruby or Python and balancing out the bonus pool again: no CS degree required, just creativity and cooperation with a machine.
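As a minimal sketch of the kind of office automation the Redditor described (the merging task, file names, and "amount" column here are all made up for illustration), a dozen lines of Python can replace an afternoon of hand-totaling spreadsheet exports:

```python
import csv
import io

# Hypothetical daily reports someone used to total up by hand.
# The file names and the "amount" column are assumptions for illustration.
reports = {
    "monday.csv":  "date,amount\n2011-06-01,120.50\n2011-06-01,80.00\n",
    "tuesday.csv": "date,amount\n2011-06-02,200.25\n",
}

total = 0.0
for name, text in reports.items():
    for row in csv.DictReader(io.StringIO(text)):
        total += float(row["amount"])

print(f"Merged {len(reports)} reports; total = {total:.2f}")
```

Point the same loop at real files with `open()` instead of `io.StringIO` and the repetitive part of the job disappears; that is the whole trick.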
Fortunately, Race Against the Machine isn’t a grim outlook on our jobs or standard of living if we can race alongside the machine. In 1800, 90% of all Americans worked in agriculture or farming; by 1900, 41% did, and by 2000, only 2%. We found more jobs and higher standards of living in the manufacturing and service industries that emerged. Coming out of the Industrial Revolution, we saw many more jobs as a result, not fewer.
I don’t think we’re heading towards the “end of work” as some economists would suggest. We’re experiencing the natural and healthy fluctuations of a market that is deeply starved for technical skills, or at least people who can use their creativity and intuition in unison with technology.