Artwork: Kathleen Fu

Are robots coming for your job?

By Ryan McCarthy

November 10, 2021

 

Good morning, fellow humans! Today, Ryan McCarthy is asking an apparently timeless question about automation. And he offers advice on what to do so the robots don’t come for your job.

But first, did someone send you The Distance? Subscribe here, and get it directly in your inbox each week. On to the main event:

Whether you’re a factory worker, a lawyer, a writer, a surgeon, or even a CEO, you’ve probably been warned that robots are coming for your job. Last year, Andrew Yang built an entire presidential campaign — and an argument for a universal basic income — on the threat of job-stealing robots. Famed Silicon Valley investor Sam Altman wrote this year that AI will soon do so much of everything that it will redefine our notion of, well, “everything.”

But, if you’re paralyzed by the threat of robots taking your job, fear not: Warnings about automation are not new. As early as 1885, The New York Times warned that machines were the “great enemy of labor” and created “the workingman’s troubles.” Keynes fretted over “technological unemployment,” as did, as a matter of fact, Einstein.

Some of the most cited warnings about robots come from a 2013 Oxford University study, which estimated that 47% of U.S. jobs are at high risk of automation in the next few decades, and a 2015 McKinsey Global Institute report that put the number at 45%. But dig a little deeper into the doomsaying, and the future is not so dystopian. Both of these analyses focus on job functions — or parts of your job, in other words — that may change because of technology; they do not suggest that robots and AI will lead to mass unemployment.

“There are very few jobs that are fully automated away,” said Brad Hershbein, a senior economist at the W.E. Upjohn Institute for Employment Research.

The truth is that technological change almost always creates more jobs than it destroys, a fact that McKinsey itself notes in its reports. This general rule applies not just to more modest innovations like the ATM — which once threatened to destroy the job of the bank teller but actually had the opposite effect — but to bigger disruptions, too. Since 1980, McKinsey estimates, the personal computer has created 15.8 million net new jobs.

Jeff Strohl, the director of research at Georgetown University’s Center on Education and the Workforce, likes to begin speeches by raising the specter of robot-driven job loss. “I always ask the audience, ‘When was the last time you saw an occupation blink out of existence right in front of your eyes?’” Strohl never gets an answer. McKinsey itself has estimated that “very few occupations — less than 5 percent — consist of activities that can be fully automated.”

What’s certain is that technology will change your job, and not always for the better — think of cab drivers morphing into on-demand gig workers, or the way some customer service workers can only be reached by first dealing with a chatbot.

These changes are also likely to happen at a very bad time. Studies have shown that automation tends to follow big financial shocks, as firms use recessions to deploy technology that hasn’t yet been adopted en masse. (Think of retailers installing more self-service checkout kiosks during the COVID crisis.)

In a 2017 paper, Hershbein and Lisa B. Kahn of Yale University found that companies most affected by the Great Recession of 2007-2009 were more likely to invest in new technology. As a result, these firms were more likely to “upskill” by demanding higher-skilled workers.


So what can you do to stay ahead of technology changing your job? If you’re a doctor or a lawyer, you likely already have professional training and recertification built into your job. But for everyone else, formal worker retraining programs — once the province of unions or careers spent with one employer — are now rare.

“We used to do a lot of training in America,” Strohl said, “but companies pushed the costs of training back on to the worker.”

If your company does offer worker retraining, there are reasons to be skeptical of its real value. In 2019, Amazon announced that it would spend $700 million to retrain roughly one-third of its U.S. workforce by 2025. This sounds impressive, but there’s little data on the effectiveness of corporate retraining programs. Worse, your company may not have your interests in mind. “From the point of view of the Amazons and the Walmarts of the world, these programs are basically a labyrinth where they can identify the workers who they want to keep and develop,” said Hershbein.

What’s clear is that the most effective retraining programs teach more than just technical skill. Research has shown that broader worker training can be more helpful than narrow, task-based approaches. This may be one reason why coding bootcamps tend to underwhelm.

For executives and white-collar workers, it’s certainly possible — and advisable — to learn new skills. But broadening your existing capabilities may be a smarter move than a wholesale career reinvention. Research from Matthew Bidwell, an associate professor of management at the Wharton School of the University of Pennsylvania who studies white-collar workers, has found that people who generalize may actually end up better off in the long run than those who go off and learn an entirely new set of skills.

To keep up with technology, Bidwell suggests finding ways to use it to leverage what you’re already good at. “If I was in marketing, then yes, I’d be trying to learn data science and take SQL training,” Bidwell said. “But I’d want to do it in a way that augments my existing skills and experiences.”

Even with all the warnings about all-powerful robots disrupting careers, sometimes the way to get ahead is shockingly simple. Bidwell is working on new research suggesting that people who move sideways in their career — that is, who take a job at the same level at their company or at another firm — are more likely to get promoted and earn pay raises.

If there’s a takeaway here, it’s that learning how to learn new things, whether they be broad or narrow skills, is arguably the most important career skill. (Let’s see a robot do that.) In fact, Hershbein suggests revamping high school curriculums to explicitly teach graduates how to train themselves, so they can stay ahead of technology.

“There’s this idea that once you go to school, you’re done with your education,” said Hershbein. “That’s no longer how it’s going to work. You’re going to be working and learning.”

Ryan McCarthy is editorial director for politics, policy & society at Vox, the former editor in chief of VICE News, and previously worked for The New York Times, Washington Post, and Reuters.

Questions We're Asking:

Could a robot do your job?

• Have you noticed more automation lately — at the checkout, at restaurants, etc.?

• Does the rise of technology scare you, in terms of job security or otherwise?

Write us, TheDistance@fundrise.com

One trick I’ve learned recently

Try to imagine what the older version of yourself would think about whatever decision you need to make, particularly around finances and family.

This would’ve helped me a few years ago, when I decided to commute into work instead of staying home and seeing my second-grade daughter’s Flag Day recital. At the time, I reasoned that doing this was key to a successful career and important to our finances. But I didn’t think about what Future Emily would want. When my kids are grown and gone, what kind of memories do I want us to share? What kind of relationships will we have? What will really matter?

Now I know. I don’t even remember what the work was that day. A few years later, that company laid me off in a downsizing. Meanwhile, my daughter was super proud of her performance in that recital, where she led the class in their rendition of the “50 Nifty United States.” She still talks about it. Each time, I feel a little pang.

But I’m not too hard on myself. My behavior is an example of something most of us do: discount the future. Social scientists also call it present bias: You don’t think much about the older version of yourself. It’s like Future You is an entirely different person from Present You.

Failure to consider Future You is why some of us don’t max out our 401(k)s and instead keep the cash to buy stuff now. Or why you might pick up smoking when you’re young — it’s just impossible to conceptualize your adult self, and how that self is destined to pay the cost of today’s decisions.

And one way to get people to save more for something that seems so far off — or to do something that seems like a pain now — is to build a connection to their future selves.

It’s like when people say kind of corny things about saving the planet “for the children.” Except YOU are “the children.”

Here’s a quote from British philosopher Derek Parfit, cited in a 2011 paper about discounting the future, that explains it all elegantly:

“[O]ur future selves are like future generations. We can affect them for the worse, and, because they do not now exist, they cannot defend themselves. Like future generations, future selves have no vote, so their interests need to be specially protected. Reconsider a boy who starts to smoke, knowing and hardly caring that this may cause him to suffer greatly fifty years later. This boy does not identify with his future self. His attitude towards this future self is in some ways like his attitude to other people.”


What else we're reading:

Hot Streaks in Your Career Don’t Happen by Accident (The Atlantic): Just about everyone has a distinct period in their life when they produce their best work, writes Derek Thompson. He explores some new research that explains why hot streaks happen and how to plan for them.

Neuroscience’s Existential Crisis (Nautilus): If you feel like having your mind blown, read this piece by neuroscientist Grigori Guitchounts on how scientists have mapped the human brain… but don’t yet understand it. Our brains, in other words, haven’t really caught up with our brains.

Is It O.K. To Be A Luddite? (The New York Times): An oldie but goodie — in this essay from 1984 (Orwellian coincidence noted), Thomas Pynchon, a novelist fixated on technological conspiracy and apocalypse, tracks the history of humanity’s fear of tech. From textile workers who were scared of the implications of the industrial revolution, to Pynchon’s own misgivings of the nuclear sort, technophobia is hardly a new phenomenon.

What you’re saying:

Last week we asked if you would’ve waited for the second marshmallow. Some answers:

Peter Ingeman: “It is hard to say how I would have behaved as a child, but I know that my present self would certainly wait for the second marshmallow. I am now wise enough to understand that in this experiment I entered with no marshmallows at all, so gaining one or two is a windfall, and gaining none does not set me back.”

Michael Thomas: “Would I have waited for the second marshmallow? No way! Why not? First, I know they are bad for me. But mostly because I don't really like marshmallows. One marshmallow has a bit of novelty value. Two marshmallows would just be gross.”

Austin Tacker: “I probably would have eaten the marshmallow when it was given to me and not waited, as a four year old.”

Issei: “Assuming that I was told that I would receive another marshmallow with 100% certainty, I’d wait… but if it didn’t come, I would of course be mad.”

Keep the emails coming!

We love to hear from you: TheDistance@fundrise.com

See you in the future!

The Distance is edited by Emily Peck and Alex Slotnick, and produced by Katie Valavanis and Brett Wilson, with help from Caitlin Daitch and Motasem Halawani. Our design director is Erin Culliney.