Social Media Put Us All in Our Own Truman Shows, Threatening Our Democracy (Guest Column)

An image of the Jan. 6 Capitol riots with an inset of 'The Social Dilemma' director Jeff Orlowski. (Getty Images)

Big Social's business is keeping people hooked on its platforms, and its algorithms feed us content that provokes fear and outrage, writes Netflix's 'Social Dilemma' filmmaker.

On Jan. 6, rioters wielding bats and Confederate flags stormed the U.S. Capitol yelling, “The steal is real, the steal is real. They’re not operating by our f—ing laws. This is real. And theirs is fake.” People who saw themselves as warriors righting a rigged system called for “the people who stole this election from us … hanging from a rope out here for treason” and scrawled “murder the media” on one of the Capitol’s entrance doors. Later, our sitting president called them “very special” and some lawmakers said they were “patriots.”

How did we get here?

For the past four years, in the making of our documentary The Social Dilemma, my team and I have been studying how social media and search companies are impacting society. The tech insiders we interviewed revealed that social media has been distorting reality for more than a decade. Their business is keeping people hooked on their platforms, and their algorithms have learned that the best way to do that is by feeding us content that provokes fear and outrage. This doesn’t just change what we see and buy; it changes what we think, who we are and how we act.

In the 1998 movie The Truman Show, Jim Carrey plays the unwitting star of his own reality TV show. He lives in ignorance of the artificial universe — from the fake morning paper to the actress he “marries” — that has been fed to him in a literal bubble since he was born. He sees himself as a free agent but, in reality, his whole world is counterfeit and controlled.

The 3.6 billion of us on social media today are all Trumans. While we think these platforms are connecting us to the world, they’re actually separating us from reality. Each of us lives and learns inside a personalized filter bubble that changes — and radicalizes — how we function in society.

The consequences can be catastrophic: Fueled by viral false information, a 28-year-old volunteer firefighter and father holds a pizza restaurant at gunpoint to save imaginary children from being molested in an imaginary basement. Suburbanite Brits burn dozens of 5G towers because they’re falsely convinced the technology is spreading coronavirus. A Midwestern pharmacist attempts to destroy 550 doses of life-saving vaccine.

As we learned from the mastermind and puppeteer of Truman’s curated life, “we accept the reality of the world with which we are presented. It’s as simple as that.”

But we’ve seen and can no longer ignore the results of algorithm-induced extremism and polarization. The invasion of our nation’s Capitol was, as Rashad Robinson, president of Color of Change, put it, “the culmination of years of dangerous algorithms.” Stop Hate for Profit explained in a statement that “seeds of hate have already been sown through countless tweets and posts and streams that have burst forth into a violent real-world reality.”

This was not an unexpected event. It was the guaranteed outcome of social media. People were radicalized and incited on social media by their leader's false claims of election fraud, organized a violent mob on social media, and then carried their extremist ideas offline, enacting real-world harm. "[On Jan. 6] I saw what I have been expecting to see for the last several months, even several years," said Whitney Phillips, a misinformation researcher at Syracuse University.

And social media companies cannot plead ignorance. An internal Facebook report leaked last summer acknowledges that the company's algorithms “exploit the human brain’s attraction to divisiveness.” Facebook’s research found that 64 percent of all users who join an extremist group on the platform do so because of the company's own recommendation tools. The report concedes, “our recommendation systems grow the problem.”

“It's as though we have less and less control over who we are and what we really believe,” says Justin Rosenstein, co-inventor of the Facebook "like" button, in The Social Dilemma.

The experts and tech insiders we interviewed in the film warned us about the dire consequences of letting Big Social play God. When Facebook’s former director of monetization, Tim Kendall, was asked in our film what he was most worried about, he replied, “civil war.” At the time that seemed alarmist, but today it feels prescient. The events at the Capitol weren’t just the terrifying closing of the Trump era; they were a symptom of a social fabric shredded by social media. And, while banning Trump from Twitter and Facebook for inciting violence was long overdue, it’s hardly the solution. The problem is in the code, and that’s not going away with Trump.

At the end of The Truman Show, the protagonist faces a choice. He can accept the easy, false reality with which he’s been presented, or courageously choose the truth. We, too, are faced with this dilemma. We can continue to exist in a deluded, distorted, increasingly extreme and polarizing universe, or we can mindfully demand something better.

Our social media puppeteers also have a choice. Will they complacently watch their creation destroy democracies, or will they take responsibility for fixing the hate-filled mess they’ve made?