Facebook creates a fork-in-the-road moment for Trump — and the rest of us
A mixed decision by Facebook's oversight board has implications for social media firms shaping our democracies
We've arrived at a consequential moment for Donald Trump, his political power, and his potential ambitions for an electoral comeback.
It's also consequential for all of us.
A group created by Mark Zuckerberg to be the so-called Supreme Court of Facebook just handed down a three-dozen-page ruling with potentially far-reaching implications for the rules of communication in modern democracies.
The short takeaway from Wednesday's decision: Facebook was right to ban Trump from its platform for promoting the Jan. 6 storming of the U.S. Capitol, and the decision followed international human rights law.
Yet the decision also found fault with Facebook on two important fronts.
It said the company failed to clearly define the length of Trump's punishment and urged it to declare, within six months, whether and when Trump might be reinstated.
It also faulted Facebook for not examining its own role in fostering unrest.
Facebook refuses to discuss own role
The oversight board wanted to investigate whether Facebook's own algorithms might have promoted the spread of misinformation and extremist content about the 2020 election.
But the company refused to share that information, which made clear not only its aversion to the topic, but also the limited power of its oversight body and how it differs from a real court.
History offers sobering examples of how innovations in communications technology, from the printing press to the telegraph, can have a dual effect.
On the one hand they democratize information, with more of it flowing more readily to more people. On the other: They occasionally trigger instability and upheaval.
In a coincidence of timing, on the same day that this oversight board faulted Facebook over election misinformation, officials in Arizona were still busy sifting through ballots from the last election. They're searching for bamboo fibers in the paper because one of the many baseless election conspiracy theories that proliferated online claims tens of thousands of fraudulent ballots shipped from Asia were stuffed into ballot boxes.
Social media's role already promises to become a policy issue in the next U.S. election.
It highlights two emerging and clashing attitudes toward how big tech might be regulated; expect to hear more about them in the 2022 congressional midterms.
Two clashing visions on regulation
Republicans call it a free speech issue and vow to crack down on tech companies if they regain congressional power next year.
It's already begun: Republican state lawmakers in Florida and Texas have advanced bills that would fine tech companies, or expose them to lawsuits, for silencing political speech.
Facebook is more interested in acting like a Democrat Super PAC than a platform for free speech and open debate. If they can ban President Trump, all conservative voices could be next. A House Republican majority will rein in big tech power over our speech.
—@GOPLeader
On the other side of the aisle, many Democrats, and tech critics, want regulation targeted elsewhere: algorithms that they say push lies and extremist content, poisoning democracies in the process.
One analyst believes Facebook's company-appointed board is not the ultimate answer to this modern policy dilemma.
The reason Facebook created the board, said Jameel Jaffer, is to prove it can govern itself and doesn't need the thing it fears above all else: regulation.
He credited board members for doing a decent job under the circumstances.
"I think it's a thoughtful decision. And a completely defensible one," said Jaffer, director of the Knight First Amendment Institute at Columbia University.
What Facebook means to Trump
The person most directly affected by the ruling is the 45th president of the United States.
Being on Facebook matters to Trump — regardless of whether he chooses to run for president again.
It's a source of political power, a megaphone that amplifies his voice and a magnet for raising money.
Trump's now-inactive Facebook page has 35 million followers. To put that in context: it's 10 times the audience of the most-watched prime time show on Fox News, Tucker Carlson's; 30 times the number of people listening when Trump calls into his favourite morning TV show, Fox & Friends; and 60 times the audience of the most popular show on the right-wing Newsmax network.
Trump has lost access to that account, to his Twitter account with nearly as many followers, and to YouTube.
He's now resorted to communicating with the public by emailing a subscriber list, delivering each day the sorts of messages he'd usually be posting online.
Harder to reach audiences
Trump's messages aren't being heard as often lately, according to one pollster.
Morning Consult surveys found 56 per cent of Republican respondents were at least slightly aware of Trump blasting Senate Republican Leader Mitch McConnell in February; only 39 per cent had heard of a similar statement in March; and just 30 per cent in April.
This could start mattering closer to next year's midterms.
Trump's endorsements have the potential to make and break candidacies in primary contests given his towering popularity among Republican voters.
Look no further than a power struggle playing out this week in Washington for an example of his power to crush, or elevate, people within the Republican Party.
Wyoming Rep. Liz Cheney risks being turfed from the party's congressional leadership.
That's not because she votes against party policy or is insufficiently conservative; her voting record shows support for her party and its supposed ideology.
The reason Trump and his allies want her gone is that she keeps criticizing him.
"A warmongering fool," is what Trump called Cheney in an emailed statement. In different times, he might have tweeted and posted this view on Facebook.
The former president has a preferred replacement: Elise Stefanik, one of the most liberal elected members of the Republican Party, who is heavily involved in cross-border issues as her upstate New York district touches the Canadian border.
But these days she has a reputation for going all-out for Trump, during his impeachment proceedings and his effort to overturn the 2020 election.
Which brings us back to the influence of Facebook and the new reality of elections in the social media era.
One professor of speech law says he's not sure Facebook's oversight board, which includes a former prime minister of Denmark, should be leaning on the principles of international human rights law to guide speech decisions affecting U.S. politics.
The Board has upheld Facebook’s decision on January 7 to suspend then-President Trump from Facebook and Instagram. Trump’s posts during the Capitol riot severely violated Facebook’s rules and encouraged and legitimized violence. https://t.co/veRvWpeyCi
—@OversightBoard
Just like Canadians might want to have their political issues settled by Canadians, Americans might be angry that an international board is making recommendations about who gets banned from Facebook, said Eugene Volokh, a libertarian-leaning professor at UCLA.
"Imagine that, in 2024, Trump runs again, but is banned again," he said in an interview.
"And [imagine] the election goes on with [only] one candidate who is allowed to use this platform that is tremendously important to reaching out to voters ... I'm not sure that American voters would perceive that as a free and fair election."
He said he usually opposes regulatory heavy-handedness but sees an argument for regulating online platforms like phone companies or parcel-delivery services, to prevent discrimination.
Jaffer said the real scrutiny belongs elsewhere — particularly on how social media giants design their algorithms, and the effect they're having.
"Facebook's algorithms shunt people into echo chambers. They often have the effect of amplifying misinformation," he said.
"Facebook's political ad policies also insulate people from counter-speech — in other words, insulate people from views that are different from their own. And so Facebook, too, bears some responsibility here."
With files from Susan Ormiston