
Facebook is in a world of hurt. It may not survive. (Fingers crossed!)

Scott Shepard


Last week I pointed out that the Wall Street Journal’s revelation of a secret whitelist of favored Facebook users underscored the pressing need for Facebook to hire a raft of center/right employees, not just as a matter of fairness or even of sensible treatment of customers, but for its own survival. If its management, along with befuddled censorship steward Nick Clegg, really believed that objectively fair censorship had been achieved because conservatives were demanding less of it and liberals more, then Facebook bid fair to make everyone its enemy.

That Journal story, though, was just the first of a series. Stories over the remaining days of the week identified Facebook as a giant deceit factory. Readers discovered that Facebook knows that Instagram (its gateway drug, erm, app) traumatizes the most easily traumatized: lonely teenagers. They found out that Facebook can’t be bothered to hire local-language speakers to keep human traffickers and drug cartels off its site in developing countries (where, mind, its big user growth is). They read that the company’s algorithms hype up the crazy, because it’s the crazy that keeps people engaged – and that these algorithms made it hard for lil’ Dead Eyes Zuckerberg himself to lead a vaccination-and-lockdown push as effectively as he had wished.

Pundits immediately started making comparisons between Facebook and Big Tobacco, which for years continued to assert that cigarette smoking was fine even as its internal studies showed that it carried all the risks that pretty much everyone since the 1920s had suspected it did. And so, the thinking runs, Facebook (and Big Tech or Big Social Media generally) may be headed for the same heavy regulation, restriction and mulcting by state attorneys general to which the tobacco industry has been subject.

But for Facebook (and, if the fates allow, Twitter), the case may be very much worse. Big Tobacco, after all, had dropped any claims about the health benefits of its products, such as cold prevention and “T-zone soothing,” about the same time that Geritol stopped claiming to cure tired blood – long, long before the ’90s treasury raid. The industry wasn’t a monopoly. And it didn’t design its cigarettes so that the inhalation of tobacco smoke rendered politically disfavored users mute in the public square.

For Facebook, none of this is true. It always had the option of just letting everyone post anything they wanted – which is what was envisioned (and the limit of technical possibility) when the Communications Decency Act (CDA) was passed in 1996, when all was good in the world. But instead it waded into the censorship business, promising that “trust and safety” would result from its efforts.

And yet now we find that Facebook’s promises were guff. If “trust and safety” were the sole (or even primary) purpose of Facebook censorship, then there could be no justification for excluding long lists of the privileged from any effective review. Thoughtful people are already wondering whether the whitelists alone remove Facebook’s censorship from the protection of section 230 of the CDA, which protects companies from being “held liable on account of … any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” Does exempting the favored abjure good faith? That’s a nice question about which reasonable disagreement is possible, but the company’s problems only start there.

Facebook’s potential assertions of good faith become radically less plausible (and so its section 230 protection more tenuous) when we consider that Facebook is not censoring vast categories of posts and messages that it knows and acknowledges to be harmful to its users, such as those Instagram feeds. They deteriorate still further with the recognition that Facebook instead focuses its limited censorship efforts on stopping people from posting demonstrably true things about, say, the Covid virus and vaccines, and on stopping civil dissent in some – but only some – instances, with the distinction depending on the personal political biases of its staffers. (A staff as ideologically diverse as the country it censors, one that could avoid the worst of the bias on display every day in Big Tech, sure would help to avoid this pitfall.)

But any wisp of a claim to good faith disappears entirely when all these revelations are tied together. Facebook’s own research tells it that intensive use by teens is bad for them, and that being locked in their homes away from their friends increases both that use and the harms that flow from it. Facebook’s response is to censor efforts to end lockdowns and materials that demonstrate that hyper-panicked understandings of and responses to the illness are misguided. Meanwhile, Zuckerberg himself actively colludes with Chief Panic Monger Anthony Fauci to deny Americans constitutional rights.

It’s hard to imagine worse faith than that.

Once section 230 protections have been stripped away, Facebook will be liable to suit every time it labels truth, or opinion (which is not susceptible to such a description), as “misinformation.” Calling someone a liar for saying true things is libel: a false impugning of character. Facebook might have a New York Times v. Sullivan defense if the party it thus libels were a public figure, but as we recently discovered, public figures are exactly the people whom it exempts from censorship. So precisely the people Facebook censors will be able to sue for defamation, and Facebook’s own whitelisting policy will have identified them as people too unimportant to qualify as public figures.

Good work, Zuck.

To paraphrase Emperor Hirohito, things have developed not necessarily to Facebook’s advantage.

 

Scott Shepard is a fellow at the National Center for Public Policy Research and Director of its Free Enterprise Project. This was first published at Townhall Finance.
