There comes a time in the life of every business, organization, neighborhood, e-commerce site, beer league or gathering — in the life of any collective human endeavor — when it becomes clear that rules are needed. If we didn’t need structures to discourage cheating, lying, selfishness, uncomfortable behavior, bad-faith actions, or outright criminality, we’d be living in a utopia. But we’re not.
Social media has reached such a pivotal moment in its brief history: it has become imperative that more rules be created and enforced. The breaking point was the suspension of President Donald Trump from Facebook and Twitter in January, after he shared messages describing the Capitol rioters as "very special" and urging people to "remember this day forever."
A Facebook-appointed independent oversight board had been pondering this unprecedented suspension and on Wednesday issued the pronouncement that the social media giant was right to temporarily suspend Trump for "maintaining an unfounded narrative of electoral fraud and persistent calls to action," which created "an environment where a serious risk of violence was possible."
But this breaking point was one Facebook and other social media companies had long been approaching while avoiding a reckoning. Social media, for all its connective good, has also become a magnet for bullying, misinformation, attention addiction, and a loss of civility, and it has opened doors to discrimination and violence, in politics and beyond.
That is why Congress and the federal courts should finally get serious about regulating social media behemoths. That starts with amending Section 230 of the 1996 Communications Decency Act, which gives internet companies legal immunity from liability for content posted by their users. The risk of losing big money tends to make corporate behavior more responsible.
The oversight board's 12,000-word critique highlighted the company's arrogance in refusing to explain how the decision about Trump was made and whether financial or political considerations came into play. But in the end, it punted the matter back to Facebook, giving the company six months to reexamine Trump's suspension and decide whether the ban should be permanent.
We can argue about what the rules should be and how, for example, posts containing bad-faith, uncorrected lies might be flagged or deleted, particularly on accounts with big followings. There can be debates about how many strikes should result in temporary and then permanent account suspension. Certainly, this isn't easy. Maybe some social media companies will decide they don't want to bother with real community standards, and that's a business decision that may draw or repel certain users.
But for companies with the global dominance and scale of Facebook, users deserve better rules to keep the platforms safe and to stop the spread of disinformation. Love or hate Trump, Facebook's murky decision about its standards, and how they were applied to the former U.S. president, reveals that our problems are with Facebook itself.
— The editorial board