Op-Ed: The content moderation and coordination conundrum for Big Tech


Underlying most public policies (good or bad) are the best of intentions. However, on problems as severe as online child sexual abuse material (CSAM), good intentions are no excuse if the policy makes the situation worse. The recent and bipartisan crusade against big tech, fueled by various agendas, is making the fight against CSAM much more difficult.

At a recent Senate Judiciary Committee hearing, Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey faced all sorts of questions and criticisms, often contradictory. In general, Republicans seem to think online services are moderating content to such an extent that they are engaging in systemic “censorship” of conservative viewpoints. Meanwhile, Democrats seem to think the same services aren’t doing enough to stop the spread of harmful or misleading content.

One thing both sides agree on, thankfully, is the need to combat CSAM. There is bipartisan legislation on the matter in the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act. Unfortunately, the bill is severely flawed in several areas. One part of the bill does, to a certain extent, get at the right way to tackle CSAM.

The EARN IT Act creates a body where tech companies can coordinate and exchange best practices on combating CSAM. Apart from the other non-starter issues in the bill, the basic idea of getting people to work together to effectively protect children online, and perhaps combat other harmful material, is unobjectionable. If you see something, say something, right?

Or so one would think.

During the mid-November hearing, Sen. Josh Hawley (R-Mo.), an original sponsor of the EARN IT Act, accused Facebook and Twitter, along with Google, of working together to “coordinate censorship.” Hawley tweeted, “Whistleblower says Twitter and Google routinely suggest censorship topics - hashtags, individuals, websites, many of them conservative – and Facebook logs them for follow-up on Tasks. But [Mark Zuckerberg] REFUSES under oath to turn over list of Twitter or Google mentions on Tasks[.]”

Despite his intention to protect the speech of users of online services like Facebook and Twitter, the senator's line of attack is unhelpful and, at best, constitutionally problematic.

Given the immense pressure from both sides of the aisle to combat things like CSAM, foreign propaganda, violence, and other harmful content, we should expect a significant degree of coordination between tech firms. If something awful slips through the cracks of one system, then all the companies in the industry could pay the price through reputational, regulatory, or legal repercussions. Section 230, a law that provides some liability protection for online services like Facebook and Twitter for what their users post, has taken a lot of flak of late, including at the most recent hearing. However, the law does nothing to insulate these firms from criminal law.

Aside from the fact these companies are run by people with beating hearts and consciences, there is a real incentive for tech firms to work together against harmful content. There’s nothing necessarily nefarious about it at all. Again, this coordination is reasonable to such an extent that a bipartisan group of senators is attempting to codify a version of it.

But because of the huffing and puffing by folks such as Sen. Hawley, expect less of this coordination going forward, at least on a voluntary basis. In addition to Sen. Hawley’s criticism of coordination on content moderation, Facebook and Google are both the focus of significant antitrust scrutiny, where coordination is a very dirty word. This is awful news for the fight against CSAM and heinous online content.

Adding insult to injury is the fact that many of the content moderation decisions on which companies coordinate, and for which they are now being criticized, are constitutionally protected. While CSAM is rightfully illegal due to the unquestionable harm it causes, plenty of other material is not as clear-cut, and is therefore legal, yet most users do not wish to see it and web services do not want to be affiliated with it.

Like it or not, it is just as legal for private companies to disassociate from terrorist propaganda as it is for them to refuse to host certain mainstream political viewpoints. The First Amendment is not about protecting speech and decisions on which everyone agrees.

This is all to say that even if companies are engaged in biased moderation practices, it is pretty clear that this is just none of the government’s business. Pretending it is and threatening companies, through everything from antitrust action to Section 230 reform, risks hampering their ability to work together to fight the things on which we all agree and for which the First Amendment offers no protection, such as CSAM.


