April 20, 2024

The Complex Debate Over Silicon Valley’s Embrace of Content Moderation

Ellen Pao, the former chief executive of Reddit, the freewheeling message board, publicly rebuked her former company. She said it was hypocritical for Reddit's current chief executive, Steve Huffman, to signal support for the Black Lives Matter movement, as he recently did in a memo, while the site continued to host the main Trump fan forum, The_Donald, where inflammatory memes often circulate.

“You should have shut down the_donald instead of amplifying it and its hate, racism, and violence,” Ms. Pao wrote on Twitter. “So much of what is happening now lies at your feet. You don’t get to say BLM when reddit nurtures and monetizes white supremacy and hate all day long.”

A hands-off approach by the companies has allowed harassment and abuse to proliferate online, Lee Bollinger, the president of Columbia University and a First Amendment scholar, said last week. So now, he said, the companies have to grapple with how to moderate content and take more responsibility without losing their legal protections.

“These platforms have achieved incredible power and influence,” Mr. Bollinger said, adding that moderation was a necessary response. “There’s a greater risk to American democracy in allowing unbridled speech on these private platforms.”

Section 230 of the federal Communications Decency Act, passed in 1996, shields tech platforms from being held liable for the third-party content that circulates on them. But as the companies take a firmer hand with what appears on their platforms, they risk seeing that protection come under threat, largely for political reasons.

Article source: https://www.nytimes.com/2020/06/05/technology/twitter-trump-facebook-moderation.html
