
State of the Art: Tech Companies Like Facebook and Twitter Are Drawing Lines. It’ll Be Messy.

To combat false news, Facebook has partnered with dozens of fact-checking organizations around the world. It limits the spread of stories that have been deemed false by ranking those posts lower in users’ News Feeds, and it displays more accurate articles alongside ones that aren’t.

Andrew McLaughlin, a former head of policy at Google who now runs an incubator that aims to build technology for progressive political movements, said he was impressed by Facebook’s efforts.

“I think I’m representative of a certain crowd of people who once took a really strong sense of pride in the sturdiness of our commitment to free speech on internet platforms,” he said. “But my views have certainly shifted in the caldron of experiences — and I am now glad that platforms like Facebook are really focusing resources and energy on malicious, manipulative propaganda.” (He previously consulted for Facebook, but is not currently working for the company.)

But I’m less sanguine, because there’s a lot we still don’t know about these policies and their effects.

One lingering question is political neutrality. Facebook has been targeted by conservatives who argue — without much evidence except for the fact that Silicon Valley is a liberal cocoon — that its efforts to police speech might be biased. In response, Facebook has invited Jon Kyl, a former Republican senator, to audit the company for bias against conservatives. Liberals, meanwhile, have argued that Facebook, in refusing to ban right-wing conspiracy factories like Alex Jones’s Infowars, is caving to the right.

I asked Ms. Bickert if Facebook takes potential political repercussions into account when deciding its policies. She told me that her team “seeks input from experts and organizations outside Facebook so we can better understand different perspectives and the impact of our policies on global communities.”

That’s gratifying, but it doesn’t get to the heart of the problem: Facebook is a for-profit corporation that, for both regulatory and brand-image reasons, wants to appear politically unbiased. But if it determines that some political actors — say, the alt-right in the United States, or authoritarian dictators elsewhere — are pumping out more false news than their opponents, can we count on it to take action?

Article source: https://www.nytimes.com/2018/07/25/technology/tech-companies-facebook-twitter-responsibility.html?partner=rss&emc=rss
