September 20, 2017

On Technology: How Hate Groups Forced Online Platforms to Reveal Their True Nature

The recent rise of all-encompassing internet platforms promised something unprecedented and invigorating: venues that unite all manner of actors — politicians, media, lobbyists, citizens, experts, corporations — under one roof. These companies promised something that no previous vision of the public sphere could offer: real, billion-strong mass participation; a means for affinity groups to find one another and mobilize, gain visibility and influence. This felt and functioned like freedom, but it was always a commercial simulation. This contradiction is foundational to what these internet companies are. Nowhere was this tension more evident than in the case of Cloudflare, a web-infrastructure company. Under sustained pressure to drop The Daily Stormer as a client, the company’s chief executive, Matthew Prince, eventually assented. It was an arbitrary decision, and one that was out of step with the company’s stated policies. This troubled Prince. ‘‘I woke up in a bad mood and decided someone shouldn’t be allowed on the internet,’’ he wrote in an email to his staff. ‘‘No one should have that power.’’

Online platforms offered unprecedented freedoms for their users — but these freedoms could be taken away at any moment, for any reason. Credit: Illustration by Jon Han

Social platforms tend to refer to their customers in euphemistic, almost democratic terms: as ‘‘users’’ or ‘‘members of a community.’’ Their leaders are prone to statesmanlike posturing, and some, like Mark Zuckerberg, even seem to have statesmanlike ambitions. Content moderation and behavioral guidelines are likewise rendered in the terms of legal governance, as are their systems for dispute and recourse (as in the ubiquitous post-ban ‘‘appeal’’). Questions about how platforms like Twitter and Reddit deal with disruptive users and offensive content tend to be met with defensive language invoking free speech.

In the process of building private communities, these companies had put on the costumes of liberal democracies. They borrowed the language of rights to legitimize arbitrary rules, creating what the technology lawyer Kendra Albert calls ‘‘legal talismans.’’ This was first and foremost operationally convenient or even necessary: What better way to avoid liability and responsibility for how customers use your product? It was also good marketing. It’s easier to entrust increasingly large portions of your private and public life to an advertising and data-mining firm if you’re led to believe it’s something more. But as major internet platforms have grown to compose a greater share of the public sphere, playing host to consequential political organization — not to mention media — their internal contradictions have become harder to ignore. Long before Charlottesville, they had already become acute.

In a bracing Vice documentary about the rally, a man identified as a writer for The Daily Stormer told the reporter Elle Reeve, ‘‘As you can see, we’re stepping off the internet in a big way.’’ He saw the turnout as confirmation that what he’d been a part of online was real. ‘‘We have been spreading our memes, we’ve been organizing on the internet, and so now they’re coming out,’’ he said, before digressing into a rant about ‘‘anti-white, anti-American filth.’’ This sentiment was echoed in active and longstanding far-right communities on Reddit and 4chan and adjacent communities on Facebook and Twitter.

It is worth noting that the platforms most flamboyantly dedicated to a borrowed idea of free speech and assembly are the same ones that have struggled most intensely with groups of users who seek to organize and disrupt their platforms. A community of trolls on an internet platform is, in political terms, not totally unlike a fascist movement in a weak liberal democracy: It engages with and uses the rules and protections of the system it inhabits with the intent of subverting it and eventually remaking it in its own image or, if that fails, merely destroying it.

But what gave these trolls power on platforms wasn’t just their willingness to act in bad faith and to break the rules and norms of their environment. It was their understanding that the rules and norms of platforms were self-serving and cynical in the first place. After all, these platforms draw arbitrary boundaries constantly and with much less controversy — against spammers, concerning profanity or in response to government demands. These fringe groups saw an opportunity in the gap between the platforms’ strained public dedication to discourse stewardship and their actual existence as profit-driven entities, free to do as they please. Despite their participatory rhetoric, social platforms are closer to authoritarian spaces than democratic ones. It makes some sense that people with authoritarian tendencies would have an intuitive understanding of how they work and how to take advantage of them.

This was also a moment these hate groups were anticipating; getting banned in an opaque, unilateral fashion was always the way out and, to some degree, it suits them. In the last year, hard-right communities on social platforms have cultivated a pre-emptive identity as platform refugees and victims of censorship. They’ve also been preparing for this moment or one like it: There are hard-right alternatives to Twitter, to Reddit and even to the still-mostly-lawless 4chan. There are alternative fund-raising sites in the mold of GoFundMe or Kickstarter; there’s an alternative to Patreon called Hatreon. Like most of these new alternatives, it has cynically borrowed a cause — it calls itself a site that ‘‘stands for free speech absolutism’’ — that the more mainstream platforms borrowed first. Their persecution narrative, which is the most useful narrative they have, and one that will help spread their cause beyond the fringes, was written for them years ago by the same companies that helped give them a voice.

John Herrman is a David Carr Fellow at The New York Times.

A version of this article appears in print on August 27, 2017, on Page MM18 of the Sunday Magazine with the headline: Online platforms annexed much of our public sphere, playacting as little democracies — until extremists made them reveal their true nature.

Article source: https://www.nytimes.com/2017/08/21/magazine/how-hate-groups-forced-online-platforms-to-reveal-their-true-nature.html?partner=rss&emc=rss