A fringe messaging board was taken down after the El Paso mass shooting, but if the internet is the modern public square, should everyone have their say, asks Dave McGinn
After this month’s mass shootings in El Paso, Texas, and in Dayton, Ohio (31 people were killed and dozens injured), the online infrastructure and security company Cloudflare terminated its service for the fringe message board, 8chan.
This was because the El Paso gunman had uploaded a manifesto to that website.
Cloudflare is a security mechanism of last resort for websites that would otherwise be vulnerable to cyber attacks from hackers and ‘cyberactivists’, whether for extortion (to hold the company to ransom) or for politics (to bring the website down permanently).
Services like Cloudflare provide a ‘shield’ of sorts, filtering visitors to a website so that it cannot be overwhelmed by traffic and brought down — a reliable defence against distributed denial of service (DDoS) attacks, which flood a website with thousands of simultaneous requests so as to crash it.
This gives Cloudflare the final word on whether a website gets to function or not on the open web.
There is much discussion about the role of social media and the internet as incubators of deviant debate, extremist ideology, and divisive hate and vitriol.
This has been a major PR problem for technology behemoths such as Twitter, Facebook, and Google (YouTube), which are under pressure to curtail this content. The platforms themselves often drive traffic to harmful conspiracy theories and extremism, through their algorithms and recommendation engines.
The debate has become more partisan, as conservative commentators and demagogues (including Donald Trump, the president of the United States) have claimed that the moderation of content has an inherent bias against them, on the basis of ideology, as Silicon Valley tech execs are dominated by ‘liberal elites’.
The latest iteration of the debate has focussed on the prominent role of the larger tech companies within society, and whether they are the modern equivalent of the public square.
The US Supreme Court, in Packingham v North Carolina, unanimously held that states can’t broadly limit access to social media.
The corollary is that by virtue of their size and prominence, platforms such as Facebook and Twitter have an obligation to facilitate the free and unencumbered flow of speech.
The tech giants again find themselves desperately seeking a middle ground. On one side are liberal-leaning activists and commentators, and advocates for minorities and vulnerable groups, highlighting what they claim are purveyors of xenophobia and hatred, against, for example, Muslims and immigrants; on the other side are conservatives, claiming that discussion about important societal issues is being stifled when it deviates from the liberal orthodoxy.
The platforms’ own internal judgements are the basis for these decisions, which is precarious, if you accept the idea that they are the modern incarnation of the public sphere.
Traditionally, scepticism of large corporations that wield too much power is the purview of the political left, but not in the context of the internet culture wars, as right-wing demagogues are ‘purged’ from online platforms.
Conversely, conservatives in the US have traditionally championed the private realm and promoted the virtues of market dynamics to resolve such disputes, yet, nowadays, the argument against Big Tech is more regularly articulated by those on the right. A cynic might conclude that both sides are being disingenuous and self-serving.
On this issue, you can find strange bedfellows. US-based Irish cultural critic Angela Nagle, a leftist in the tradition of UK Labour party leader, Jeremy Corbyn (Nagle has been exiled from the progressive movement for deviating from certain liberal shibboleths), has argued that banning someone from a social platform “would be like banning a dissident figure from using the post office or a bank,” and that we are “allowing a few global monopolies to have absolute control over what information is accessible to the world.”
She has found common ground with Fox News presenter Tucker Carlson, whose show routinely hosts guests and airs segments that drive anti-immigrant narratives and conspiracy theories; he is regularly cited as a mainstream proponent of the white replacement theory, which feeds into the paranoia from which white supremacists draw their motivation.
In a recent editorial on his show, Carlson argued that by “censoring” dissenting voices on Facebook, for example, “left-wing tech companies control the political debate.”
The platforms’ policies are largely being defined in response to each novel situation that arises, a high stakes challenge when political candidates and elected representatives are involved.
English Defence League founder and former MEP candidate, Tommy Robinson (real name Stephen Yaxley-Lennon), recently had his YouTube account neutered — but not deleted — amid claims of hate speech.
The channel was demonetised, live streaming was disabled, and it was removed from search results, effectively rendering it a glorified video-hosting server (critics who responded include Labour MP Yvette Cooper, who said that YouTube are a “complete disgrace” for failing to block the content entirely).
Twitter, for its part, concluded in a blog post that “blocking a world leader from Twitter, or removing their controversial tweets, would hide important information people should be able to see and debate.”
In Ireland, former presidential aspirant Gemma O’Doherty has recently had her YouTube account deleted, because of claims of xenophobia, prompting a modest, but enduring, protest outside Google HQ on Dublin’s Barrow St.
For their part, exiles from the mainstream platforms, and their sympathisers (including free speech absolutists, who don’t necessarily align with them politically), have had to come up with creative solutions to what they see as encroachment on the open exchange of ideas.
This has resulted in alternative platforms, including Twitter clone Gab, whose slogan is, “a social network that champions free speech, individual liberty, and the free flow of information online,” and which has become a magnet for anti-Semites and racists.
Gab, having been temporarily shut down by its hosting provider, recently rebuilt its service on Mastodon, a decentralised, ‘federated’ social networking software. Psychology professor and self-styled free speech defender, Jordan Peterson, is also planning to launch a new platform, ThinkSpot, “an intellectual playground for censorship-free discourse,” which is sure to be popular.
There have also been attempts to recover revenue streams after they are cut off from a user’s channel — often YouTube’s first sanction for contentious content.
This was initially an attraction of using a third-party service like Patreon, where patrons can support content creators for what they post on mainstream platforms, while still maintaining independence from them.
However, Patreon, too, has fallen out of favour with those on the right, after shutting down the accounts of certain controversial figures, such as Sargon of Akkad and Milo Yiannopoulos, for hate speech, inspiring an exodus of conservative users from the service.
Digital cryptocurrency Bitcoin is ubiquitous in particular online groups, and it is resilient, in the sense that it cannot be ‘switched off’, but it is cumbersome and, as a means of payment, neither regular nor reliable.
More recently, software solution Entropy was launched as a kind of ‘wrapper’ around YouTube live streams, claiming to facilitate much of the same functionality that YouTube disables for contentious accounts — including user donations.
What the Cloudflare decision reveals is that the question is not just whether a specific account is shut down on a particular platform; the very plumbing of the internet is in the hands of private technology companies.
Most people will not weep over the plight of 8chan, but it does put the situation into focus.
And there are a number of layers involved in hosting content such as the El Paso murderer’s manifesto — and, arguably, as many layers of culpability in facilitating the dissemination of hateful propaganda.
Server hosts, security providers, domain registrars, payment companies, crowdfunding platforms, and myriad other discrete service providers all conspire to bring you the websites that you read every day — and the divisive hate that you probably don’t — and they can take it away, too.
Despite a lot of hand-wringing, legislators have so far failed to regulate social platforms — while many human rights and privacy advocates query whether they should at all — so, in the meantime, private technology companies remain the arbiters of acceptable content on the internet.
This creates something of an arms race, as popular service providers — whether for narrow commercial reasons or an unwavering commitment to basic decency — offload users that they allege promote hate and division, and the users seek, and usually find, alternatives.
While Cloudflare is the de facto standard for website defence, there are competitors, many of whom are sure to be less scrupulous (or more impervious to public pressure, depending on your perspective), so 8chan will be back online soon enough, though it remains to be seen for how long.