Reflections On Facebook vs J30Strike

by chris on Jun.20, 2011, under general

Earlier today I posted about a current crisis of censorship on Facebook.

Background:

  • Citizen activists created J30Strike.org, a website advocating peaceful demonstration against austerity measures in the U.K. on June 30th
  • Sometime in the last 48 hours or so, anyone who tried to post or share a link to J30Strike on their Facebook account was blocked from doing so, receiving instead an error message informing them that the site had been reported as abusive or spam
  • I discovered the block through word of mouth – primarily people noting it on Facebook, evading the prohibition by spelling out J 30 STRIKE DOT ORG – and wrote a blog post about it
  • Soon after that, people trying to share my blog post on Facebook began receiving error messages as well, marking my link as abusive or spam
  • Somewhere in here, all known redirect links to the site (bit.ly, TinyURL, etc.) stopped working as well
  • Posting links to J30Strike began intermittently working again for at least some people this evening, after many formal reports submitted to Facebook and at least one call from a journalist

What’s the deal? Well, there are two possible explanations, and they are both troubling, albeit in different ways.

The first explanation is the good old-fashioned corruption of shadowy censors within Facebook. Very straightforward. Someone at Facebook is a fan of austerity measures and doesn’t like movements like J30Strike, so they add it to the spam system as a pretext, with the intent of keeping people from learning about the site.

This scenario, while admittedly conspiratorial, is somewhat more plausible than you may think. Peter Thiel, a Facebook board member known to guide the company in many ways, is a well-known ultra-libertarian who, I think it’s safe to say, supports austerity measures. And, as I mentioned in my previous post, Zuckerberg and David Cameron were last seen chumming around via videochat, talking about ways to solicit cuts from the Facebooking public.

But I actually don’t think that’s what happened here. Or at least, I’m willing to believe that there is another explanation, despite the shady pattern of blocking and unblocking that occurred this afternoon. And it’s not because I put it past Facebook to do something this scummy – lord knows I’ve called them worse before.

No – there is a second explanation that is both more benign and believable in its cause and just as terrible in its consequences.

See, I think it is more likely than not that Facebook was on the level. No grand conspiracy involving cryptocorporate machinations. Sure, maybe a few ticked-off Tories hit the “Report” button on J30Strike with the intent to get it kicked. But the censorship itself was all the product of automated technology. The initial site gets reported and kicked. Facebook has smart technology to sniff redirects to their destination and then block them as well. And since the preview text that the Facebook Share functionality pulls automatically from the page – the grey excerpt beneath the link – included the forbidden URL, my site got banned too.
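To make the cascade concrete, here is a minimal sketch of how a filter like this could propagate a single blacklist entry to shorteners and to posts that merely quote the banned URL. This is purely illustrative: none of it is Facebook’s actual code, and every name (the blacklist, the redirect table, the functions) is invented for the example.

```python
# Hypothetical sketch of automated blacklist propagation.
# All names and data here are invented for illustration; this is
# not Facebook's actual system.

BLACKLIST = {"j30strike.org"}

# Simulated redirect table standing in for bit.ly / TinyURL lookups.
REDIRECTS = {
    "tinyurl.com/abc": "bit.ly/xyz",
    "bit.ly/xyz": "j30strike.org",
}

def resolve(url, max_hops=5):
    """Follow redirects to the final destination, as a filter
    that 'sniffs redirects' would."""
    while url in REDIRECTS and max_hops > 0:
        url = REDIRECTS[url]
        max_hops -= 1
    return url

def is_blocked(url, preview_text=""):
    """Block a share if the resolved URL is blacklisted, or if the
    scraped preview text quotes a blacklisted domain."""
    if resolve(url) in BLACKLIST:
        return True  # direct hit, or a shortener resolving to a hit
    # The preview text is the grey excerpt scraped from the page;
    # if it contains a banned URL, the sharing post trips the filter too.
    return any(domain in preview_text for domain in BLACKLIST)

print(is_blocked("tinyurl.com/abc"))  # a shortener chain to the banned site
print(is_blocked("example.com/commentary",
                 "readers evaded the block by spelling out j30strike.org"))
print(is_blocked("example.com/unrelated", "nothing to see here"))
```

Under these assumptions, three rules suffice to produce everything observed: the site itself blocked, every shortener blocked, and commentary about the block swept up with it, with no human intent required past the first report.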

I think this is probably what happened. And while it may seem acceptable, it troubles me all the more.

We often conflate intent with outcome. The intuitive argument goes something like this: if Facebook didn’t intend to commit censorship – if this was all just a set of dominoes toppled by a few disingenuous reports – then it can’t be criticized for the automated authoritarianism that occurs.

Call me a consequentialist, but I’m not OK with that.

Facebook is increasingly the space within which people receive their information, including civic information. Shared newspaper links, blog posts, and conversations in the comments may not intuitively accrue as much respect as the Federalist Papers, but they are at least as important in the public discourse as the proverbial crier on the common was generations ago.

But where once there was a town common, there is now a walled garden, and the architecture of this enclosure threatens to throttle the pamphleteer before he so much as primes his printing press.

Assume the best of intentions on the part of Facebook for a moment. Look at what still happened.

A site advocating citizen activism vanishes from the face of Facebook. Then, alternate routes – the redirects – are shut down. After crushing the conversation, Facebook then successfully silenced the metaconversation – that is, posts like mine, commenting on the controversy, which merely linked to the content. Users had to resort to guerrilla tactics – coded, deceptive transformations of the domain name – in order to spread the word slowly amongst the community.

Think about that for a moment. Think about the incredible, suffocating centralized power the Facebook filter represents to controversial opinions. Had this happened in a traditional public forum, banning truly offensive or abusive material would have had to survive the strict scrutiny of a skeptical judge. But on Facebook – merely a mediated public – presumably a few reports were enough to simply disappear (used as a verb; gulag connotations intended) an entire movement. And it only came back because enough people – again, including at least one journalist who called Facebook to investigate – knew about it already. Had those avenues not been open, J30Strike on Facebook would have simply succumbed to a kind of automated crib death.

That scares the living hell out of me – just as much as a conservative cabal at the helm of a conspiracy.

We need to find out what happened here. We need to know how it was that a nonviolent, democratic demonstration was denied entry into the civic conversation. And we need Facebook to eradicate this Terminator-style automated censorship from its architecture.

If the protests in the Middle East taught us anything, it is that these digital spaces can facilitate real action. In this, Facebook cannot be Janus-faced, humbly accepting accolades for its role in toppling tyrants while simultaneously silencing citizens at home.

We’re far past the point where Facebook may acceptably discount or defer its moral responsibilities. We’re far past the point where they may plausibly claim to be a simple amoral actor in the social space. They are, by virtue of their own incredible success, co-consuls (with Google) of the world’s greatest information empire. We deserve, and must demand, their greatest care when it comes to matters of civic expression. Because the freedom of speech is too important to be merely automated out of existence.

8 comments for this entry:
  1. Facebook Censors Citizen Activism Website - Chris Peterson

    [...] e7: Mother Jones has a story up about this, and I have some additional reactions as well. [...]

  2. Facebook vs Ebert - Chris Peterson

    [...] chris on Jun.21, 2011, under general On the heels of the J30Strike fiasco, Facebook has turned its auto-censor cannon at…Ebert? For movie critic Roger Ebert, it took [...]

  3. Facebook automates suppression of freedom of speech? | infinite ideas machine

    [...] fascinating reflection by Chris Peterson on the on-off-on-off blocking by Facebook of “citizen activist” [...]

  4. The downside of Facebook as a public space: Censorship | TechDiem.com

    [...] not just taking down pages that Facebook users are concerned about: According to a blog post from one of the organizers of a recent public anti-government protest in Britain, a number of users reported that Facebook not only blocked them from linking to a website set up by [...]

  5. Why censorship on Facebook matters | Interchange Project

    [...] Facebook is the new town hall, its censorship is really troubling: Facebook is increasingly the space within which people receive their information, including civic [...]

  6. Facebook union bans: Three strikes and you’re out | johninnit

    [...] Chris Peterson has a very interesting post on what he thinks are the mechanics behind this one, and it seems very plausible indeed. It looks [...]

  7. [Kiosk] Facebook: a public space with a private police (Internet Actu) | l'Arène

    [...] about ten days ago when film critic Roger Ebert's Facebook page disappeared, and when a group of British activists saw its content blocked. Who watches the watchmen? asks Mathew [...]

  8. Facebook’s “Groups for Schools”: A Harmless Example of a Scary Trend - Chris Peterson

    [...] all of these cases, when the algorithms start failing, we can experience serious problems. Chris documented a case where links to a blog post of his were blocked for abuse on Facebook. The blog post in question [...]
