This fall I will be taking a leave from my job to be a full-time graduate student in CMS at MIT. More on that later. For now, this post lays out the contours of my proposed master’s thesis, both to help me organize my own thoughts and also in the hopes others will help me think about them.
In 2009 a loosely knit group of conservative Diggers founded the Digg Patriots, a highly active “bury brigade.” Hosted in a Yahoo! Group and facilitated by a variety of post-tracking technologies, the Digg Patriots would link each other to what they deemed unacceptably “liberal” posts or posters so that they could team up to “bury” them by downvoting them into obscurity. According to Phoenixtx, a founder of the Digg Patriots, “The more liberal stories that were buried the better chance conservative stories have to get to the front page. I’ll continue to bury their submissions until they change their ways and become conservatives.”
In 2008, a conservative blogger accused “the left” of similarly strategizing to flag conservative YouTube videos as spam or abusive in order to get them taken down. And almost exactly a year ago, links to a U.K. strike site began to be blocked as spam on Facebook under strange and unexplained circumstances.
These incidents differ in important respects, but they share a common dynamic: end users repurposing certain algorithms to remove content from the stream of conversation.
It is my argument that today’s dominant information ecosystem, which has widely distributed the means of information production, has also widely distributed the means of informational removal: as Internet intermediaries have designed and deployed tools to incorporate “social” feedback into quality-assurance algorithms, users have begun to strategically repurpose those tools to silence speech with which they disagree. The goal of my research is to document and define user-generated censorship as an emergent practice in relation to the mediating technologies which enable it.
Why “user-generated censorship”?
For one, it nicely mirrors and invokes user-generated content. Beyond the rhetorical flourish, the invocation has an intellectual purpose: the technological affordances and social practices associated with user-generated content are the same affordances and practices which allow for its opposite. Put more plainly: the design of reddit lends itself to the earnest upvote but also the strategic downvote. The sorts of end-user power and input which characterize social production / Web 2.0 / whatever empower users not only to produce content but also to remove it.
For another, the word “censorship” is controversial and contested, and I am going to try to use that historical weight to hammer home why this matters. Censorship – as opposed to repression – is something that we think of as being an exercise of centralized power. A pope censors. A king censors. Even a local autocrat draws their power ex officio.
But the reason we worry about censorship has nothing to do with the structure of power which enables it, but rather with the results which obtain: the silencing of ideas, of culture, of alternative perspectives.
“Internet censorship” has been done to death in the academic (and popular) literature. But that work addresses an old dynamic in a new medium. One worries about Google in China – or just plain China, or Google alone – because of the power that large centralized authorities can wield over their constituents (and each other).
The Digg Patriots, on the other hand, had no office and no formal power exceeding that of any other individual user. But through their strategic behavior they were able to repurpose the power usually reserved by and for centralized authority toward their own ends.
This is interesting and new and different, I think. Facebook has a lot of centralized power over the links shared in its news feed. It would never, I think, explicitly put content up for a vote: “should we allow people to link to J30Strike?” Nor would it, I believe, allow its engineers to block content with which they politically disagree. But by allowing end users to make a nominally neutral decision (“is this spam?”) and then enforcing that decision with the full power of a centralized network, Facebook – and everywhere else – has effectively delegated the power associated with the center of a network to a subset of the nodes at the edges.
So there is my project as a series of concentric circles. At its core, it is a journalistic enterprise, documenting what I believe to be an emergent dynamic between users and technology. But that dynamic operates within a larger context: not only of why information matters, but also of how this new dynamic constitutes an entirely new configuration of user power in the context of networked social intermediaries.