Highlighted Selections from:

The Virtues of Moderation


Grimmelmann, James (2015) “The Virtues of Moderation,” Yale Journal of Law and Technology: Vol. 17: Iss. 1, Article 2.

p.1: TL;DR—On a Friday in 2005, the Los Angeles Times launched an experiment: a “wikitorial” on the Iraq War that any of the paper’s readers could edit. By Sunday, the experiment had ended in abject failure: vandals overran it with crude profanity and graphic pornography. The wikitorial took its inspiration and its technology from Wikipedia, but missed something essential about how “the free encyclopedia that anyone can edit” staves off abuse while maintaining its core commitment to open participation. The difference is moderation: the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse. Town meetings have moderators, and so do online communities. A community’s moderators can promote posts or hide them, honor posters or shame them, recruit users or ban them. Their decisions influence what is seen, what is valued, what is said. They create the conditions under which cooperation is possible. This Article provides a novel taxonomy of moderation in online communities. It breaks down the basic verbs of moderation—exclusion, pricing, organizing, and norm-setting—and shows how they help communities walk the tightrope between the chaos of too much freedom and the sterility of too much control. Scholars studying the commons can learn from moderation, and so can policy-makers debating the regulation of online communities. – Highlighted Jan 1, 2017

p.4: Anarchy on the Internet is not inevitable. Spaces can and do flourish where people collaborate and where all are welcome. What, then, separates the Wikipedias from the wikitorials? Why do some communities thrive while others become ghost towns? – Highlighted Jan 1, 2017

p.5: A second group, the software and interface designers who are responsible for the technical substrate on which online communities run, works closely with the first (indeed, they are often the same people). Their own professional literature offers a nuanced understanding of how the technical design of a social space influences the interactions that take place there. – Highlighted Jan 1, 2017

p.6: This richer understanding of moderation should be useful to scholars and regulators in two ways. One is theoretical: well-moderated online communities catalyze human cooperation. Studying them can provide insights into the management of common-pool resources and the creation of information goods, two problems moderation must solve simultaneously. Studying online communities is thus like studying fisheries or fan fiction—a way to understand society. The other payoff is practical. Many laws either regulate the activities of online communities or exempt them from regulation. The wisdom of these choices depends on empirical facts about the value and power of moderation. Regulators cannot properly evaluate these laws without paying close attention to how moderation plays out on the ground. – Highlighted Jan 1, 2017

p.9: Second, moderation can increase access to online communities. Openness is partly about efficiency: more members can make the community more productive. But openness also has moral consequences: cutting people off from a community cuts them off from the knowledge the community produces. – Highlighted Jan 1, 2017

p.20: Moderation’s biggest challenge and most important mission is to create strong shared norms among participants. Norms can target every form of strategic behavior. For example, if every author refrains from personal attacks, there is no further personal-attack problem to be solved. Beneficial norms, however, cannot simply be set by fiat. By definition, they are an emergent property of social interactions. Moderators have limited power over group norms. Most of the levers they can pull will only nudge norms in one direction or another, possibly unpredictably. Good norm-setting is a classic example of know-how. – Highlighted Jan 1, 2017

p.21: Moderators can influence norms directly by articulating them. They can do this either in general, with codes of conduct and other broad statements of rules, or in specific cases by praising good behavior and criticizing bad. The difference is the difference between “Don’t post images containing nudity” and “This post has been deleted because it contained nudity.” Note, however, that stating a norm does not automatically promote it. There is empirical evidence that, in some circumstances, expressing a norm about user behavior can induce exactly the opposite response. – Highlighted Jan 1, 2017

p.22: Highlighting good behavior and hiding bad behavior reinforce participants’ sense that good behavior is prevalent while also teaching them what to do. – Highlighted Jan 1, 2017

p.22: As a result, designers frequently worry about how to balance competitive and cooperative impulses. Competition can spur users to individual effort at the cost of social cohesion, and different communities strike the balance differently. – Highlighted Jan 1, 2017

p.22: There are four important distinctions that affect how a type of moderation operates: (1) humans vs. computers, (2) secret vs. transparent, (3) ex ante vs. ex post, and (4) centralized vs. decentralized. These are the “adverbs” of moderation. These four distinctions are independent: any verb of moderation can be applied using any of the sixteen possible combinations. – Highlighted Jan 1, 2017
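
The arithmetic behind the “sixteen possible combinations” is simply 2^4: four independent binary distinctions qualifying a single verb. The minimal Python sketch below enumerates them; the dictionary keys are illustrative shorthand, not the article’s terminology.

    # Sketch (not from the article): enumerate the 16 ways the four "adverbs"
    # of moderation can qualify any one verb (exclusion, pricing, organizing,
    # or norm-setting). Key names are hypothetical shorthand.
    from itertools import product

    ADVERBS = {
        "actor": ("human", "computer"),
        "visibility": ("secret", "transparent"),
        "timing": ("ex ante", "ex post"),
        "locus": ("centralized", "decentralized"),
    }

    combinations = list(product(*ADVERBS.values()))
    assert len(combinations) == 2 ** len(ADVERBS)  # 16 combinations per verb

    for combo in combinations:
        print(", ".join(combo))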

p.29: Just as one size does not fit all forms of moderation, one size does not fit all communities. Communities differ along many axes: the email system has different properties than Wikipedia, which has different properties than the comments section of a blog. Four characteristics of a community are particularly important in affecting the kinds of strategic behavior threatening it and the effectiveness of various types of moderation: (1) the capacity of the infrastructure, (2) the size of the user community, (3) the distribution of ownership, and (4) the identifiability of participants. As with the adverbs above, these characteristics are mostly independent of each other. – Highlighted Jan 1, 2017
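
A similar sketch can make the four community characteristics concrete. The field names and the example values below are illustrative assumptions for the communities the passage contrasts (email, Wikipedia, a blog’s comments section), not data from the article.

    # Sketch of the four community characteristics the passage identifies.
    # Field names and example values are illustrative assumptions only.
    from dataclasses import dataclass

    @dataclass
    class Community:
        infrastructure_capacity: str  # capacity of the underlying infrastructure
        community_size: str           # size of the user community
        ownership: str                # distribution of ownership
        identifiability: str          # how identifiable participants are

    # Hypothetical placements, loosely following the contrast drawn above.
    email = Community("high", "very large", "dispersed", "low")
    wikipedia = Community("high", "large", "concentrated", "pseudonymous")
    blog_comments = Community("modest", "small", "concentrated", "low")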

p.35: All four verbs of moderation can tap into identity. Exclusion absolutely depends on it; without identity, the distinction between “outsiders” and “insiders” collapses. – Highlighted Jan 1, 2017

p.44: Relatedly, transparency is a key aspirational virtue. Because every edit is logged, Wikipedians are expected to explain and, if necessary, defend their actions in sometimes excruciating detail. The process of being given administrator privileges can involve a harrowing examination of one’s editing history, often by other editors with an axe to grind. – Highlighted Jan 1, 2017

p.46: The Los Angeles Times ignored all of this. Like Wikipedia, it was open to the world, but it had none of Wikipedia’s devices for helping the well intentioned collaborate while keeping the ne’er-do-wells at bay. Unlike Wikipedia, the Times had no way to block persistently harmful users—not even a mechanism to track and identify the worst abusers. Unlike Wikipedia, it had no back channel for users to converse and develop community norms or dispute-resolution mechanisms to contain conflict, and the experiment failed long before they could evolve. The Times forced users with strongly divergent beliefs on a controversial topic together, exacerbating normative conflict. It brought them together for a one-off project, with no long-term reputations to recognize trustworthy members of the community. It had no dedicated cadre of administrators cleaning up destructive edits. Vandals who saw the broken windows decided to storm the front door. – Highlighted Jan 1, 2017