The Dutch city of Drachten has undertaken an unusual experiment in traffic management. The roads serving forty-five thousand people are “verkeersbordvrij”: free of nearly all road signs. Drachten is one of several European test sites for a traffic planning approach called “unsafe is safe.”1 The city has removed its traffic signs, parking meters, and even parking spaces. The only rules are that drivers should yield to those on their right at an intersection, and that parked cars blocking others will be towed.

The result so far is counterintuitive: a dramatic improvement in vehicular safety. Without signs to obey mechanically (or, as studies have shown, disobey seventy percent of the time2), people are forced to drive more mindfully—operating their cars with more care and attention to the surrounding circumstances. They communicate more with pedestrians, bicyclists, and other drivers using hand signals and eye contact. They see other drivers rather than other cars. In an article describing the expansion of the experiment to a number of other European cities, including London’s Kensington neighborhood, traffic expert Hans Monderman told Germany’s Der Spiegel, “The many rules strip us of the most important thing: the ability to be considerate. We’re losing our capacity for socially responsible behavior. The greater the number of prescriptions, the more people’s sense of personal responsibility dwindles.”3

Law has long recognized the difference between rules and standards—between very precise boundaries like a speed limit and the much vaguer admonishment characteristic of negligence law that warns individuals simply to “act reasonably.” There are well-known tradeoffs between these approaches.4 Rules are less subject to ambiguity and, if crafted well, inform people exactly what they can do, even if individual situations may render the rule impractical or, worse, dangerous. Standards allow people to tailor their actions to a particular situation. Yet they also rely on the good judgment of often self-interested actors—or on little-constrained second-guessing of a jury or judge that later decrees whether someone’s actions were unreasonable.

A small lesson of the verkeersbordvrij experiment is that standards can work better than rules in unexpected contexts. A larger lesson has to do with the traffic expert’s claim about law and human behavior: the more we are regulated, the more we may choose to hew only and exactly to the regulation or, more precisely, to what we can get away with when the regulation is not perfectly enforced. When we face heavy regulation, we see and shape our behavior more in relation to reward and punishment by an arbitrary external authority than because of a commitment to the kind of world our actions can help bring about.5 This observation is less about the difference between rules and standards than it is about the source of mandates: some may come from a process that a person views as alien, while others arise from a process in which the person takes an active part.

When the certainty of authority-sourced reward and punishment is lessened, we might predict two opposing results. The first is chaos: remove security guards and stores will be looted. The second is basic order maintained, as people choose to respect particular limits in the absence of enforcement. Such acting to reinforce a social fabric may still be due to a form of self-interest—game and norm theorists offer reasons why people help one another in terms that draw on longer-term mutual self-interest6—but it may also be because people have genuinely decided to treat others’ interests as their own.7 This might be because people feel a part of the process that brought about a shared mandate—even if compliance is not rigorously monitored. Honor codes, or students’ pledges not to engage in academically dishonest behavior, can apparently result in lower rates of self-reported cheating.8 Thus, without the traffic sign equivalent of pages of rules and regulations, students who apprentice to generalized codes of honor may be prone to higher levels of honesty in academic work—and benefit from a greater sense of camaraderie grounded in shared values.

More generally, order may remain when people see themselves as a part of a social system, a group of people—more than utter strangers but less than friends—with some overlap in outlook and goals. Whatever counts as a satisfying explanation, we see that sometimes the absence of law has not resulted in the absence of order.9 Under the right circumstances, people will behave charitably toward one another in the comparative absence, or absence of enforcement, of rules that would otherwise compel that charity.

In modern cyberspace, an absence of rules (or at least enforcement) has led both to a generative blossoming and to a new round of challenges at multiple layers. If the Internet and its users experience a crisis of abuse—behaviors that artfully exploit the twin premises of trust and procrastination—it will be tempting to approach such challenges as ones of law and jurisdiction. This rule-and-sanction approach frames the project of cyberlaw by asking how public authorities can find and restrain those they deem to be bad actors online. Answers then look to entry points within networks and endpoints that can facilitate control. As the previous chapter explained, those points will be tethered appliances and software-as-service—functional, fashionable, but non-generative or only contingently generative.10

The “unsafe is safe” experiment highlights a different approach, one potentially as powerful as traditional rule and sanction, without the sacrifice of generativity entailed by the usual means of regulation effected through points of control, such as the appliancization described earlier in this book. When people can come to take the welfare of one another seriously and possess the tools to readily assist and limit each other, even the most precise and well-enforced rule from a traditional public source may be less effective than that uncompelled goodwill. Such an approach reframes the project of cyberlaw to ask: What are the technical tools and social structures that inspire people to act humanely online? How might they be available to help restrain the damage that malevolent outliers can wreak? How can we arrive at credible judgments about what counts as humane and what counts as malevolent? These questions may be particularly helpful to answer while cyberspace is still in its social infancy, its tools for group cohesion immature, and the attitudes of many of its users still in an early phase that treats Internet usage either as a tool to augment existing relationships or as a gateway to an undifferentiated library of information from indifferent sources. Such an atomistic conception of cyberspace naturally produces an environment without the social signaling, cues, and relationships that tend toward moderation in the absence of law.11 This is an outcome at odds with the original architecture of the Internet described in this book, an architecture built on neighborliness and cooperation among strangers occupying disparate network nodes.

The problem raised in the first part of this book underscores this dissonance between origins and current reality at the technical layer: PCs running wild, infected by and contributing to spyware, spam, and viruses because their users either do not know or do not care what they should be installing on their computers. The ubiquity of the PC among mainstream Internet users, and its flexibility that allows it to be reprogrammed at any instant, are both signal benefits and major flaws, just as the genius of the Web—allowing the on-the-fly composition of coherent pages of information from a staggering variety of unvetted sources—is also proving a serious vulnerability. In looking for ways to mitigate these flaws while preserving the benefits of such an open system, we can look to the other layers of the generative Internet which have been plagued with comparable problems, and the progress of their solutions. Some of these resemble verkeersbordvrij: curious experiments with unexpected success that suggest a set of solutions well suited to generative environments, so long as the people otherwise subject to more centralized regulation are willing to help contribute to order without it.

Recall that the Internet exists in layers—physical, protocol, application, content, social. Thanks to the modularity of the Internet’s design, network and software developers can become experts in one layer without having to know much about the others. Some legal academics have even proposed that regulation might be most efficiently tailored to respect the borders of these layers.12

For our purposes, we can examine the layers and analyze the solutions from one layer to provide insight into the problems of another. The pattern of generative success and vulnerability present in the PC and Internet at the technical layer is also visible in one of the more recent and high profile content-layer endeavors on the Internet: Wikipedia, the free online encyclopedia that anyone can edit. It is currently among the top ten most popular Web sites in the world,13 and the story of Wikipedia’s success and subsequent problems—and evolving answers to them—provide clues to solutions for other layers. We need some new approaches. Without them, we face a Hobson’s choice between fear and lockdown.


Evangelists of proprietary networks and the Internet alike have touted access to knowledge and ideas. People have anticipated digital “libraries of Alexandria,” providing the world’s information within a few clicks.14 Because the Internet began with no particular content, this was at first an empty promise. Most knowledge was understood to reside in forms that were packaged and distributed piece by piece, profitable because of a scarcity made possible by physical limitations and the restrictions of copyright. Producers of educational materials, including dictionaries and encyclopedias, were slow to put their wares into digital form. They worried about cannibalizing their existing paper sales—for Encyclopaedia Britannica, $650 million in 1990.15 There was no good way of charging for the small transactions that a lookup of a single word or encyclopedia entry would require, and there were few ways to avoid users’ copying, pasting, and sharing what they found. Eventually Microsoft released the Encarta encyclopedia on CD-ROM in 1993 for just under $1,000, pressuring Britannica to experiment both with a CD-ROM and a subscription-only Web site in 1994.16

As the Internet exploded, the slow-to-change walled garden content of formal encyclopedias was bypassed by a generative proliferation of topical Web pages, and search engines that could pinpoint them. There was no gestalt, though: the top ten results for “Hitler” on Google could include a biography written by amateur historian Philip Gavin as part of his History Place Web site,17 a variety of texts from Holocaust remembrance organizations, and a site about “kitlers,” cats bearing uncanny resemblances to the tyrant.18 This scenario exhibits generativity along the classic Libertarian model: allow individuals the freedom to express themselves and they will, as they choose. We are then free to read the results. The spirit of blogging also falls within this model. If any of the posted material is objectionable or inaccurate, people can either ignore it, request that it be taken down, or find a theory on which to sue over it, perhaps imploring gatekeepers like site hosting companies to remove material that individual authors refuse to revise.

More self-consciously encyclopedic models emerged nearly simultaneously from two rather different sources—one the founder of the dot-org Free Software Foundation, and the other an entrepreneur who had achieved dot-com success in part from the operation of a search engine focused on salacious images.19

Richard Stallman is the first. He believes in a world where software is shared, with its benefits freely available to all, where those who understand the code can modify and adapt it to new purposes, and then share it further. This was the natural environment for Stallman in the 1980s as he worked among graduate students at the Massachusetts Institute of Technology, and it parallels the environment in which the Internet and Web were invented. Stallman holds the same views on sharing other forms of intellectual expression, applying his philosophy across all of the Internet’s layers, and in 1999 he floated the idea of a free encyclopedia drawing from anyone who wanted to submit content, one article at a time. By 2001, some people were ready to give it a shot. Just as Stallman had sought to replace the proprietary Unix operating system with a similarly functioning but free alternative called GNU (“GNU’s Not Unix”), the project was first named “GNUpedia,” then GNE (“GNE’s Not an Encyclopedia”). There would be few restrictions on what those submissions would look like, lest bias be introduced:

Articles are submitted on the following provisions:

  • The article contains no previously copyrighted material (and if an article is consequently found to have offending material, it will then be removed).
  • The article contains no code that will damage the GNE systems or the systems from which users view GNE.
  • The article is not an advert, and has some informative content (persoengl [sic] information pages are not informative!).
  • The article is comprehensible (can be read and understood).20

These provisions made GNE little more than a collective blog sans comments: people would submit articles, and that would be that. Any attempt to enforce quality standards—beyond a skim to see if the article was “informative”—was eschewed. The GNE FAQ explained:

Why don’t you have editors?

There should be no level of “acceptable thought”. This means you have to tolerate being confronted by ideas and opinions different to your own, and for this we offer no apologies. GNE is a resource for spe [sic] speech, and we will strive to keep it that way. Unless some insane country with crazy libel laws tries to stop something, we will always try and fight for your spe [sic] speech, even if we perhaps don’t agree with your article. As such we will not allow any individuals to “edit” articles, thus opening GNE to the possibility of bias.21

As one might predict from its philosophy, at best GNE would be an accumulation of views rather than an encyclopedia—perhaps accounting for the “not” part of “GNE’s Not an Encyclopedia.” Today the GNE Web site is a digital ghost town. GNE was a generative experiment that failed, a place free of all digital traffic signs that never attracted any cars. It was eclipsed by another project that unequivocally aimed to be an encyclopedia, emanating from an unusual source.

Jimbo Wales founded the Bomis search engine and Web site at the onset of the dot-com boom in 1996.22 Bomis helped people find “erotic photography,”23 and earned money through advertising as well as subscription fees for premium content. In 2000, Wales took some of the money from Bomis to support a new idea: a quality encyclopedia free for everyone to access, copy, and alter for other purposes. He called it Nupedia, and it was to be built like other encyclopedias: through the commissioning of articles by experts. Wales hired philosopher Larry Sanger as editor in chief, and about twenty-five articles were completed over the course of three years.24

As the dot-com bubble burst and Bomis’s revenues dropped, Wales sought a way to produce the encyclopedia that involved neither paying people nor enduring a lengthy review process before articles were released to the public. He and his team had been intrigued by the prospect of involving the public at large, at first to draft some articles which could then be subject to Nupedia’s formal editing process, and then to offer “open review” comments to parallel a more elite peer review.25 Recollections are conflicted, but at some point software consultant Ward Cunningham’s wiki software was introduced to create a simple platform for contributing and making edits to others’ contributions. In January 2001, Wikipedia was announced to run alongside Nupedia and perhaps feed articles into it after review. Yet Nupedia was quickly eclipsed by its easily modifiable counterpart. Fragments of Nupedia exist online as of this writing, a fascinating time capsule.26 Wikipedia became an entity unto itself.27

Wikipedia began with three key attributes. The first was verkeersbordvrij. Not only were there few rules at first—the earliest ones merely emphasized the idea of maintaining a “neutral point of view” in Wikipedia’s contents, along with a commitment to eliminate materials that infringe copyright and an injunction to ignore any rules if they got in the way of building a great encyclopedia—but there were also no gatekeepers. The way the wiki software worked, anyone, registered or unregistered, could author or edit a page at any time, and those edits appeared instantaneously. This of course means that disaster could strike at any moment—someone could mistakenly or maliciously edit a page to say something wrong, offensive, or nonsensical. However, the wiki software made the price of a mistake low, because it automatically kept track of every single edit made to a page in sequence, and one could look back at the page in time-lapse to see how it appeared before each successive edit. If someone should take a carefully crafted article about Hitler and replace it with “Kilroy was here,” anyone else could come along later and revert the page with a few clicks to the way it was before the vandalism, reinstating the previous version. This is a far cry from the elements of perfect enforcement: there are few lines between enforcers and citizens; reaction to abuse is not instantaneous; and missteps generally remain recorded in a page history for later visitors to see if they are curious.
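The mechanics that make mistakes cheap are simple enough to sketch. The following is a hypothetical, minimal model of a wiki page with full revision history and one-step reversion—an illustration of the principle, not MediaWiki’s actual data model (class and method names are invented for the example):

```python
# Minimal sketch of a wiki page that keeps every revision and
# lets anyone restore an earlier one. Illustrative only; not how
# MediaWiki is actually implemented.

class WikiPage:
    def __init__(self, title, text=""):
        self.title = title
        self.history = [text]  # every version is kept, in order

    def current(self):
        return self.history[-1]

    def edit(self, new_text):
        # Edits append a new revision; nothing is ever destroyed.
        self.history.append(new_text)

    def revert(self, steps=1):
        # Restore an earlier version by appending it as a new revision,
        # so even the revert itself is recorded in the page history.
        target = self.history[-(steps + 1)]
        self.history.append(target)

page = WikiPage("Hitler", "Carefully crafted article text.")
page.edit("Kilroy was here")   # vandalism
page.revert()                  # anyone can undo it with a click
print(page.current())          # back to the crafted text
print(len(page.history))       # 3 revisions: the misstep stays visible
```

Because the vandalized revision remains in the history, the design matches the chapter’s point: reaction to abuse need not be instantaneous, and the record of the misstep survives for curious later visitors.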

The second distinguishing attribute of Wikipedia was the provision of a discussion page alongside every main page. This allowed people to explain and justify their changes, and anyone disagreeing and changing something back could explain as well. Controversial changes made without any corresponding explanation on the discussion page could be reverted by others without having to rely on a judgment on the merits—instead, the absence of explanation for something non-self-explanatory could be reason enough to be skeptical of it. Debate was sure to arise on a system that accumulated everyone’s ideas on a subject in one article (rather than, say, having multiple articles written on the same subject, each from a different point of view, as GNE would have done). The discussion page provided a channel for such debate and helped new users of Wikipedia make a transition from simply reading its entries to making changes and to understanding that there was a group of people interested in the page on which changes were made, who could be engaged in conversation before, during, and after editing the page.

The third crucial attribute of Wikipedia was a core of initial editors, many drawn from Nupedia, who shared a common ethos and some substantive expertise. In these early days, Wikipedia was a backwater; few knew of it, and rarely would a Wikipedia entry be among the top hits of a Google search.

Like the development of the Internet’s architecture, then, Wikipedia’s original design was simultaneously ambitious in scope and modest in execution, devoted to making something work without worrying about every problem that could come up if its extraordinary flexibility were abused. It embodied principles of trust-your-neighbor and procrastination, as well as “Postel’s Law,” a rule of thumb written by one of the Internet’s founders to describe a philosophy of Internet protocol development: “[B]e conservative in what you do; be liberal in what you accept from others.”28

Wikipedia’s initial developers shared the same goals and attitudes about the project, and they focused on getting articles written and developed instead of deciding who was or was not qualified or authorized to build on the wiki. These norms of behavior were learned by new users from the old ones through informal apprenticeships as they edited articles together.

The absence of rules was not itself a matter of principle; this was not GNE. The procrastination principle suggests waiting for problems to arise before solving them. It does not eschew solutions entirely. There would be maximum openness until there was a problem, and then the problem would be tackled. Wikipedia’s rules would be developed on the wiki like a student-written and student-edited honor code. They were made publicly accessible and editable, in a separate area from that of the substantive encyclopedia.29 Try suddenly to edit an existing rule or add a new one and it will be reverted to its original state unless enough people are convinced that a change is called for. Most of the rules are substance-independent: they can be appealed to and argued about wholly apart from whatever argument might be going on about, say, how to characterize Hitler’s childhood in his biographical article.

From these beginnings there have been some tweaks to the wiki software behind Wikipedia, and a number of new rules as the enterprise has expanded and problems have arisen in part because of Wikipedia’s notoriety. For example, as Wikipedia grew it began to attract editors who had never crossed paths before, and who disagreed on the articles that they were simultaneously editing. One person would say that Scientology was a “cult,” the other would change that to “religion,” and the first would revert it back again. Should such an “edit war” be settled by whoever has the stamina to make the last edit? Wikipedia’s culture says no, and its users have developed the “three-revert rule.”30 An editor should not undo someone else’s edits to an article more than three times in one day. Disagreements can then be put to informal or formal mediation, where another Wikipedian, or other editors working on that particular article, can offer their views as to which version is more accurate—or whether the article, in the interest of maintaining a neutral point of view, should acknowledge that there is controversy about the issue.
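Though the three-revert rule is enforced by people rather than code, its logic is mechanical enough to sketch. The function below is a hypothetical illustration (the function name and the rolling 24-hour window framing are assumptions for the example; Wikipedia’s actual enforcement runs through human administrators):

```python
from datetime import datetime, timedelta

THREE_REVERT_LIMIT = 3
WINDOW = timedelta(hours=24)

def violates_3rr(revert_times, now):
    """Return True if one more revert now would exceed three reverts
    to the same article within a rolling 24-hour window.

    revert_times: datetimes of this editor's prior reverts to the article.
    """
    recent = [t for t in revert_times if now - t <= WINDOW]
    return len(recent) >= THREE_REVERT_LIMIT

now = datetime(2007, 1, 2, 12, 0)
reverts = [now - timedelta(hours=h) for h in (1, 5, 23)]  # three in 24 hours
print(violates_3rr(reverts, now))        # a fourth revert would break the rule
print(violates_3rr(reverts[:2], now))    # two reverts leaves room for one more
```

The point of the rule is not the arithmetic but what it forces: once an editor hits the limit, the dispute must move to the discussion page or to mediation rather than being settled by stamina.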

For articles prone to vandalism—the entry for President George W. Bush, for example, or the front page of Wikipedia—administrators can create locks to ensure that unregistered or recently registered users may not make changes. Such locks are seen as necessary and temporary evils, and any administrator can choose to lift a lock at his or her discretion.31

How does an editor become an administrator with such powers? By making lots of edits and then applying for an administratorship. Wikipedians called “bureaucrats” have authority to promote editors to administrator status—or demote them. And to whom do the bureaucrats answer? Ultimately, to an elected arbitration committee, the board of Wikipedia’s parent Wikimedia Foundation, or to Jimbo Wales himself. (There are currently only a handful of bureaucrats, and they are appointed by other bureaucrats.)

Administrators can also prevent particular users from editing Wikipedia. Such blocks are rare and usually temporary. Persistent vandals usually get four warnings before any action is taken. The warnings are couched in a way that presumes—often against the weight of the evidence—that the vandals are acting in good faith, experimenting with editing capabilities on live pages when they should be practicing on test articles created for that purpose. Other transgressions include deleting others’ comments on the discussion page—since the discussion page is a wiki page, it can be edited in free form, making it possible to eliminate rather than answer someone else’s argument. Threatening legal action against a fellow Wikipedian is also grounds for a block.32

Blocks can be placed against individual user accounts, if people have registered, or against a particular IP address, for those who have not registered. IP addresses associated with anonymizing networks such as Tor are not allowed to edit Wikipedia at all.33

Along with sticks there are carrots, offered bottom-up rather than top-down. Each registered Wikipedia user is automatically granted a space for an individual user page, and a corresponding page for discussion with other Wikipedians, a free-form drop box for comments or questions. If a user is deemed helpful, a practice has evolved of awarding “barnstars”—literally an image of a star. To award a barnstar, named after the metal stars used to decorate German barns,34 is simply to edit that user’s page to include a picture of the star and a note of thanks.35 Could a user simply award herself a pile of barnstars the way a megalomaniacal dictator can adorn himself with military ribbons? Yes, but that would defeat the point—and would require a bit of prohibited “sock puppetry,” as the user would need to create alter identities so the page’s edit history would show that the stars came from someone appearing to be other than the user herself.

* * *

Wikipedia has charted a path from crazy idea to stunning worldwide success. There are versions of Wikipedia in every major language—including one in simplified English for those who do not speak English fluently—and Wikipedia articles are now often among the top search engine hits for the topics they cover. The English language version surpassed one million articles in March of 2006, and it reached the two million mark the following September.36

Quality varies greatly. Articles on familiar topics can be highly informative, while more obscure ones are often uneven. Controversial topics like abortion and the Arab-Israeli conflict often boast thorough and highly developed articles. Perhaps this reflects Eric Raymond’s observation about the collaborative development of free software: “[g]iven enough eyeballs, all bugs are shallow.”37 To be sure, Raymond himself does not claim that the maxim he coined works beyond software, where code either objectively runs or it doesn’t. He has said that he thinks Wikipedia is “infested with moonbats”: “The more you look at what some of the Wikipedia contributors have done, the better Britannica looks.”38 Still, a controversial study by Nature in 2005 systematically compared a set of scientific entries from Wikipedia and Britannica (including some from the Britannica Web edition), and found a similar rate of error between them.39 For timeliness, Wikipedia wins hands-down: articles near-instantly appear about breaking events of note. For any given error that is pointed out, it can be corrected on Wikipedia in a heartbeat. Indeed, Wikipedia’s toughest critics can become Wikipedians simply by correcting errors as they find them, at least if they maintain the belief, not yet proven unreasonable, that successive changes to an article tend to improve it, so fixing an error will not be futile as others edit it later.


As we have seen, when the Internet and PC moved from backwater to mainstream, their success set the stage for a new round of problems. E-mail is no longer a curiosity but a necessity for most,40 and the prospect of cheaply reaching so many recipients has led to the scourge of spam, now said to account for over 90 percent of all e-mail.41 The value of the idle processing power of millions of Internet-connected PCs makes it worthwhile to hijack them, providing a new, powerful rationale for the creation of viruses and worms.42

Wikipedia’s generativity at the content level—soliciting uncoordinated contribution from tens of thousands of people—provides the basis for similar vulnerabilities now that it is so successful. It has weathered the most obvious perils well. Vandals might be annoying, but they are kept in check with a critical mass of Wikipedians who keep an eye on articles and quickly revert those that are mangled. Some Wikipedians even appear to enjoy this duty, declaring membership in the informal Counter-Vandalism Unit and, if dealing with vandalism tied to fraud, perhaps earning the Defender of the Wiki Barnstar.43 Still others have written scripts that detect the most obvious cases of vandalism and automatically fix them.44 And there remains the option of locking those pages that consistently attract trouble from edits by new or anonymous users.
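The vandalism-fixing scripts mentioned above work from crude textual heuristics. The sketch below is a hypothetical illustration of the approach (the specific signals, thresholds, and function name are assumptions for the example; real anti-vandalism bots use far richer evidence, such as edit-size deltas and editor reputation):

```python
import re

# Crude, illustrative signals that an edit may be vandalism.
GRAFFITI = re.compile(r"kilroy was here|!!!+", re.IGNORECASE)

def looks_like_vandalism(old_text, new_text):
    # Blanking most of a page is suspicious.
    if len(new_text) < 0.1 * len(old_text):
        return True
    # Classic graffiti strings.
    if GRAFFITI.search(new_text):
        return True
    # Long runs of a single repeated character ("aaaaaaaaaa").
    if re.search(r"(.)\1{9,}", new_text):
        return True
    return False

article = "A carefully sourced biography running to several paragraphs. " * 5
print(looks_like_vandalism(article, "lol"))                  # flags blanking
print(looks_like_vandalism(article, article + "One fix."))   # passes a small edit
```

An edit that trips a signal can then be reverted automatically using the page history, while borderline cases are left for human Wikipedians, whose judgment the bots only supplement.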

But just as there is a clearer means of dealing with the threat of outright malicious viruses to PCs than there is to more gray-zone “badware,” vandals are the easy case for Wikipedia. The well-known controversy surrounding John Seigenthaler, Sr., a retired newspaper publisher and aide to Robert F. Kennedy, scratches the surface of the problem. There, a prankster had made an edit to the Wikipedia article about Seigenthaler suggesting that it had once been thought that he had been involved in the assassinations of John F. Kennedy and RFK.45 The statement was false but not obvious vandalism. The article sat unchanged for four months until a friend alerted Seigenthaler to it, replacing the entry with his official biography, which was then replaced with a short paraphrase as part of a policy to avoid copyright infringement claims.46 When Seigenthaler contacted Jimbo Wales about the issue, Wales ordered an administrator to delete Wikipedia’s record of the original edit.47 Seigenthaler then wrote an op-ed in USA Today decrying the libelous nature of the previous version of his Wikipedia article and the idea that the law would not require Wikipedia to take responsibility for what an anonymous editor wrote.48

Wikipedians have since agreed that biographies of living persons are especially sensitive, and editors are encouraged to highlight unsourced or potentially libelous statements for quick review by other Wikipedians. Jimbo and a handful of other Wikipedia officials reserve the right not only to have an article edited—something anyone can do—but to change its edit history so the fact that it ever said a particular thing about someone will no longer be known to the general public, as was done with the libelous portion of the Seigenthaler article. Such practice is carried out not under legal requirements—in the United States, federal law protects information aggregators from liability for defamatory statements made by independent information providers from which they draw49—but as an ethical commitment.

Still, the reason that Seigenthaler’s entry went uncorrected for so long is likely that few people took notice of it. Until his op-ed appeared, he was not a national public figure, and Jimbo himself attributed the oversight to an increasing pace of article creation and edits—overwhelming the Wikipedians who have made a habit of keeping an eye on changes to articles. In response to the Seigenthaler incident, Wikipedia has altered its wiki software so that unregistered users cannot create new articles, but can only edit existing ones.50 (Of course, anyone can still register.)

This change takes care of casual or heat-of-the-moment vandalism, but it does little to address a new category of Wikipedian somewhere between committed community member and momentarily vandalizing teenager, one that creates tougher problems. This Wikipedian is someone who cares little about the social act of working with others to create an encyclopedia, but instead cares about the content of a particular Wikipedia entry. Now that a significant number of people consult Wikipedia as a resource, many of whom come to the site from search engine queries, Wikipedia’s contents have effects far beyond the site’s own community of user-editors.

One of Wikipedia’s community-developed standards is that individuals should not create or edit articles about themselves, nor prompt friends to do so. Instead they are to lobby on the article’s discussion page for other editors to make corrections or amplifications. (Jimbo himself has expressed regret for editing his own entry in Wikipedia in violation of this policy.)51 What about companies, or political aides? When a number of edits were made to politicians’ Wikipedia entries by Internet Protocol addresses traceable to Capitol Hill, Wikipedians publicized the incidents and tried to shame the politicians in question into denouncing the grooming of their entries.52 In some cases it has worked. After Congressman Marty Meehan’s chief of staff edited his entry to omit mention of a broken campaign promise to serve a limited number of terms, and subsequently replaced the text of the entire article with his official biography, Meehan repudiated the changes. He published a statement saying that it was a waste of time and energy for his staff to have made the edits (“[t]hough the actual time spent on this issue amounted to 11 minutes”) because “part of being an elected official is to be regularly commented on, praised, and criticized on the Web.”53 Meehan’s response sidestepped the issue of whether and how politicians ought to respond to material about them that they believe to be false or misleading. Surely, if the New York Times published a story that he thought was damaging, he would want to write a letter to the editor to set the record straight.

If the Wikipedia entry on Wal-Mart is one of the first hits in a search for the store, it will be important to Wal-Mart to make sure the entry is fair—or even more than fair, omitting true and relevant facts that nonetheless reflect poorly on the company. What can a group of volunteers do if a company or politician is implacably committed to editing an entry? The answer so far has been to muddle along, assuming the best intentions of all editors and hoping that there is epistemic strength in numbers.54 If disinterested but competent editors outnumber shills, the shills will find their edits reverted or honed, and if the shills persist, they can be halted by the three-revert rule.

In August 2006, a company called MyWikiBiz was launched to help people and companies promote themselves and shape their reputations on Wikipedia. “If your company or organization already has a well-designed, accurately-written article on Wikipedia, then congratulations—our services are not for you. However, if your business is lacking a well-written article on Wikipedia, read on—we’re here to help you!”55 MyWikiBiz offers to create a basic Wikipedia stub of three to five sentences about a company, with some links, for $49. A “standard article” fetches $79, with a premium service ($99) that includes checking the client’s Wikipedia article after a year to see “if further changes are needed.”56

Wikipedia’s reaction to MyWikiBiz was swift. Jimbo himself blocked the firm’s Wikipedia account on the basis of “paid editing on behalf of customers.”57 The indefinite block was one of only a handful recorded by Jimbo in Wikipedia’s history. Wales talked to the firm on the phone the same day and reported that they had come to an accommodation. Identifying the problem as a conflict of interest and appearance of impropriety arising from editors being paid to write by the subjects of the articles, Wales said that MyWikiBiz had agreed to post well-sourced “neutral point of view” articles about its clients on its own Web site, which regular Wikipedians could then choose to incorporate or not as they pleased into Wikipedia.58 Other Wikipedians disagreed with such a conservative outcome, believing that good content was good content, regardless of source, and that it should be judged on its merits, without a per se rule prohibiting direct entry by a for-profit firm like MyWikiBiz.

The accommodation was short-lived. Articles submitted or sourced by MyWikiBiz were nominated for deletion—itself a process that entails a discussion among any interested Wikipedians and then a judgment by any administrator about whether that discussion reached consensus on a deletion. MyWikiBiz participated wholeheartedly in those discussions and appealed to the earlier “Jimbo Concordat,” persuading some Wikipedians to remove their per se objections to an article because of its source. Wales himself participated in one of the discussions, saying that his prior agreement had been misrepresented and, after telling MyWikiBiz that it was on thin ice, once again banned it for what he viewed as spamming Wikipedia with corporate advertisements rather than “neutral point of view” articles.

As a result, MyWikiBiz has gone into “hibernation,” according to its founder, who maintains that all sources, even commercial ones, should be able to play a role in contributing to Wikipedia, especially since the sources for most articles and edits are not personally identifiable, even if they are submitted under the persistent pseudonyms that are Wikipedia user identities. Rules have evolved concerning those identities, too. In 2007, Wikipedia user Essjay, the administrator who cleaned Seigenthaler’s defamatory edit logs, was found to have misrepresented his credentials. Essjay had claimed to hold various graduate degrees along with a professorship in theology, and had contributed to many Wikipedia articles on the subject. When Jimbo Wales contacted him to discuss a job opportunity at Wales’s for-profit company Wikia, Essjay’s real identity was revealed. In fact, he was a twenty-four-year-old editor with no graduate degrees. His previous edits—and corresponding discussions in which he invoked his credentials—were called into question. In response to the controversy, and after a request for comments from the Wikipedia community,59 Jimbo proposed a rule whereby the credentials of those Wikipedia administrators who chose to assert them would be verified.60 Essjay retired from Wikipedia.61

* * *

A constitutional lawyer might review these tales of Wikipedia and see a mess of process that leads to a mess of substance: anonymous and ever-shifting users; a God-king who may or may not be able to act unilaterally;62 a set of rules now large enough to be confusing and ambiguous but small enough to fail to reach most challenges. And Wikipedia is decidedly not a democracy: consensus is favored over voting and its head counts. Much the same could be said about the development process for the Internet’s fundamental technical protocols, which is equally porous.63 The Internet Engineering Task Force (IETF) has no “members”; anyone can participate. But it has also seen a proliferation of standards and norms designed to channel arguments to productive resolution, along with venerated people in unelected positions of respect and authority who could, within broad boundaries, affect the path of Internet standards.64 As the Internet succeeded, the IETF’s standards and norms were tested by outsiders who did not share them. Corporate interests became keenly interested in protocol development, and they generally responded to their own pecuniary incentives rather than to arguments based on engineering efficiency. The IETF avoided the brunt of these problems because its standards are not self-enforcing: firms that build network hardware, or for-profit Internet Service Providers, ultimately decide how to make their routers behave, and IETF endorsement of one standard or another, while helpful, is no longer crucial. With Wikipedia, by contrast, decisions made by editors and administrators can affect real-world reputations, since the articles are live and highly visible via search engines; firms do not individually choose to “adopt” Wikipedia the way they adopt Internet standards.

Yet Wikipedia’s awkward and clumsy growth in articles, and in the rules governing their creation and editing, is so far a success story. It is in its essence a work in progress, one whose success is defined by the survival—even growth—of a core of editors who subscribe to and enforce its ethos, amid an influx of users who know nothing of that ethos. Wikipedia’s success, such as it is, is attributable to a messy combination of constantly updated technical tools and social conventions that elicit and reflect personal commitments from a critical mass of editors to engage in argument and debate about topics they care about. Together these tools and conventions facilitate a notion of “netizenship”: belonging to an Internet project that includes other people, rather than relating to the Internet as a deterministic information location and transmission tool or as a cash-and-carry service offered by a separate vendor responsible for its content.


We live under the rule of law when people are treated equally, without regard to their power or station; when the rules that apply to them arise legitimately from the consent of the governed; when those rules are clearly stated; and when there is a source of dispassionate, independent application of those rules.65

Despite the apparent mess of process and users, by these standards Wikipedia has charted a remarkable course. Although different users have different levels of capabilities, anyone can register, and anyone, if dedicated enough, can rise to the status of administrator. And while Jimbo Wales may have extraordinary influence, his power on Wikipedia depends in large measure on the consent of the governed—on the individual decisions of hundreds of administrators, any of whom can gainsay each other or him, but who tend to work together because of a shared vision for Wikipedia. The effective implementation of policy in turn rests on the thousands of active editors who may exert power in the shape of the tens of thousands of decisions they make as Wikipedia’s articles are edited and reedited. Behaviors that rise to the level of consistent practice are ultimately described and codified as potential policies, and some are then affirmed as operative ones, in a process that is itself constantly subject to revision.

In one extraordinary chat room conversation of Wikipedians recorded online, Wales himself laments that Larry Sanger is billed in several Wikipedia articles about Wikipedia as a “co-founder” of the encyclopedia. But apart from a few instances that he has since publicly regretted, Wales has not edited the articles himself, nor does he directly instruct others to change them with specific text, since that would violate the rule against editing articles about oneself. Instead, he makes a case that an unremarked use of the co-founder label is inaccurate, and implores people to consider how to improve it.66 At times—they are constantly in flux—Wikipedia’s articles about Wikipedia note that there is controversy over the “co-founder” label for Sanger. In another example of the limits of direct power, then-Wikimedia Foundation board member Angela Beesley fought to have the Wikipedia entry about her deleted. She was rebuffed, with administrators concluding that she was newsworthy enough to warrant one.67 (She tried again after resigning from the Foundation board, to no avail.)68

* * *

Wikipedia—with the cooperation of many Wikipedians—has developed a system of self-governance that has many indicia of the rule of law without heavy reliance on outside authority or boundary. To be sure, while outside regulation is not courted, Wikipedia’s policy on copyright infringement exhibits a desire to integrate with the law rather than reject it. Indeed, its copyright policy is much stricter than the laws of major jurisdictions require. In the United States, Wikipedia could wait for formal notifications of specific infringement before taking action to remove copyrighted material.69 And despite the fact that Wales himself is a fan of Ayn Rand70—whose philosophy of “objectivism” closely aligns with libertarian ideals, a triumph of the individual over the group—Wikipedia is a consummately communitarian enterprise.71 The activity of building and editing the encyclopedia is done in groups, though the structure of the wiki allows for large groups to naturally break up into manageable units most of the time: a nano-community coalesces around each article, often from five to twenty people at a time, augmented by non-subject-specific roving editors who enjoy generic tasks like line editing or categorizing articles. (Sometimes articles on roughly the same subject can develop independently, at which point there is a negotiation between the two sets of editors on whether and how to merge them.)

This structure is a natural form of what constitutionalists would call subsidiarity: centralized, “higher” forms of dispute resolution are reserved for special cases, while day-to-day work and decisions are undertaken in small, “local” groups.72 Decisions are made by those closest to the issues, preventing the lengthy, top-down processes of hierarchical systems. This subsidiarity is also expressed through the major groupings drawn according to language. Each different language version of Wikipedia forms its own policies, enforcement schemes, and norms. Sometimes these can track national or cultural standards—as a matter of course people from Poland primarily edit the Polish version of Wikipedia—but at other times they cross such boundaries. The Chinese language Wikipedia serves mainland China (when it is not being blocked by the government, which it frequently is),73 Hong Kong, Taiwan, and the many Chinese speakers scattered around the world.74

When disputes come up, consensus is sought before formality, and the lines between subject and regulator are thin. While not everyone has the powers of an administrator, the use of those special powers is reserved for persistent abuse rather than daily enforcement. It is the editors—that is, those who choose to participate—whose decisions and work collectively add up to an encyclopedia—or not. And most—at least prior to an invasion of political aides, PR firms, and other true cultural foreigners—subscribe to the notion that there is a divide between substance and process, and that there can be an appeal to content-independent rules on which meta-agreement can be reached, even as editors continue to dispute a fact or portrayal in a given article.

This is the essence of law: something larger than an arbitrary exercise of force, and something with meaning apart from a pretext for that force, one couched in neutral terms only for the purpose of social acceptability. It has been rediscovered among people who often profess little respect for their own sovereigns’ “real” law, which they follow not out of civic agreement or pride but out of a cynical balancing of the penalties for being caught against the benefits of breaking it. Indeed, the idea that a “neutral point of view” even exists, and that it can be determined among people who disagree, is an amazingly quaint, perhaps even naïve, notion. Yet it is invoked earnestly and often productively on Wikipedia. Recall the traffic engineer’s observation about road signs and human behavior: “The greater the number of prescriptions, the more people’s sense of personal responsibility dwindles.”75 Wikipedia shows, if perhaps only for a fleeting moment under particularly fortuitous circumstances, that the inverse is also true: the fewer the prescriptions, the more people’s sense of personal responsibility escalates.

Wikipedia shows us that the naïveté of the Internet’s engineers in building generative network technology can be justified not just at the technical layer of the Internet, but at the content layer as well. The idiosyncratic system that has produced running code among talented (and some not-so-talented) engineers has been replicated among writers and artists.

There is a final safety valve to Wikipedia that encourages good-faith contribution and serves as a check on abuses of the power that accretes among administrators and bureaucrats there: Wikipedia’s content is licensed so that anyone may copy and edit it, so long as attribution of its source is given and it is further shared under the same terms.76 This permits Wikipedia’s content to be sold or used in a commercial manner, so long as it is not proprietized—those who make use of Wikipedia’s content cannot claim copyright over works that follow from it. Thus dot-com Web sites mirror all of Wikipedia’s content and also display banner ads to make money, something Jimbo Wales has vowed never to do with Wikipedia.77 (A list maintained on Wikipedia shows dozens of such mirrors.)78 Mirrors can lead to problems for people like John Seigenthaler, who have to strive to correct misrepresentations not only in the original article on Wikipedia, but in any mirrors as well. But Wikipedia’s free content license has the benefit of allowing members of the Wikipedia community an option to exit—and to take a copy of the encyclopedia with them. It also allows for generative experimentation and growth. For example, third parties can come up with ways of identifying accurate articles on Wikipedia and then compile them as a more authoritative or vetted subset of the constant work-in-progress that the site represents.

Larry Sanger, the original editor of Nupedia and organizer (and, according to some, co-founder) of Wikipedia, has done just that. He has started “Citizendium,” an attempt to combine some of Nupedia’s original use of experts with Wikipedia’s appeal to the public at large. Citizendium seeks to fork Wikipedia, and solicit volunteers who agree not to be anonymous, so that their edits may be credited more readily, and their behavior made more accountable. If Citizendium draws enough people and content, links to it from other Web sites will follow, and, given enough links, its entries could appear as highly ranked search results. Wikipedia’s dominance has a certain measure of inertia to it, but the generative possibilities of its content, guaranteed by its choice of a permissive license, allow a further check on its prominence.

Wikipedia shows us a model for interpersonal interaction that goes beyond the scripts of customer and business. The discussions that take place adjunct to editing can be brusque, but the behavior that earns the most barnstars is directness, intelligence, and good faith. An owner of a company can be completely bemused that, in order to correct (and have stay corrected) what he sees as inaccuracies in an article about his firm, he will have to discuss the issues with random members of the public. Steve Scherf, co-founder of dot-com Gracenote, ended up engaged in an earnest, lengthy exchange with someone known as “Fatandhappy” about the way his company’s history was portrayed.79 The exchange was heated and clearly frustrating for Scherf, but after another Wikipedian intervened to make edits, Scherf pronounced himself happy if not thrilled with the revised text. These conversations are possible, and they are still the norm at Wikipedia.

The elements of Wikipedia that have led to its success can help us come to solutions for problems besetting generative successes at other layers of the Internet. They are verkeersbordvrij, a light regulatory touch coupled with an openness to flexible public involvement, including a way for members of the public to make changes, good or bad, with immediate effect; a focus on earnest discussion, including reference to neutral dispute resolution policies, as a means of being strengthened rather than riven by disagreements; and a core of people prepared to model an ethos that others can follow. With any of these pieces missing Wikipedia would likely not have worked. Dot-coms that have rushed in to adopt wikis as the latest cool technology have found mixed results. Microsoft’s Encarta Web site, in a naked concession to the popularity of Wikipedia, now has an empty box at the bottom of each article where users are asked to enter comments or corrections, which will be forwarded to the Encarta staff for review. Users receive no further feedback.

Makers of cars and soap have run contests80 for the public to make advertisements based on stock footage found in their respective commercials, complete with online editing tools so that amateurs can easily put their commercials together. Dove ran the winner of its contest during the Super Bowl.81 Many commercial Web sites like Amazon solicit customer reviews of products as a way to earn credibility with other customers—and some, like, have business models premised entirely on the reviews themselves. asks for such ratings while also organizing its users into geographically based groups and giving them the basic tools of social networking: an ability to praise each other for good reviews, to name fellow reviewers as friends, and to discuss and comment on each other’s views. As one Yelp participant put it in reviewing the very Yelp “elite status” that she had just earned for contributing so many well-regarded reviews, “[It m]akes you feel special for about two weeks. Then you either realize you’re working for someone else without getting paid, you totally lose interest, or you get really into it.”82

Such “user-generated content,” whether cultivated through fully grassroots-motivated dot-org enterprises or well-constructed dot-com ones, forms part of a new hybrid economy now studied by Lessig, Benkler, von Hippel, and others. These public solicitations to manipulate corporate and cultural symbols, pitched at varying levels of expertise, may prove to be further building blocks of “semiotic democracy,” where we can participate in the making and remaking of cultural meanings instead of having them foisted upon us.83

But Wikipedia stands for more than the ability of people to craft their own knowledge and culture. It stands for the idea that people of diverse backgrounds can work together on a common project with, whatever its other weaknesses, a noble aim—bringing such knowledge to the world. Jimbo Wales has said that the open development model of Wikipedia is only a means to that end—recall that he started with the far more restrictive Nupedia development model. And we see that Wikipedia rejects straightforward democracy, favoring discussion and consensus over outright voting, thereby sidestepping the kinds of ballot-stuffing that can take place in a digital environment, whether because one person adopts multiple identities or because a person can simply ask friends to stack a sparsely attended vote.

Instead, Wikipedia has since come to stand for the idea that involvement of people in the information they read—whether to fix a typographical error or to join a debate over its veracity or completeness—is an important end in itself, one made possible by the recursive generativity of a network that welcomes new outposts without gatekeepers; of software that can be created and deployed at those outposts; and of an ethos that welcomes new ideas without gatekeepers, one that asks the people bearing those ideas to argue for and substantiate them to those who question.

There are plenty of online services whose choices can affect our lives. For example, Google’s choices about how to rank and calculate its search results can determine which ideas have prominence and which do not. That is one reason why Google’s agreement to censor its own search results for the Chinese version of Google has attracted so much disapprobation.84 But even those who are most critical of Google’s actions appear to wish to pressure the company through standard channels: moral suasion, shareholder resolutions, government regulation compelling noncensorship, or a boycott to inflict financial pressure. Unlike Wikipedia, no one thinks that Google ought to be “governed” by its users in some democratic or communitarian way, even as it draws upon the wisdom of the crowds in deciding upon its rankings,85 basing them in part on the ways in which millions of individual Web sites have decided to whom to link. Amazon and Yelp welcome user reviews (and reviews of those reviews), but the public at large does not “govern” these institutions.

People instinctively expect more of Wikipedia. They see it as a shared resource and a public one, even though it is not an arm of any territorial sovereign. The same could be said of the Internet Engineering Task Force and the Internet itself, but Wikipedia appears to have further found a way to involve nontechnical people in its governance. Every time someone reads a Wikipedia article and knowingly chooses not to vandalize it, he or she has an opportunity to identify with and reinforce its ethos. Wales is setting his sights next on a search engine built and governed on this model, “free and transparent” about its rankings, with a “huge degree of human community oversight.”86 The next chapters explore how that ethos may be replicable: vertically to solve generative problems found at other layers of the Internet, and horizontally to other applications within the content and social layers.

If Wikipedia did not exist there would still be reason to cheer the generative possibilities of the Internet, its capacity to bring people together in meaningful conversations, commerce, or action. There are leading examples of each—the community of commentary and critique that has evolved around blogging, the user-driven reputation system within eBay, the “civil society” type of gatherings fostered by Meetup, or the social pressure–induced promises via Pledgebank, each drawing on the power of individuals contributing to community-driven goals. But Wikipedia is the canonical bee that flies despite scientists’ skepticism that the aerodynamics add up.87 These examples will grow, transform, or fade over time, and their futures may depend not just on the public’s appetites and attention, but on the technical substrate that holds them all: the powerful but delicate generative Internet and PC, themselves vaulted unexpectedly into the mainstream because of amateur contribution and cooperation. We now explore how the lessons of Wikipedia, both its successes and shortcomings, shed light on how to maintain our technologies’ generativity in the face of the problems arising from their widespread adoption.

Posted by The Editors on March 1, 2008