Today’s Internet is not the only way to build a network. In the 1990s, the Internet passed unnoticed in mainstream circles while networks were deployed by competing proprietary barons such as AOL, CompuServe, and Prodigy. The technorati placed bets on which baron would prevail over the others, apparently imagining that the proprietary networks would converge in the same way that the separate phone networks, which at one time required differently colored phones on each person’s desk, had converged to just one lucky provider.1 All those bets lost. The proprietary networks went extinct, despite having accumulated millions of subscribers. They were crushed by a network built by government researchers and computer scientists, a network with no CEO, no master business plan, no paying subscribers, no investment in content, and no financial interest in accumulating subscribers.
The framers of the Internet did not design their network with visions of mainstream dominance. Instead, the very unexpectedness of its success was a critical ingredient. The Internet was able to develop quietly and organically for years before it became widely known, remaining outside the notice of those who would have insisted on more cautious strictures had they only suspected how ubiquitous it would become.
This first part of the book traces the battle between the centralized proprietary networks and the Internet, and a corresponding fight between specialized information appliances like smart typewriters and the general-purpose PC, highlighting the qualities that allowed the Internet and PC to win.
Today, the same qualities that led to their successes are causing the Internet and the PC to falter. As ubiquitous as Internet technologies are, the pieces are in place for a wholesale shift away from the original chaotic design that gave rise to the modern information revolution. This counterrevolution would push mainstream users away from a generative Internet that fosters innovation and disruption, toward an appliancized network that incorporates some of the most powerful features of today’s Internet while greatly limiting its innovative capacity and, for better or worse, heightening its regulability. A seductive and more powerful generation of proprietary networks and information appliances is waiting for round two. If the problems associated with the Internet and the PC are not addressed, a set of blunt solutions will likely be applied at the expense of much of what we love about today’s information ecosystem. Understanding this history sheds light on different possible futures and helps us recognize and avoid what might otherwise be very tempting dead ends.
One vital lesson from the past is that the endpoint matters. Too often, a discussion of the Internet and its future stops just short of its endpoints, focusing only on the literal network itself: how many people are connected, whether and how it is filtered, and how fast it carries data.2 These are important questions, but they risk obscuring the reality that people’s experiences with the Internet are shaped at least as much by the devices they use to access it.
As Internet-aware devices proliferate, questions posed about network regulation must also be applied to the endpoints—which, until recently, have been so open and so nonconstricting as to be nearly unnoticeable, and therefore absent from most debates about Internet policy. Yet increasingly the box has come to matter.
History shows that the box had competitors, and today they are back. The early models of commercial (as opposed to academic) computing assumed that the vendor of the machinery would provide most or all of its programming. The PC of the 1980s, the parent of today’s PC, diverged from these models, but that outcome was by no means a foregone conclusion. Internet users are again embracing a range of “tethered appliances,” reflecting a resurgence of the original model of bundled hardware and software created and controlled by a single company. This shift will affect how readily behavior on the Internet can be regulated, which in turn will determine the extent to which regulators and commercial incumbents can constrain the amateur innovation responsible for much of what we now consider precious about the Internet.3
The Internet also had competitors—and they are back. Compared to the Internet, early online information services were built around very different technical and business models. Their designs were much easier to secure against illegal behavior and security threats; the cost was that innovation became much more difficult. The Internet outpaced these services by assuming that every user was contributing a goodwill subsidy: people would not behave destructively even when there were no easy ways to monitor or stop them.
The Internet’s tradeoff of more flexibility for less security worked: most imaginable risks failed to materialize. People did not, for example, routinely spy on one another’s communications, even though doing so was eminently possible, and for years there was no spam and there were no viruses. By looking at where these tradeoffs were made, we will see that the current portfolio of tradeoffs is no longer optimal, and that some of the natural adjustments to that balance, while predictable, are also undesirable.
The fundamental challenges for those who have built and maintained the Internet are to acknowledge crucial deficiencies in a network-and-endpoint structure that has otherwise served so well for so long, to understand our alternatives as the status quo evaporates, and to devise ways to push the system toward a future that addresses the very real problems that are forcing change, while preserving the elements we hold most dear.