Today’s Internet is not the only way to build a network. In the 1990s, the Internet passed unnoticed in mainstream circles while networks were deployed by competing proprietary barons such as AOL, CompuServe, and Prodigy. The technorati placed bets on which baron would prevail over the others, apparently imagining that the proprietary networks would develop in the same way that the separate phone networks—at one time requiring differently colored phones on each person’s desk—had converged to just one lucky provider.1 All those bets lost. The proprietary networks went extinct, despite having accumulated millions of subscribers. They were crushed by a network built by government researchers and computer scientists who had no CEO, no master business plan, no paying subscribers, no investment in content, and no financial interest in accumulating subscribers.

The framers of the Internet did not design their network with visions of mainstream dominance. Instead, the very unexpectedness of its success was a critical ingredient. The Internet was able to develop quietly and organically for years before it became widely known, remaining outside the notice of those who would have insisted on more cautious strictures had they only suspected how ubiquitous it would become.

This first part of the book traces the battle between the centralized proprietary networks and the Internet, and a corresponding fight between specialized information appliances like smart typewriters and the general-purpose PC, highlighting the qualities that allowed the Internet and PC to win.


Today, the same qualities that led to their successes are causing the Internet and the PC to falter. As ubiquitous as Internet technologies are today, the pieces are in place for a wholesale shift away from the original chaotic design that has given rise to the modern information revolution. This counterrevolution would push mainstream users away from a generative Internet that fosters innovation and disruption, to an appliancized network that incorporates some of the most powerful features of today’s Internet while greatly limiting its innovative capacity—and, for better or worse, heightening its regulability. A seductive and more powerful generation of proprietary networks and information appliances is waiting for round two. If the problems associated with the Internet and the PC are not addressed, a set of blunt solutions will likely be applied, at the expense of much of what we love about today’s information ecosystem. Understanding the Internet’s history sheds light on different possible futures and helps us to recognize and avoid what might otherwise be very tempting dead ends.

One vital lesson from the past is that the endpoint matters. Too often, a discussion of the Internet and its future stops just short of its endpoints, focusing only on the literal network itself: how many people are connected, whether and how it is filtered, and how fast it carries data.2 These are important questions, but they risk obscuring the reality that people’s experiences with the Internet are shaped at least as much by the devices they use to access it.

As Internet-aware devices proliferate, questions posed about network regulation must also be applied to the endpoints—which, until recently, have been so open and so nonconstricting as to be nearly unnoticeable, and therefore absent from most debates about Internet policy. Yet increasingly the box has come to matter.


History shows that the box had competitors—and today they are back. The early models of commercial (as compared to academic) computing assumed that the vendor of the machinery would provide most or all of its programming. The PC of the 1980s—the parent of today’s PC—diverged from these models, but the result was by no means a foregone conclusion. Internet users are again embracing a range of “tethered appliances,” reflecting a resurgence of the initial model of bundled hardware and software that is created and controlled by one company. This will affect how readily behavior on the Internet can be regulated, which in turn will determine the extent to which regulators and commercial incumbents can constrain amateur innovation, which has been responsible for much of what we now consider precious about the Internet.3

The Internet also had competitors—and they are back. Compared to the Internet, early online information services were built around very different technical and business models. Their designs were much easier to secure against illegal behavior and security threats; the cost was that innovation became much more difficult. The Internet outpaced these services by assuming that every user was contributing a goodwill subsidy: people would not behave destructively even when there were no easy ways to monitor or stop them.


The Internet’s tradeoff of more flexibility for less security worked: most imaginable risks failed to materialize—for example, people did not routinely spy on one another’s communications, even though it was eminently possible, and for years there was no spam and there were no viruses. By observing where these tradeoffs were made, we will see that the current portfolio of tradeoffs is no longer optimal, and that some of the natural adjustments to that balance, while predictable, are also undesirable.

The fundamental challenges for those who have built and maintained the Internet are to acknowledge crucial deficiencies in a network-and-endpoint structure that has otherwise served so well for so long, to understand our alternatives as the status quo evaporates, and to devise ways to push the system toward a future that addresses the very real problems that are forcing change, while preserving the elements we hold most dear.

Posted by The Editors on March 1, 2008


Mike Hammer on paragraph 9:

The old paradigm of technical, well-behaved users fails when the masses are neither technical nor well behaved. PCs locked up by spam and viruses are not a good result. So identity, trust, and security are necessary characteristics—ones that affect the Internet’s evolution. And evolution teaches us one thing: change or die off.

April 25, 2008 9:53 am
david on paragraph 7:

makes sense

April 27, 2008 7:40 pm
Matthew C on paragraph 9:

“… and for years there were no spam and no viruses.” This is certainly looking at history through rose-colored glasses. There were computer viruses even before there was an Internet. The same is true of email spam (see: http://www.templetons.com/brad/spamreact.html ). Yes, there was an increase in viruses and spam with increased use of the WWW, but this was a consequence of a certain insecure operating system entering the Internet in huge numbers. Computers running this operating system served as an ideal launchpad for these malicious software programs. Not only did users not know of the presence of the malicious software, but the operating system actively prevented the user from knowing what their computer was doing, insomuch as log files and process lists were hidden.

June 2, 2008 2:57 pm
Orval on paragraph 9:

I think this is overstating the case somewhat; Matthew C is right. As an evolutionist, I am struck by the metaphor of the internet as biological growth; the overall computing experience really has not gotten worse over time. The malware and viruses will always be with us, but will always be contained. Indeed, following the biological model, malware is much less destructive and noticeable now than it was 10 or 20 years ago, a common pattern for disease evolution in nature.

Myself, I dumped all my anti-virus software a couple of years back because they had all become bloatware with an unacceptable overhead; good hygiene practices and depending on the spam and virus filters from ISPs and gmail and hotmail and so on has worked very well for me.

And there is always the arrogant assumption that there will always be this flood of technological newbies joining up and never learning anything. This is probably not the case: most users, over time, acquire a considerable amount of sophistication.

So, no, I don’t think that the internet is in serious danger of collapse from malware. It is just that, as in the biological model, we will never ever totally defeat it; the best we can hope for is our immune systems to keep on learning and evolving and fighting back, and keeping it at a tolerable level, but never vanquishing it.

This is admittedly a scary thought. Life is scary too; ebola could get you at any moment.

June 8, 2008 1:23 pm
Marty on paragraph 4:

appliancized? Coining a term is one thing; verbing a noun, while common enough, is not necessarily the best way to go.

_appliance-driven network_, for example, might work.

June 8, 2008 2:11 pm
Linda :

While I agree with you that the coinage is unfortunate, it’s much lighter than your average businessman’s. I think the idea behind the word is much more troubling…

April 22, 2009 7:05 pm
L. Reese on paragraph 1:

Having used both AOL and CompuServe back in the day, I much prefer the wide/wild open spaces of today’s Internet. AOL and CS both exerted a great deal of control over their domains, and these restrictions often pushed a user towards consumerism and entertainment rather than research and education. Yet, this seems to be happening again in “Web 2.0” with Google’s dominance and the rampant commercialism that seems to drive the ‘net.

February 1, 2010 6:29 pm