On January 9, 2007, Steve Jobs introduced the iPhone to an eager audience crammed into San Francisco’s Moscone Center.1 A beautiful and brilliantly engineered device, the iPhone blended three products into one: an iPod, with the highest-quality screen Apple had ever produced; a phone, with cleverly integrated functionality, such as voicemail that came wrapped as separately accessible messages; and a device to access the Internet, with a smart and elegant browser, and with built-in map, weather, stock, and e-mail capabilities. It was a technical and design triumph for Jobs, bringing the company into a market with an extraordinary potential for growth, and pushing the industry to a new level of competition in ways to connect us to each other and to the Web.


This was not the first time Steve Jobs had launched a revolution. Thirty years earlier, at the First West Coast Computer Faire in nearly the same spot, the twenty-one-year-old Jobs, wearing his first suit, exhibited the Apple II personal computer to great buzz amidst “10,000 walking, talking computer freaks.”2 The Apple II was a machine for hobbyists who did not want to fuss with soldering irons: all the ingredients for a functioning PC were provided in a convenient molded plastic case. It looked clunky, yet it could be at home on someone’s desk. Instead of puzzling over bits of hardware or typing up punch cards to feed into someone else’s mainframe, Apple owners faced only the hurdle of a cryptic blinking cursor in the upper left corner of the screen: the PC awaited instructions. But the hurdle was not high. Some owners were inspired to program the machines themselves, but true beginners simply could load up software written and then shared or sold by their more skilled or inspired counterparts. The Apple II was a blank slate, a bold departure from previous technology that had been developed and marketed to perform specific tasks from the first day of its sale to the last day of its use.

The Apple II quickly became popular. And when programmer and entrepreneur Dan Bricklin introduced the first killer application for the Apple II in 1979—VisiCalc, the world’s first spreadsheet program—sales of the ungainly but very cool machine took off dramatically.3 An Apple running VisiCalc helped to convince a skeptical world that there was a place for the PC at everyone’s desk and hence a market to build many, and to build them very fast.

Though these two inventions—iPhone and Apple II—were launched by the same man, the revolutions they inaugurated are radically different, because the technology behind each is radically different. The Apple II was quintessentially generative technology. It was a platform. It invited people to tinker with it. Hobbyists wrote programs. Businesses began to plan on selling software. Jobs (and Apple) had no clue how the machine would be used. They had their hunches, but, fortunately for them, nothing constrained the PC to the hunches of the founders. Apple did not even know that VisiCalc was on the market when it noticed sales of the Apple II skyrocketing. The Apple II was designed for surprises—some very good (VisiCalc), and some not so good (the inevitable and frequent computer crashes).


The iPhone is the opposite. It is sterile. Rather than a platform that invites innovation, the iPhone comes preprogrammed. You are not allowed to add programs to the all-in-one device that Steve Jobs sells you. Its functionality is locked in, though Apple can change it through remote updates. Indeed, to those who managed to tinker with the code to enable the iPhone to support more or different applications,4 Apple threatened (and then delivered on the threat) to transform the iPhone into an iBrick.5 The machine was not to be generative beyond the innovations that Apple (and its exclusive carrier, AT&T) wanted. Whereas the world would innovate for the Apple II, only Apple would innovate for the iPhone. (A promised software development kit may allow others to program the iPhone with Apple’s permission.)

Jobs was not shy about these restrictions baked into the iPhone. As he said at its launch:


We define everything that is on the phone. . . . You don’t want your phone to be like a PC. The last thing you want is to have loaded three apps on your phone and then you go to make a call and it doesn’t work anymore. These are more like iPods than they are like computers.6

No doubt, for a significant number of us, Jobs was exactly right. For in the thirty years between the first flashing cursor on the Apple II and the gorgeous iconized touch menu of the iPhone, we have grown weary not of the unexpected cool stuff that the generative PC produced, but of the unexpected very uncool stuff that came along with it. Viruses, spam, identity theft, crashes: all of these were the consequences of a certain freedom built into the generative PC. As these problems grow worse, for many the promise of security is enough reason to give up that freedom.


* * *


In the arc from the Apple II to the iPhone, we learn something important about where the Internet has been, and something more important about where it is going. The PC revolution was launched with PCs that invited innovation by others. So too with the Internet. Both were generative: they were designed to accept any contribution that followed a basic set of rules (either coded for a particular operating system, or respecting the protocols of the Internet). Both overwhelmed their respective proprietary, non-generative competitors, such as the makers of stand-alone word processors and proprietary online services like CompuServe and AOL. But the future unfolding right now is very different from this past. The future is not one of generative PCs attached to a generative network. It is instead one of sterile appliances tethered to a network of control.

These appliances take the innovations already created by Internet users and package them neatly and compellingly, which is good—but only if the Internet and PC can remain sufficiently central in the digital ecosystem to compete with locked-down appliances and facilitate the next round of innovations. The balance between the two spheres is precarious, and it is slipping toward the safer appliance. For example, Microsoft’s Xbox 360 video game console is a powerful computer, but, unlike Microsoft’s Windows operating system for PCs, it does not allow just anyone to write software that can run on it. Bill Gates sees the Xbox as at the center of the future digital ecosystem, rather than at its periphery: “It is a general purpose computer. . . . [W]e wouldn’t have done it if it was just a gaming device. We wouldn’t have gotten into the category at all. It was about strategically being in the living room. . . . [T]his is not some big secret. Sony says the same things.”7

It is not easy to imagine the PC going extinct, and taking with it the possibility of allowing outside code to run—code that is the original source of so much of what we find useful about the Internet. But along with the rise of information appliances that package those useful activities without readily allowing new ones, there is the increasing lockdown of the PC itself. PCs may not be competing with information appliances so much as they are becoming them. The trend is starting in schools, libraries, cyber cafés, and offices, where the users of PCs are not their owners. The owners’ interests in maintaining stable computing environments are naturally aligned with technologies that tame the wildness of the Internet and PC, at the expense of valuable activities their users might otherwise discover.


The need for stability is growing. Today’s viruses and spyware are not merely annoyances to be ignored as one might tune out loud conversations at nearby tables in a restaurant. They will not be fixed by some new round of patches to bug-filled PC operating systems, or by abandoning now-ubiquitous Windows for Mac. Rather, they pose a fundamental dilemma: as long as people control the code that runs on their machines, they can make mistakes and be tricked into running dangerous code. As more people use PCs and make them more accessible to the outside world through broadband, the value of corrupting these users’ decisions is increasing. That value is derived from stealing people’s attention, PC processing cycles, network bandwidth, or online preferences. And the fact that a Web page can be and often is rendered on the fly by drawing upon hundreds of different sources scattered across the Net—a page may pull in content from its owner, advertisements from a syndicate, and links from various other feeds—means that bad code can infect huge swaths of the Web in a heartbeat.


If security problems worsen and fear spreads, rank-and-file users will not be far behind in preferring some form of lockdown—and regulators will speed the process along. In turn, that lockdown opens the door to new forms of regulatory surveillance and control. We have some hints of what that can look like. Enterprising law enforcement officers have been able to eavesdrop on occupants of motor vehicles equipped with the latest travel assistance systems by producing secret warrants and flicking a distant switch. They can turn a standard mobile phone into a roving microphone—whether or not it is being used for a call. As these opportunities arise in places under the rule of law—where some might welcome them—they also arise within technology-embracing authoritarian states, because the technology is exported.

A lockdown on PCs and a corresponding rise of tethered appliances will eliminate what today we take for granted: a world where mainstream technology can be influenced, even revolutionized, out of left field. Stopping this future depends on some wisely developed and implemented locks, along with new technologies and a community ethos that secures the keys to those locks among groups with shared norms and a sense of public purpose, rather than in the hands of a single gatekeeping entity, whether public or private.

The iPhone is a product of both fashion and fear. It boasts an undeniably attractive aesthetic, and it bottles some of the best innovations from the PC and Internet in a stable, controlled form. The PC and Internet were the engines of those innovations, and if they can be saved, they will offer more. As time passes, the brand names on each side will change. But the core battle will remain. It will be fought through information appliances and Web 2.0 platforms like today’s Facebook apps and Google Maps mash-ups. These are not just products but also services, watched and updated according to the constant dictates of their makers and those who can pressure them.

In this book I take up the question of what is likely to come next and what we should do about it.

Posted by The Editors on February 20, 2008
Tags: Uncategorized


Comments


Jonathan Zittrain on paragraph 5:

The Apple SDK was announced in March 2008. As suspected, third-party coders for the iPhone are to be licensed, and their software made available only through an Apple iPhone Apps Store. Apple will get a cut of the sales of apps. (App authors can also set the price to zero.) See, e.g., here. This is an example of what I call a “contingently generative” technology, as are Facebook Apps, even though the iPhone is client-side and Facebook is “in the cloud.”

March 14, 2008 10:06 pm
Tim on paragraph 5:

While Jobs may not have intended for the iPhone to accept outside applications, there are plenty of hackers who decided otherwise. Without a lot of extra trouble, it’s possible to put all kinds of apps on the phone in its current form. And that’s completely separate from the “jailbreak” schemes to divorce the phone from AT&T (something many of us less geeky types would love to do :-).

April 18, 2008 6:56 am

[…] 3) and CommentPress blogware. Both are about to go head-to-head over Jonathan Zittrain’s book The Future of the Internet–and how to stop it. Zittrain’s book has already been published online with the CommentPress system in place. Now […]

April 28, 2008 1:15 pm

Three times now I’ve started reading this, and each time I’ve gotten stuck after only a couple of chapters. Yes, it is interesting, but I’m distracted by too many options. Not only do I need to read from scrolling windows of text, but I am also sideswiped by two competing annotation products, Diigo and CommentPress.

I’ve been reading text online for almost four decades. But my attention is perpetually being pulled from one place to the other. I have too many opportunities outside of the main text (I can read other people’s notes, I can write my own, I can check my e-mail, and so on). There is no “ludic reading” experience as described by Sven Birkerts.

I wish I had blinders on, as provided by the printed codex book. I can write in the margins without having to be distracted by too many other conversations.

Talmudically inspired study — of a hyped text admittedly carrying intriguing thoughts but ultimately of debatable lasting value — may not be necessary. It is not for everything. It is too self-referential. Are you suggesting that you’re creating a product as important as that of the prophets, which took 1,500 years to produce? We have seen similar overweening efforts to engineer a prophetic buzz, ranging from Jason Epstein’s (commercially motivated) article The Rattle of Pebbles and its byproduct, Book Business, to Eric Raymond’s The Cathedral and the Bazaar.

The prophets did not need CommentPress. There was something more humble and monastic there, and it took millennia to evolve. We don’t even really know the names of most of them. Perhaps you should begin on paper.

October 6, 2008 10:45 am
Ben:

So you’re basically faulting the book for your own shortcomings.

July 14, 2010 4:29 pm
steve on paragraph 5:

I’ve been reading that this is less a matter of control for Apple, or even AT&T, than an issue of how AT&T makes profits. AT&T claims that it cannot afford the number of support calls it would get for a phone running on an open applications platform.

May 5, 2008 3:33 pm
steve pepple on paragraph 14:

I immediately think of a recent British thriller, Surveillance 24/7, which, while it had some plot holes, was interesting and accurate.

May 5, 2008 3:42 pm

This is the IMDB entry for it…

June 25, 2008 3:53 pm
Dave on paragraph 2:

Maybe the Apple II is not the best example, since Apple attempted heavy-duty control over at least the hardware (squashing clones). A bad example is Texas Instruments, which made the iPhone of the 1980s with a totally controlled TI-99/4A platform–and failed. However, like the Internet, the IBM PC was the most open hardware and software platform–and this is the platform that skyrocketed in much the same way the Internet has.

May 6, 2008 1:48 pm
Michael on paragraph 5:

Of course, it has been rumored that the apps licensed by Apple will be distributed/sold/managed through iTunes, keeping the tethered appliance/network alive and inclusive… and of course, you will have to register and provide credit card details and personal information before downloading and using them.

May 15, 2008 11:50 am
Timabee on paragraph 5:

It’s too early to say the iPhone is a sterile device. As with the Apple II, Apple Inc. cannot know where this new technology (or packaging of technology) will go. They have a road map, but others have ideas of how to use the device. As noted by Tim, hackers are moving the iPhone in other directions, and even the “bricked” comment doesn’t show an understanding of people’s desire to take a tool designed by one entity and make it work for many others. All of this is to say that there are ways around or beyond Apple Inc.’s control and plans. None of the hacks and workarounds have hurt Apple’s sales of the device.

May 30, 2008 11:18 pm
Adam on paragraph 9:

What about the deluge of user-generated content on the web today (e.g., widgets, DIY video, Facebook applications)? Also, what about Facebook open-sourcing its platform (http://tinyurl.com/57fldr)? I know these events are post hoc, but these developments weren’t out of the blue.

June 8, 2008 5:21 pm
Bob on paragraph 10:

Which is why Linux appliances are so important, but even more so are the competitions to create applications.

June 9, 2008 7:22 pm
Martin on paragraph 5:

If the iPhone 3G is a way to change the rules of the game in favor of the appliance network, the Openmoko project (www.openmoko.org) is the other extreme.

Openmoko strives to build a free and generative mobile network (at the time of writing, Google’s Gphone is still vapourware).

But success depends on the market’s choice.

I am rather optimistic that the generative network can be retained. The PC market defeated the Apple II and Mac appliances. So will Openmoko defeat the iPhone 3G.

July 8, 2008 8:31 am
L. Rogers on paragraph 7:

I agree that security and stability are factors in the drive to standardize and lock down systems; what the introduction glosses over is the more pedestrian motivation that silently buttresses the lockdown trend – the old-school profit motive. Companies (or investors) tend to prefer the certain profits of a turnpike to the multivalent profits of a highway. I believe that if Apple had been able to foresee the success of the programs that eventually ran on its system, its impulse at the time would have been to control and profit from these add-ons. Apparently, this is Apple’s impulse now.

So the openness that was, in retrospect, a very good strategy was at the time a happy accident. I agree with the author that the benefits of that accident – that an open market creates more value than a closed one, or, to put it another way, that a rising tide lifts all boats – should have served as a lesson for the iPhone. But is it paranoid of me to speculate that perhaps Apple, and the inventors of other tethered appliances, drew the opposite conclusion?

July 24, 2008 10:38 pm
bhaskar on whole page:

good

September 21, 2008 7:06 am
Dan on paragraph 13:

Well… this is pretty misinformed. While, yes, websites are pulling in content from “hundreds of different sources,” there is a wall between the code that runs those sites and the code that runs your machine… for instance, Mac OS X has been without significantly effective viruses for all of its decade of existence.

If this is your only evidence that the “need for stability is growing,” your argument is in trouble.

December 9, 2009 8:57 pm
Rick:

Agree with the above. This betrays the author’s lack of experience outside the Windows sphere. Unix systems are not, have never been, and never will be vulnerable the way Windows is. Of course, this myopic view of computer security is probably shared by Steve Jobs, which is why we have arrived at the impasse we’re at today.

July 10, 2010 4:16 pm
Ben on paragraph 13:

“Today’s viruses and spyware are not merely annoyances to be ignored as one might tune out loud conversations at nearby tables in a restaurant. They will not be fixed by some new round of patches to bug-filled PC operating systems, or by abandoning now-ubiquitous Windows for Mac.”

This is just obnoxious. The author is a professor of computer science? O RLY? The author is a farce when it comes to computer science. And such stupid statements at the very beginning of this book totally undermine the confidence of the reader.

July 14, 2010 4:42 pm