
The Web is (not) dead...if you believe Scientific American, not Wired

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American.


Too often, we don't appreciate what we have until it's gone. That could happen with the World Wide Web—unless we protect the basic principles on which the Web is built.

Protecting principles is also the key to the Web's future growth, an argument laid out in "Long Live the Web," written as an exclusive for Scientific American by the man who invented the biggest killer app of all time, Tim Berners-Lee. Twenty years ago this month, the Web went live inside a single computer on Berners-Lee's desk at CERN, the high-energy physics lab in Geneva.




People like to reflect on anniversaries, but Berners-Lee—who rarely writes for anyone or agrees to be interviewed—had a stronger motivation: to wake us up. Threats are rising—from business and government—that could compromise the Web, and all of us who use it. As the article reveals, in recent months large social networking sites have tried to use your personal information for their own gain while making it difficult for you to access it. Wireless Internet providers have slowed traffic to sites with which they do not have commercial deals. Governments—totalitarian and democratic alike—as well as local Internet service providers, have snooped into your online habits, filtered the information you can see, censored certain Web sites altogether and disconnected people accused of wrongdoing before they are proven guilty.

Some Internet carriers, and some big Web sites, are also bent on fragmenting the Web, so that when you search for something you won't find all the possible answers, and when you publish something it won't be seen by everyone who might be interested.

Protecting the Web's principles is critical not merely to the digital revolution but to our continued prosperity, our free speech, even our liberty. It is also necessary if the Web is to bring us much more online power than it already does. As Berners-Lee often says, "The Web is not done."

Berners-Lee and I were well into the editing—targeting the magazine's December issue to mark the anniversary—when a red-faced reminder arose about one of the most sinister threats to the Web's ideals: cynicism. The September issue of Wired magazine appeared on newsstands. It had an all-orange cover with four black words that said: "The Web is Dead."

Needless to say, Berners-Lee was not amused. Neither was I, as his editor. Full disclosure: the two of us co-wrote a book in 1999, long before I joined the Scientific American staff, titled Weaving the Web. It told the true story of how the Web was created (Al Gore didn't invent it) and how it grew. One revelation the book made, which few people knew, was that several times during the 1990s, one company or another tried to rebrand the Web as its own product, or tried to violate the Web's principles so it could attempt to take over the new medium.

Partly motivated by those threats, Berners-Lee founded the international World Wide Web Consortium, with a U.S. base at the Massachusetts Institute of Technology. The consortium brought together individuals, companies and universities that were devising all sorts of Web technologies, so that instead of competing, the stakeholders could work together in open groups to build a better Web than any company could build by itself. Raising the tide would float everyone's boat higher. All the major Web companies signed on, and the approach has worked remarkably well, to this day.

The Wired story, however, made the same "Tired" argument (to borrow one of its monikers) that has been made for two decades: that the idealistic, grassroots intention to continue building a tool that benefits all of humanity will inevitably crumble, as some big company or companies inevitably take over. "The delirious chaos of the open Web was an adolescent phase subsidized by industrial giants groping their way in a new world," Wired said. "Now they're doing what industrialists do best—finding choke points." In other words, commercial powers will take over any application that rides on the Internet, especially the Web. May it rest in peace.

This point of view is nothing more than naked cynicism, which is bad enough. But Wired made its jaded portrayal worse by opening the article with an enormous, misleading two-page graph. It showed the percentage of Internet traffic taken up by various applications, indicating that video's share was growing while the Web's share was shrinking—shrinking enough to claim that it was doomed. So there you have it: data. The Web must be dead.

What the graph did not show—because the editors chose to display relative percentages of Internet traffic—was that raw traffic related to the Web was still expanding, rapidly. Video was just growing faster. Hardly a death sentence.
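
The arithmetic is easy to see with a minimal sketch. The numbers below are purely hypothetical, not the actual traffic data behind either chart; they simply illustrate how a category's share of a fast-growing total can fall even while its absolute volume keeps climbing.

```python
# Hypothetical traffic figures (arbitrary units), not real measurements:
# Web traffic doubles every year, while video traffic grows even faster.
web_traffic = [10, 20, 40, 80]
video_traffic = [1, 10, 60, 220]

for year, (web, video) in enumerate(zip(web_traffic, video_traffic), start=1):
    total = web + video
    share = 100 * web / total  # Web's share of total traffic, in percent
    print(f"Year {year}: Web traffic = {web:>3} units, Web share = {share:5.1f}%")

# The Web's absolute traffic rises every year, yet its share of the total
# shrinks -- the pattern a percentage-only chart can misread as a decline.
```
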

Smarter—or less deceptive—analysts immediately jumped all over the gimmick. Rob Beschizza at the technology Web site Boing Boing said it best. He wrote that he "found this graph immediately suspect. The use of proportion of the total as the vertical axis instead of the actual total is an interesting editorial choice. You can probably guess that total use increases so rapidly that the Web is not declining at all."

Beschizza didn't just guess. He replotted the graph and showed that the Web is growing just fine, thank you very much. Furthermore, he pointed out, "It doesn't even seem to be the case that the Web's ongoing growth has slowed. It's rather been joined by even more explosive growth in file-sharing and video, which is often embedded in the Web in any case." Other media, including The New York Times, saw through the misleading graph also.

The Wired argument sticks in my craw for several reasons, one of which I came to understand while Berners-Lee and I were writing the book. Once the media realized the Web was growing like mad, it began to hound Berners-Lee for interviews. And once reporters learned that Berners-Lee did not turn his creation into a company but instead put it out there as a tool for the good of humanity, he was asked time and again, "Why didn't you get rich off the Web?" Meaning, "You fool. You could have been a billionaire." Now you can understand why Berners-Lee isn't keen on interviews.

Believe it or not, some people are actually motivated by ideals that rise above money. As Berners-Lee says in the book, "What is maddening is the terrible notion that a person's value depends on how important and financially successful they are, and that that is measured in the form of money. This suggests disrespect for the researchers across the globe developing ideas for the next leaps in science and technology. To use net worth as a criterion by which to judge people is to set our children's sights on cash rather than on things that will actually make them happy."

Every dot.com millionaire, and every person who's ever found a nugget of information or friended a friend on the Web, owes a debt to Berners-Lee for deliberately setting up his creation as a free and open platform, not as a for-profit venture. And he's not looking to be lionized for that. "I'm happy to let others play the role of royalty," he says. "Just as long as they don't try to control the Web."

Which is what Wired's crass argument maintains is inevitable. In Scientific American, Berners-Lee says the Web's future does not have to play out that way, if people preserve the basic principles so that the Web can continue to improve and benefit everyone.

Yes, Berners-Lee and I are friends, but we rarely see one another, and I have no stake in anything related to the Web. But I don't like stunts that parade as journalism. I don't like data that is spun. Most of all, I don't like the attitude that the work of individuals or groups that are motivated by the common good is somehow adolescent because it hasn't proven itself by going commercial.

The argument is also polarizing. As Berners-Lee also often says, the Web was designed so that for-profit companies could flourish on it just as well as individuals could. For example, companies such as Condé Nast, which publishes Wired. (I mean, really; Wired succeeded because the Web created an online life for millions of people, which became the magazine's bread and butter.) As Berners-Lee writes in the article, "Indeed, many companies spend the time and money to develop extraordinary applications precisely because they are confident that the applications will work for anyone, regardless of the computer hardware, operating system or ISP they are using—all made possible by the Web's open, royalty-free standards."

The Web has remained open because Berners-Lee and the Web consortium have protected the founding principles. Is society so crass that it won't stand up for ideals that go beyond a profit motive? Many more truly human benefits, as well as commercial successes, can come from an open Web than from a commercially controlled Web. You can read about examples in other stories in this online package. Linked data, entered into OpenStreetMap, actually saved people's lives immediately after the terrible Haiti earthquake in January. JoinAfrica.org could finally bring Web access to millions of Africans who could not otherwise afford it, improving education and providing a sudden voice to the forgotten. Social machines could broker a better democracy. All these activities and many more are made possible because the Web, and the Internet on which it rides, allow individuals, institutions and for-profit companies to use them unfettered.

January 2009 image of Berners-Lee courtesy of Silvio Tanaka

Mark Fischetti has been a senior editor at Scientific American for 17 years and has covered sustainability issues, including climate, weather, environment, energy, food, water, biodiversity, population, and more. He assigns and edits feature articles, commentaries and news by journalists and scientists and also writes in those formats. He edits History, the magazine's department looking at science advances throughout time. He was founding managing editor of two spinoff magazines: Scientific American Mind and Scientific American Earth 3.0. His 2001 freelance article for the magazine, "Drowning New Orleans," predicted the widespread disaster that a storm like Hurricane Katrina would impose on the city. His video What Happens to Your Body after You Die? has more than 12 million views on YouTube. Fischetti has written freelance articles for the New York Times, Sports Illustrated, Smithsonian, Technology Review, Fast Company, and many others. He co-authored the book Weaving the Web with Tim Berners-Lee, inventor of the World Wide Web, which tells the real story of how the Web was created. He also co-authored The New Killer Diseases with microbiologist Elinor Levy. Fischetti is a former managing editor of IEEE Spectrum Magazine and of Family Business Magazine. He has a physics degree and has twice served as the Attaway Fellow in Civic Culture at Centenary College of Louisiana, which awarded him an honorary doctorate. In 2021 he received the American Geophysical Union's Robert C. Cowen Award for Sustained Achievement in Science Journalism, which celebrates a career of outstanding reporting on the Earth and space sciences. He has appeared on NBC's Meet the Press, CNN, the History Channel, NPR News and many news radio stations.
