Who needs high-speed broadband?

On paper, the crux of the Federal Communications Commission's (FCC) recently released National Broadband Plan is fairly straightforward: help 100 million rural, underprivileged and otherwise underserved households across the U.S. get Internet access at speeds of at least 100 megabits per second over the next decade. The reality of the country's efforts to expand broadband access is much more complicated, according to a roundtable discussion hosted Monday by New York Law School in New York City.

Roundtable participants, including Blair Levin, executive director of the FCC Omnibus Broadband Initiative, addressed the plan from a number of angles, including how high-speed broadband would be used and the factors that go into delivering the consistently high speeds the government is promising.


The 400-page plan (the first draft was 2,300 pages), introduced on March 16, attempts to address issues that will be relevant to broadband access over the next several years, said Levin, who served as chief of staff for FCC Chairman Reed Hundt from 1993 to 1997.

One of the most important reasons for the U.S. to scale up its broadband capabilities is so that its telecommunications infrastructure can handle new applications that will require speeds as high as 100 megabits per second. "Apps are something we're very good at and something we want to continue to be very good at," Levin said. For this to happen, U.S. developers need access to the highest speeds available.

Levin acknowledged, however, that there is little demand outside of businesses for speeds anywhere near 100 megabits per second. "It could be that cloud computing is one of the things that drives the demand for 100 megabits, but I'm not seeing that right now," he said, later adding that there is nothing to suggest that smart electrical grids or Internet-based learning resources for children will require 100 megabits per second anytime soon either.

Another key component of the National Broadband Plan is the government's effort to get broadband providers to quote realistic data-transmission speeds when marketing their services. Telecommunications companies and cable providers often advertise particular high speeds but do not consistently deliver them, Levin said, although he clarified that he was not accusing anyone of committing fraud.

Consistently delivering high-speed broadband is not as simple as it might seem and is an area where the National Broadband Plan "got it wrong," Comcast Senior Director for Public Policy David Don said during a roundtable discussion that took place after Levin had departed. The plan suggests that Internet service providers (ISPs), including Comcast, are to blame for Internet connections that run slower than peak performance, he said.

Comcast can calculate what its networks are providing, he added, but actual speeds depend significantly on a variety of factors, including the quality of the computer the consumer is using to access the Internet, the operating system running on that computer and the speed of the servers hosting the Web sites the consumer is trying to access. "We can talk about what our network is capable of doing," Don said, "but the actual user experience is not something within our control entirely."

Image © iStockphoto.com/Fabian Kerbusch

Larry Greenemeier is the associate editor of technology for Scientific American, covering a variety of tech-related topics, including biotech, computers, military tech, nanotech and robots.