The Downside of Net Neutrality

Its principles are out of touch with today's wireless world

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American


“Treating everyone and everything in the same manner” sounds fair and seems like the right thing to do—except when you’re talking about wireless networks. There, many different types of services compete for the same limited bandwidth to reach their respective audiences. When we apply the net neutrality principles of the 2015 regulatory framework to wireless networks, such proposed equal treatment of all network traffic could have the unintended effect of degrading the quality of our communications (voice and video calls) as well as our enjoyment of high-definition video and other streaming content.

That’s because the forced equal treatment of services would increase network congestion—something that threatens the quality of even simple voice calls, not to mention the introduction and growth of amazing new services such as hologram videos, virtual reality and augmented reality.

Such arguments are not new but bear exploring once again as the Federal Communications Commission is expected to vote on December 14 to scale back the net neutrality regulations adopted in 2015. Given the resistance many are expressing to this decision, it is important to analyze the implications of the blanket application of the network neutrality framework—and to wireless networks in particular.



All Wireless Data Are Not the Same

We particularly like that the FCC is now proposing to “Restore the determination that mobile broadband is not a ‘commercial mobile service’ subject to heavy-handed regulation.” Essentially, the principles of the 2015 framework aimed to ensure equal treatment of traffic passing through networks. As we explain below, however, wireless networks must treat traffic and users differently in order to provide the best possible service.

One of the primary reasons all of us can enjoy natural-sounding voice calls, high-quality video calls and multimedia messages on our smartphones is that the wireless network works very hard to provide custom quality of service (QoS) for each of these services. QoS is quantified by metrics such as the packet error rate and network delay, along with data speed, or rate.

QoS is important because it takes into account the way our human bodies are designed (!) and the expectations that we have as consumers. Our ears can easily tolerate a 2 to 3 percent packet error rate, but they would notice more than 300 milliseconds of delay (whether due to the travel of data packets in the wired portion of the service provider’s network or the packet’s wait time at the cell tower). In contrast, even a handful of dropped or corrupted data packets would keep Web browsing, e-mail and file downloads from working properly, although those services can tolerate relatively lengthy delays between the smartphone and the servers. Most people wouldn’t notice if, for example, an e-mail delivery was delayed by a few hundred milliseconds.
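These differing tolerances can be captured in a small sketch. The targets below are loosely based on the figures above; they are illustrative assumptions, not the actual QoS Class Identifiers that 4G/5G standards define.

```python
# Illustrative QoS targets per traffic class (assumed values based on the
# article's figures; real cellular QoS classes differ in detail).
QOS_TARGETS = {
    "voice":    {"max_packet_error_rate": 0.02, "max_delay_ms": 300},
    "browsing": {"max_packet_error_rate": 1e-6, "max_delay_ms": 2_000},
    "email":    {"max_packet_error_rate": 1e-6, "max_delay_ms": 10_000},
}

def meets_qos(traffic_class, measured_per, measured_delay_ms):
    """Return True if measured loss and delay satisfy the class's targets."""
    t = QOS_TARGETS[traffic_class]
    return (measured_per <= t["max_packet_error_rate"]
            and measured_delay_ms <= t["max_delay_ms"])

# A voice call survives 2% packet loss at 250 ms of delay;
# an e-mail transfer with the same loss rate would not work properly.
print(meets_qos("voice", 0.02, 250))   # True
print(meets_qos("email", 0.02, 250))   # False
```

The point of the sketch: the same measured network conditions pass one service’s requirements and fail another’s, which is why a single uniform treatment cannot serve both well.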

Prioritizing Data

The wireless network has a complex, intelligent scheduling mechanism that allocates resources by weighing numerous factors: the QoS requirements of services, the availability of scarce radio resources such as bandwidth, and radio signal quality. Signal quality can be impacted by, for example, interference resulting from the reuse of the same radio resources, such as a radio channel, by different cell towers. Such interference is unavoidable because radio channel reuse is fundamental to the existence of wireless or cellular communications. As a result, wireless service providers must be able to quickly—as fast as every millisecond—differentiate among signal types in order to efficiently and cost-effectively support different services.
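The flavor of such a scheduler can be sketched in a few lines. Real LTE/5G schedulers (proportional fair and its variants) are far more sophisticated; this toy version only illustrates the idea that each millisecond the tower weighs QoS urgency, channel quality and fairness together. All weights and field names are hypothetical.

```python
# Toy per-millisecond scheduler: pick the user whose combination of QoS
# priority, current radio conditions and fairness scores highest.
def schedule(users):
    """users: list of dicts with 'name', 'qos_weight' (higher = more
    delay-sensitive), 'channel_quality' (0..1, reflecting signal and
    interference) and 'avg_rate' (recent throughput, for fairness)."""
    def score(u):
        # Favor good channels and urgent traffic; penalize users who have
        # recently received a lot of throughput (the fairness term).
        return u["qos_weight"] * u["channel_quality"] / max(u["avg_rate"], 1e-9)
    return max(users, key=score)

users = [
    {"name": "voice_near", "qos_weight": 3.0, "channel_quality": 0.9, "avg_rate": 50.0},
    {"name": "email_far",  "qos_weight": 1.0, "channel_quality": 0.2, "avg_rate": 10.0},
    {"name": "voice_far",  "qos_weight": 3.0, "channel_quality": 0.3, "avg_rate": 5.0},
]
print(schedule(users)["name"])  # voice_far: urgent traffic, starved so far
```

Note that the winner here is not the user with the best channel: the fairness term keeps cell-edge users from being starved, which is exactly the kind of deliberate unequal treatment the article describes.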

AT&T, Verizon and other wireless service providers optimize scarce radio resources and differentiate users and services based on QoS requirements and prevailing radio conditions such as interference. For example, the wireless network aims for an aggregate data rate of 50 kilobits per second (kbps) and a 1 percent packet error rate to enable high-definition voice calls. When the signal carrying your voice becomes weaker or the interference becomes stronger, the cell tower dynamically adjusts transmission parameters to maintain those target data and packet error rates.

To do that, the cell tower may add redundancy so that voice packets can be received reliably in the face of these challenges.
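A simple repetition model shows the trade-off. Real networks use channel coding and retransmission schemes (e.g., HARQ) rather than plain packet repetition, and the independence assumption below is a simplification, but the shape of the math is the same: worse channels need more redundancy to hit the 1 percent target.

```python
import math

def copies_needed(raw_loss, target_loss=0.01):
    """Smallest repetition factor k with raw_loss**k <= target_loss,
    assuming each copy is lost independently (a simplifying assumption).
    Plain repetition stands in for real channel coding here."""
    if raw_loss <= target_loss:
        return 1
    return math.ceil(math.log(target_loss) / math.log(raw_loss))

print(copies_needed(0.005))  # good channel: 1 copy suffices
print(copies_needed(0.15))   # weak signal: 3 copies
print(copies_needed(0.30))   # heavy interference: 4 copies
```

The cost of that redundancy is bandwidth: the user in heavy interference consumes roughly four times the radio resources of the user on a good channel for the same voice call.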

If the wireless network were forced to treat all users the same way all the time to adhere to the 2015 regulatory framework’s blanket net neutrality principles, many subscribers would experience an increase in frozen videos, poor voice quality, inaccessible Web sites and call drops. That’s because the wireless network can quickly run out of radio resources (for example, bandwidth) when cell towers must try to send data at the same rate to all users, regardless of their proximity to the tower. Consider two users downloading the same video at the same rate, with one user close to the cell tower and the other far away from it. If the cell tower has a total radio channel bandwidth of 10 megahertz (MHz), the user close by would consume only 1 MHz of bandwidth whereas the faraway one could easily require the remaining 9 MHz. In practice, the disparity of resource consumption could be much greater than this factor of nine.
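That 1 MHz/9 MHz split falls out of a short calculation. A user's achievable rate is roughly bandwidth times spectral efficiency (bits per second per hertz), and spectral efficiency collapses at the cell edge. The efficiency values below are assumed, chosen so the arithmetic reproduces the article's example.

```python
# Numbers from the article's example: a 10 MHz cell serving two users who
# must receive the same video at the same rate. Spectral efficiencies are
# hypothetical, picked so the resulting split matches the text.
TOTAL_BW_MHZ = 10.0
eff_near = 4.5   # strong signal close to the tower (assumed, bits/s/Hz)
eff_far  = 0.5   # weak, interference-limited signal at the cell edge

# Equal target rate R for both users: R = eff * bw, so bw = R / eff.
# The feasible rate exactly exhausts the 10 MHz:
#   R/eff_near + R/eff_far = TOTAL_BW_MHZ
rate = TOTAL_BW_MHZ / (1 / eff_near + 1 / eff_far)
bw_near = rate / eff_near
bw_far = rate / eff_far
print(f"near user: {bw_near:.1f} MHz, far user: {bw_far:.1f} MHz")
# near user: 1.0 MHz, far user: 9.0 MHz
```

With a nine-to-one efficiency gap, forcing equal rates means the far user consumes nine times the spectrum, and the whole cell's capacity is dragged down toward its worst-placed subscriber.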

Networks already prioritize data traffic, even with the 2015 regulatory framework in place. For example, internet provider routers quickly forward packets of delay-sensitive services such as voice but might store e-mail packets in buffers for some time when those routers get busy. As you can see, the complexities of wireless networks sharing different types of data with differing needs make it as difficult to define “fair access” as it would be to define a concept such as “beauty.”
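The router behavior just described amounts to class-based priority queuing. The sketch below uses made-up class priorities (not a real DiffServ mapping) to show how a busy router would drain its buffer: voice first, e-mail last, with arrival order preserved within a class.

```python
import heapq

# Illustrative forwarding priorities (lower number = transmitted sooner).
# These are assumptions for the sketch, not a real DiffServ code-point map.
PRIORITY = {"voice": 0, "video": 1, "browsing": 2, "email": 3}

def forward_order(packets):
    """Return packet classes in the order a busy router would send them.
    The arrival index breaks ties so same-class packets keep FIFO order."""
    heap = [(PRIORITY[cls], i, cls) for i, cls in enumerate(packets)]
    heapq.heapify(heap)
    return [cls for _, _, cls in (heapq.heappop(heap) for _ in range(len(heap)))]

print(forward_order(["email", "voice", "email", "video", "voice"]))
# ['voice', 'voice', 'video', 'email', 'email']
```

Under light load the difference is invisible; only when the buffer fills does prioritization bite, which is why a few hundred milliseconds of e-mail delay goes unnoticed while the voice call stays crisp.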

New Types of Wireless Data on the Way

We are at the doorstep of emerging fifth-generation (5G) cellular technology, which promises to offer amazing services such as hologram videos (remember Princess Leia’s hologram message in Star Wars?), self-driving cars, augmented/virtual reality and even remote surgery. New services that make the most of 5G’s speed and latency performance will require a flexible network if they are to transform the way we live and revolutionize the world around us. More specifically, 5G networks must adapt to different services’ bandwidth and latency requirements even more than today’s 4G networks do, something that wouldn’t be possible if regulations required all traffic to be treated equally.

In a 5G world some companies may focus on providing exceptional services in enhanced mobile broadband (eMBB) whereas others may strive to perform exceptional ultra-reliable and low-latency communication (URLLC) services. The way data is processed and prioritized for these different business models is going to be quite different. Trying to gauge fair treatment of traffic is not an obvious or feasible goal for a regulatory agency such as the FCC. Rather, consumers will decide if they are getting fair services for their specific needs.

Editor’s Note: The authors have worked as consultants to the wireless industry in the area of network neutrality and other matters related to technical aspects of wireless policy.