
Is Netflix Slowing Down? Good Luck Finding Out Why.

The FCC wants to force ISPs to show whether delays are due to unavoidable congestion or created to extract fees from content providers.
February 21, 2014

Even when they’ve made sure to sign up for more-than-adequate plans with their Internet service providers, many customers of popular streaming and cloud services such as Netflix encounter glitches like frozen screens on streamed movies.

Indeed, in the latest update to its ISP speed reports, Netflix notes a trend of slower speeds from some providers, including Verizon and Comcast. But why? ISPs blame unavoidable congestion at various points in the network. Yet some critics believe it’s part of a calculated plan by ISPs to force content providers to pay.

Getting the answer means getting the data, but ISPs do not reveal the data, nor are they required to. Changing this was one of Federal Communications Commission chairman Tom Wheeler’s goals when he urged the FCC yesterday to try again to write new rules, popularly called “net neutrality” regulations, that generally ban ISPs from blocking or slowing different forms of data or charging extra to deliver it more quickly.

His call comes one month after a federal appeals court blocked the FCC’s most recent attempt to craft such regulations (see “Net Neutrality Quashed: New Pricing Schemes, Throttling, and Business Models to Follow”). At the moment, it is legal for ISPs to try to collect fees from parties like Netflix to ensure fast data delivery, though they do not do so in practice.

The topic exploded earlier this month when an engineer at a Texas security firm collected data showing extremely slow service from his company’s site, which relies on Amazon Web Services, despite robust Internet connections at his home and office. He accused his provider, Verizon FiOS, of slowing the Amazon service deliberately. Verizon denied this, saying congestion was to blame.

Netflix declined to comment to MIT Technology Review, though previously it has not blamed active slowing, or throttling, for the slow speeds experienced by its customers.

In fact, it’s hard to determine whether slowdowns are caused deliberately or just by too much traffic. The Internet is a collection of smaller networks, which connect with one another at junctions often called peering points. Depending on traffic patterns at any given time, some such interconnections can get clogged.
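Curious customers can run a crude version of this kind of comparison themselves. The sketch below, in Python, is only an illustration, not how Netflix, M-Lab, or the engineer mentioned above actually measure performance, and the URLs are placeholders. It times downloads of similar files from two hosts likely to reach the home over different interconnections; if one stays consistently slow while the other stays fast, the bottleneck is more likely on a specific path than on the customer’s own access link.

```python
# Crude throughput comparison between two hosts on (probably) different paths.
# The URLs are hypothetical placeholders; substitute real test files.
import time
import urllib.request

TEST_URLS = [
    "https://example-cdn-a.test/sample-10mb.bin",  # hypothetical host A
    "https://example-cdn-b.test/sample-10mb.bin",  # hypothetical host B
]

def measure_throughput(url, chunk_size=64 * 1024):
    """Download a file and return average throughput in megabits per second."""
    start = time.time()
    total_bytes = 0
    with urllib.request.urlopen(url, timeout=30) as response:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.time() - start
    return (total_bytes * 8) / (elapsed * 1_000_000)

for url in TEST_URLS:
    try:
        print(f"{url}: {measure_throughput(url):.1f} Mbps")
    except OSError as err:
        print(f"{url}: failed ({err})")
```

A single run proves little, since congestion shifts with traffic patterns; repeated measurements at different times of day, ideally from several homes on the same ISP, are what let researchers start ruling out other explanations.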

Today, critics complain, it’s hard to monitor this because the arrangements between Internet providers are not regulated, the agreements that govern them are not public, and the data is kept private. “We have a gigantic problem in that the FCC doesn’t currently have access to data,” says Susan Crawford, a visiting professor of law at Harvard University and a co-director of Harvard’s Berkman Center for Internet & Society. “You want to be able to show that all other explanations for inadequate Netflix experience—or YouTube experience, or Amazon Web Services experience—can be ruled out, and that the reason for the bottleneck lies within the control of the network operator.”

Wheeler seemed to directly address that problem, calling for “genuine transparency in how Internet service providers manage traffic.” His statement added that the data should be detailed enough to give content providers “the technical information they need to create and maintain their products and services as well as to assess the risks and benefits of embarking on new projects.”

Ben Scott, a researcher working on measurement data from M-Lab, a research consortium that tracks global Internet performance and throttling, says researchers are trying to do their own sleuthing in the meantime, but no results are yet available.

A Verizon spokesman said in a company statement earlier this month: “We treat all traffic equally, and that has not changed. Many factors can affect the speed of a customer’s experience for a specific site, including that site’s servers, the way the traffic is routed over the Internet, and other considerations.”
