August 31, 2016


How Will Consumers Use Faster Internet Speeds?

This week saw an exciting announcement about the experimental deployment of DOCSIS 3.1 in limited markets in the United States, including Philadelphia, Atlanta, and parts of northern California, which will bring gigabit-per-second Internet speeds to many homes over the existing cable infrastructure. The potential for gigabit speeds over existing cable networks brings hope that more consumers will ultimately enjoy much higher-speed Internet connectivity, both in the United States and elsewhere.

This development is also a pointed response to the not-so-implicit pressure from the Federal Communications Commission to deploy higher-speed Internet connectivity. That pressure includes the redefinition of broadband as a downstream throughput rate of 25 megabits per second, up from a previous (and somewhat laughable) definition of 4 Mbps; many commissioners have also stated their intention to raise the threshold for broadband to a downstream throughput of 100 Mbps, a further indication that ISPs will see increasing pressure to offer higher-speed links to home networks. Yet the National Cable and Telecommunications Association has claimed in an FCC filing that such speeds are far more than a “typical” broadband user would require.

These developments and this posturing raise an obvious question: How will consumers change their behavior in response to faster downstream throughput from their Internet service providers?

Ph.D. student Sarthak Grover, postdoc Roya Ensafi, and I set out to study this question with a cohort of about 6,000 Comcast subscribers in Salt Lake City, Utah, from October through December 2014. The study was a randomized controlled trial, a method commonly used in scientific experiments: a group of users is randomly divided into a control group (whose users experience no change in conditions) and a treatment group (whose users are subject to a change in conditions). Assuming the cohort is large enough and represents a cross-section of the demographic of interest, and that the users in the treatment group are selected at random, it is possible to observe differences between the two groups’ outcomes and conclude how the treatment affects the outcome.
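
To make the mechanics of a randomized controlled trial concrete, here is a minimal sketch in Python; the function and parameter names are hypothetical, and this is not the study's actual analysis code:

```python
import random
import statistics

def run_rct(subscribers, treat_fraction, apply_treatment, measure_outcome):
    """Randomly assign subscribers to treatment or control, apply the
    treatment to one group only, and compare mean outcomes."""
    random.shuffle(subscribers)
    n_treat = int(len(subscribers) * treat_fraction)
    treatment, control = subscribers[:n_treat], subscribers[n_treat:]
    for s in treatment:
        apply_treatment(s)  # e.g., silently upgrade the service tier
    treated_mean = statistics.mean(measure_outcome(s) for s in treatment)
    control_mean = statistics.mean(measure_outcome(s) for s in control)
    # With random assignment and a large enough cohort, this difference
    # estimates the average effect of the treatment on the outcome.
    return treated_mean - control_mean
```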

In the case of this specific study, the control group consisted of about 5,000 Comcast subscribers who were paying for (and receiving) 105 Mbps downstream throughput; the treatment group, on the other hand, comprised about 1,500 Comcast subscribers who were paying for 105 Mbps but at the beginning of the study period were silently upgraded to 250 Mbps. In other words, users in the treatment group were receiving faster Internet service but were unaware of the faster downstream throughput of their connections. We explored how this treatment affected user behavior and made a few surprising discoveries:

“Moderate” users tend to adjust their behavior more than “heavy” users. We expected that subscribers who downloaded the most data in the 250 Mbps service tier would be the ones causing the largest difference in mean demand between the two groups (previous studies have observed this phenomenon, and we do observe this behavior for the most aggressive users). To our surprise, however, the median subscribers in the two groups exhibited much more significant differences in traffic demand, particularly at peak times. Notably, the 40% of subscribers with the lowest peak demands more than doubled their daily peak traffic demand in response to the service-tier upgrade (i.e., in the treatment group).

With the exception of the most aggressive peak-time subscribers, subscribers below the 40th percentile of peak demand increased their peak demand more than users who initially had higher peak demands.

This result suggests a surprising trend: it’s not the aggressive data hogs who account for most of the increased use in response to faster speeds, but rather the “typical” Internet user, who tends to use the Internet more as a result of the faster speeds. Our dataset does not contain application information, so it is difficult to say what, exactly, is responsible for the higher data usage of the median user. Yet the result uncovers an oft-forgotten phenomenon of faster links: even existing applications that do not need to “max out” the link capacity (e.g., Web browsing, and even most video streaming) can benefit from a higher-capacity link, simply because they see better performance overall (e.g., faster load times and more resilience to packet loss, particularly when multiple parallel connections are in use). It might just be that the typical user is using the Internet more with the faster connection simply because the experience is better, not because they’re interested in filling the link to capacity (at least not yet!).
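
One simple way to surface this percentile effect is to compare per-subscriber daily peak demand between the two groups at each percentile of the demand distribution. A minimal sketch (a hypothetical helper, not the paper's code):

```python
import numpy as np

def compare_peak_percentiles(control_peaks, treatment_peaks, step=10):
    """Compare per-subscriber daily peak demand (e.g., in Mbps) between
    the two groups at each percentile, to see where along the demand
    distribution the service-tier upgrade matters most."""
    for q in range(step, 100, step):
        c = np.percentile(control_peaks, q)
        t = np.percentile(treatment_peaks, q)
        print(f"{q:2d}th pct: control {c:7.2f}, treatment {t:7.2f}, "
              f"ratio {t / c:.2f}")
```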

Users may use faster speeds for shorter periods of time, not always during “prime time”. There has been much ado about prime-time video streaming usage, and we most certainly see those effects in our data. To our surprise, though, average usage per subscriber during prime-time hours was roughly the same between the treatment and control groups; outside of prime time, the difference was much more pronounced, with the treatment group exhibiting 25% more average usage per subscriber than the control group during non-prime-time weekday hours. We also observe that the peak-to-mean ratios for usage in the treatment group are significantly higher than in the control group, indicating that users with faster speeds may periodically (and for short periods) take advantage of the much higher speeds, even though they do not sustain a rate that exhausts the higher capacity.
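
As a rough illustration of the peak-to-mean metric, here is a hypothetical helper (not the study's code) that computes it from byte counts recorded at fifteen-minute intervals, the granularity of the dataset described below:

```python
def peak_and_mean_rate(byte_counts, interval_seconds=900):
    """Given one day of a subscriber's usage as fifteen-minute byte
    counts (96 samples), return the peak rate, mean rate, and
    peak-to-mean ratio, with rates in Mbps."""
    rates = [8 * b / interval_seconds / 1e6 for b in byte_counts]
    peak = max(rates)
    mean = sum(rates) / len(rates)
    return peak, mean, peak / mean

# Example: a subscriber who is mostly idle but bursts briefly.
day = [10e6] * 95 + [5e9]          # bytes per 15-minute interval
print(peak_and_mean_rate(day))     # high peak-to-mean ratio
```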

These results are interesting for last-mile Internet service providers because they suggest that speeds at the edge may not currently be the limiting factor for user traffic demand. Specifically, the changes in peak traffic outside of prime-time hours suggest that even the (relatively) lower-speed connections (e.g., 105 Mbps) may be sufficient to satisfy users’ demands during prime-time hours. Of course, the constraints on prime-time demand (much of which is streaming) likely result from other factors, including both available content and the well-known phenomenon of congestion in the middle of the network, rather than in the last mile. All of this points to the increasing importance of resolving the performance issues that we see as a result of interconnection. In the best case, faster Internet service moves the bottleneck from the last mile to elsewhere in the network (e.g., interconnection points or long-haul transit links); in reality, it seems that the bottlenecks are already there, and we should focus on mitigating those points of congestion.

Further reading and study. You can read more about our study in the following paper: A Case Study of Traffic Demand Response to Broadband Service-Plan Upgrades. S. Grover, R. Ensafi, N. Feamster. Passive and Active Measurement Conference (PAM). Heraklion, Crete, Greece. March 2016. There is plenty of room for follow-up work, of course; notably, the data we had access to did not include information about application usage, and reflected only byte-level usage at fifteen-minute intervals. Future studies could (and should) continue to study the effects of higher-speed links by exploring how the usage of specific applications (e.g., streaming video, file sharing, Web browsing) changes in response to higher downstream throughput.


Where is Internet Congestion Occurring?

In my post last week, I explained how Netflix traffic was experiencing congestion along end-to-end paths to broadband Internet subscribers, and how the resulting congestion was slowing down traffic to many Internet destinations. Although Netflix and Comcast ultimately mitigated this particular congestion episode by connecting directly to one another in a contractual arrangement known as paid peering, several mysteries persist about this and other congestion episodes. In the 2014 congestion episodes between Netflix and Comcast, perhaps the biggest question concerns where the congestion was actually taking place. There are several theories about where congestion was occurring; one or more of them are likely correct. I’ll dissect these cases in a bit more detail, and then talk more generally about some of the difficulties with locating congestion in today’s Internet, and why there’s still work for us to do to shed more light on these mysteries.


Why Your Netflix Traffic is Slow, and Why the Open Internet Order Won’t (Necessarily) Make It Faster

The FCC recently released the Open Internet Order, which has much to say about “net neutrality”: whether (and in what circumstances) an Internet service provider is permitted to prioritize traffic. I’ll leave more detailed thoughts on the order itself to future posts; in this post, I would like to correct what seems to be a fairly widespread misconception about the sources of Internet congestion, and why “net neutrality” has very little to do with the performance problems between Netflix and consumer ISPs such as Comcast.

Much of the popular media has led consumers to believe that certain Internet traffic (specifically, Netflix video streams) was experiencing poor performance because Internet service providers were explicitly slowing down that traffic. John Oliver accuses Comcast of intentionally slowing down Netflix traffic (an Oatmeal cartoon reiterates this claim). These caricatures are false, and they demonstrate a fundamental misunderstanding of how Internet connectivity works, what led to the congestion in the first place, and the economics of how the problems were ultimately resolved.


Google Spain and the “Right to Be Forgotten”

The European Court of Justice (CJEU) has decided the Google Spain case, which involves the “right to be forgotten” on the Internet. The case was brought by Mario Costeja González, a lawyer who, back in 1998, had unpaid debts that resulted in the attachment and public auction of his real estate. Notices of the auctions, including Mr. Costeja’s name, were published in a Spanish newspaper that was later made available online. Google indexed the newspaper’s website, and links to pages containing the announcements appeared in search results when Mr. Costeja’s name was queried. After failing in his effort to have the newspaper publisher remove the announcements from its website, Mr. Costeja asked Google not to return search results relating to the auction. Google refused, and Mr. Costeja filed a complaint with Spanish data protection authorities, the AEPD. In 2010, the AEPD ordered Google to de-index the pages. In the same ruling, the AEPD declined to order the newspaper publisher to take any action concerning the primary content, because the publication of the information by the press was legally justified. In other words, it was legal in the AEPD’s view for the newspaper to publish the information but a violation of privacy law for Google to help people find it. Google appealed the AEPD’s decision, and the appeal was referred by the Spanish court to the CJEU for a decision on whether Google’s publication of the search results violates the EU Data Protection Directive.


Revisiting the Potential Hazards of the ‘Protect America’ Act

In light of recent news reports about NSA wiretapping of U.S. Internet communications, folks may be interested in some background on the ‘warrantless wiretapping’ provisions of the Protect America Act, and the potential security risks such wiretapping systems can introduce. Here’s a 2007 article a group of us wrote, entitled “Risking Communications Security: Potential Hazards of the ‘Protect America’ Act”: http://www.cs.princeton.edu/~jrex/papers/PAA.pdf


A Response to Jerry: Craig Should Still Dismiss

[Cross-posted on my blog, Managing Miracles]

Jerry Brito, a sometime contributor to this blog, has a new post on the Reason blog arguing that I and others have been too harsh on Craigslist for their recent lawsuit. As I wrote in my earlier post, Craigslist should give up the lawsuit not just because it’s unlikely to prevail, but also because it risks setting bad precedents and is downright distasteful. Jerry argues that what the startups that scrape Craigslist data are doing doesn’t “sit well,” and that there are several reasons to temper criticism of Craigslist.

I remain unconvinced.

To begin with, the notion that something doesn’t “sit well” is not necessarily a good indicator that one can or should prevail in legal action. To be sure, tort law (and common law more generally) develops in part out of our collective notion of what does or doesn’t seem right. Jerry concedes that the copyright claims are bogus, and that the CFAA claims are ill-advised, so we’re left with doctrines like misappropriation and trespass to chattels. I’ll get to those in a moment.


Are There Countries Whose Situations Worsened with the Arrival of the Internet?

Are there countries whose situations worsened with the arrival of the internet?  I’ve been arguing that there are lots of examples of countries where technology diffusion has helped democratic institutions deepen.  And there are several examples of countries where technology diffusion has been part of the story of rapid democratic transition.  But there are no good examples of countries where technology diffusion has been high, and the dictators got nastier as a result.

On Twitter, Google CEO Eric Schmidt recently opined the same thing. Evgeny Morozov, professional naysayer, asked for a graph.

So here is a graph and a list. I used Polity IV democratization scores from 2002 and 2011, along with World Bank/ITU data on Internet users. I merged the data and made a basic graph. On the vertical axis is the change in the percent of a country’s population online over the last decade. The horizontal axis reflects any change in the democratization score; any slide towards authoritarianism is represented by a negative number. For Morozov to be right, the top-left corner of this graph needs to have some cases in it.

Change in Percentage Internet Users and Democracy Scores, By Country, 2002-2011


Look at the raw data.
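
For readers who want to reproduce the figure, here is roughly how it can be assembled. This is a sketch with hypothetical file and column names, not the exact pipeline used here:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file and column names; both tables are keyed by country.
polity = pd.read_csv("polity_scores.csv")       # country, polity_2002, polity_2011
users = pd.read_csv("itu_internet_users.csv")   # country, pct_online_2002, pct_online_2011

df = polity.merge(users, on="country")
df["d_users"] = df["pct_online_2011"] - df["pct_online_2002"]  # vertical axis
df["d_democ"] = df["polity_2011"] - df["polity_2002"]          # horizontal axis

plt.scatter(df["d_democ"], df["d_users"])
plt.xlabel("Change in Polity IV democratization score, 2002-2011")
plt.ylabel("Change in percent of population online, 2002-2011")
plt.show()

# Counterexamples to the argument would sit in the top-left quadrant:
# big growth in the online population alongside a slide toward
# authoritarianism (a negative change in the democratization score).
```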



Congressman Issa’s “Internet Law Freeze”: Appealing but Impractical

This week, Congressman Darrell Issa released a draft bill that would prevent Congress and administrative agencies from creating any new internet-related laws, rules, or regulations. The Internet American Moratorium Act (IAMA) is a rhetorical stake in the ground for the notion that the government should “keep its hands off the internet.” In the wake of the successful blocking of the SOPA/PIPA legislation, which would have interfered with basic internet functionality in the name of combating content piracy, there is renewed energy in DC to stop ill-advised internet-related laws and rules. Issa has been quoted as saying that the government needs a “cooling-off period to figure out a better way to create policy that impacts Internet users.” The relevant portion of the bill reads:

It is resolved in the House of Representatives and Senate that they shall not pass any new legislation for a period of 2 years from the date of enactment of this Act that would require individuals or corporations engaged in activities on the Internet to meet additional requirements or activities. After 90 days of passage of this Act no Department or Agency of the United States shall publish new rules or regulations, or finalize or otherwise enforce or give lawful effect to draft rules or regulations affecting the Internet until a period of at least 2 years from the enactment of this legislation has elapsed.



Don't Regulate the Internet. No, Wait. Regulate the Internet.

When Congress considered net neutrality legislation in the form of the Internet Freedom Preservation Act of 2008 (H.R. 5353), representatives of corporate copyright owners weighed in to oppose government regulation of the Internet. They feared that such regulation might inhibit their private efforts to convince ISPs to help them enforce copyrights online through various forms of broadband traffic management (e.g., filtering and bandwidth shaping). “Our view,” the RIAA’s Mitch Bainwol testified at a Congressional hearing, “is that the marketplace is generally a better mechanism than regulation for addressing such complex issues as how to address online piracy, and we believe the marketplace should be given the chance to succeed.” And the marketplace presumably did succeed, at least from the RIAA’s point of view, when ISPs and corporate rights owners entered into a Memorandum of Understanding last summer to implement a standardized, six-strikes graduated response protocol for curbing domestic illegal P2P file sharing. Chalk one up for the market.

What, then, should we make of the RIAA’s full-throated support for the Senate’s pending PROTECT IP Act (S. 968) and its companion bill in the House, SOPA (H.R. 3261)? PROTECT IP and SOPA are bills that would regulate the technical workings of the Internet by requiring operators of domain name servers to block user access to “rogue websites”—defined in PROTECT IP as sites “dedicated to infringing activities”—by preventing the domain names for those sites from resolving to their corresponding IP addresses. In a recent RIAA press release on PROTECT IP, the RIAA’s Bainwol praised the draft legislation, asserting the need for—you guessed it—new government regulation of the Internet: “[C]urrent laws have not kept pace with criminal enterprises that set up rogue websites overseas to escape accountability.” So much, I guess, for giving the marketplace the chance to succeed.
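
To make the mechanism concrete: the DNS blocking that PROTECT IP and SOPA contemplate amounts to a resolver refusing to answer for designated names. Here is a toy sketch (the blocklist and domain are made up, and real deployments would operate inside ISPs' recursive DNS servers, not in application code):

```python
import socket

# Hypothetical blocklist of court-designated "rogue" domain names.
BLOCKED_DOMAINS = {"rogue-site.example"}

def resolve(hostname):
    """Return an IP address for hostname, unless the name is blocked."""
    if hostname in BLOCKED_DOMAINS or any(
        hostname.endswith("." + d) for d in BLOCKED_DOMAINS
    ):
        raise OSError(f"{hostname}: blocked by policy; name not resolved")
    return socket.gethostbyname(hostname)

print(resolve("example.com"))   # resolves normally
# resolve("rogue-site.example") # would raise instead of returning an IP
```

Note that nothing in this scheme removes the site itself; it only stops a cooperating resolver from mapping the name to an IP address, which is why critics called it a policy intrusion into the technical workings of the Internet rather than a remedy aimed at the infringing content.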

As the Social Science Research Council’s groundbreaking 2011 report on global piracy concluded, the marketplace could succeed in addressing the problem of piracy beyond U.S. borders if corporate copyright owners were willing to address global disparities in the accessibility of legal digital goods. As the authors explain, “[t]he flood of legal media goods available in high-income countries over the past two decades has been a trickle in most parts of the world.” Looking at the statistics on piracy in the developing world from the consumption side rather than the production side, the SSRC authors assert that what developing markets want and need are “price and service innovations” that have already been rolled out in the developed world. Who is in a better position to deliver such innovations, through the global marketplace, than the owners of copyrights in digital entertainment and information goods? Why not give the marketplace another chance to succeed, particularly when the alternative presented is a radical policy intrusion into the fundamental operation of the Internet?

The RIAA’s political strategy in the war on piracy has been alternately to oppose and support government regulation of the Internet, depending on what’s expedient. I wonder if rights owners and the trade groups that represent them experience any sense of cognitive dissonance when they advocate against something at one moment and for it a little while later—to the same audience, on the same issue.