March 28, 2024

Search Neutrality ≠ Net Neutrality

Sunday’s New York Times featured a provocative op-ed arguing that, in addition to regulating “net neutrality,” the FCC should also effectuate “search neutrality” – requiring search providers to rank results without regard to their own business interests. The author heaps particular scorn upon Google for promoting its own context-relevant services (e.g., maps and weather) at the top of search results. Others have already reviewed the proposal, leveled implementation critiques, and criticized the author’s gripes with his own site. My aim here is to rebut the piece’s core argument: the analogy of search neutrality to net neutrality. Both are, to be sure, debates about promoting innovation and competition through a level playing field. But beyond this commonality the parallel breaks down.

Net neutrality advocates call for regulation because ISP discrimination could render innovative services either impossible to implement owing to traffic restrictions or too expensive to deploy owing to traffic pricing. Consumers cannot “vote with their dollars” for a nondiscriminatory ISP since most locales have few providers and the market is hard to break into. Violations of net neutrality, the argument goes, threaten to nip entire industries in the bud and rob the economy of growth.

Violations of search neutrality, on the other hand, at most increase marketing costs for an innovative or competitive offering. Consumers are more than clever enough to seek and use an alternative to a weaker Google offering (Yelp vs. Google restaurant reviews, anyone?). The author of the op-ed cites Google Maps’ dethroning of MapQuest as evidence of the power of search non-neutrality; on the contrary, I would contend users flocked to Google’s service because it was, well, better. If Google Maps featured MapQuest’s clunky interface and vice versa, would you use it? A glance at historical map site statistics empirically rebuts the author’s claim. The mid-May 2007 introduction of Google’s context-relevant (“universal”) search does not appear correlated with any irregular shift in map site traffic.

Moreover, unlike with net neutrality, search consumers stand ready to “vote with their [ad] dollars.” Should Google consistently favor its own services to the detriment of search result quality, consumers can effortlessly shift to any of its numerous competitors. It is no coincidence that Google devotes enormous manpower to improving result quality.

There may even be a benefit to the increased marketing costs imposed by existing violations of search neutrality, like Google’s map and weather offerings. If a service would have to be extensively marketed to compete with Google’s promoted offering – say, a dedicated weather site vs. searching for “Stanford weather” – the market is sending a signal that consumers don’t care about the marginal quality of the product, and the non-Google provider should exit the market.

There is merit to the observation that violations of search neutrality are, on the margin, slightly anti-competitive. But this issue is dwarfed by the potential economy-scale implications of net neutrality. The FCC should not let search neutrality distract from its net neutrality rulemaking.

DARPA Pays MIT to Pay Someone Who Recruited Someone Who Recruited Someone Who Recruited Someone Who Found a Red Balloon

DARPA, the Defense Department’s research arm, recently sponsored a “Network Challenge” in which groups competed to find ten big red weather balloons that were positioned in public places around the U.S. The first team to discover where all the balloons were would win $40,000.

A team from MIT won, using a clever method of sharing the cash with volunteers. MIT let anyone join their team, and they paid money to the members who found balloons, as well as the people who recruited the balloon-finders, and the people who recruited the balloon-finder-finders. For example, if Alice recruited Bob, and Bob recruited Charlie, and Charlie recruited Diane, and Diane found a balloon, then Alice would get $250, Bob would get $500, Charlie would get $1000, and Diane would get $2000. Multi-level marketing meets treasure hunting! It’s the Amway of balloon-hunting!
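
To make the arithmetic of that payout rule concrete, here is a minimal sketch in Python. The names and the recruitment chain are just the hypothetical ones from the example above, and the halving rule is the one described in the text; this is an illustration, not MIT's actual code.

```python
# Minimal sketch of the halving payout rule described above (illustrative only).
def balloon_payouts(chain, finder_reward=2000.0):
    """`chain` lists people from the balloon finder up through each recruiter."""
    payouts = {}
    reward = finder_reward
    for person in chain:
        payouts[person] = reward
        reward /= 2  # each recruiter gets half of what their recruit got
    return payouts

# Diane found a balloon; Charlie recruited Diane, Bob recruited Charlie,
# and Alice recruited Bob.
print(balloon_payouts(["Diane", "Charlie", "Bob", "Alice"]))
# {'Diane': 2000.0, 'Charlie': 1000.0, 'Bob': 500.0, 'Alice': 250.0}
```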

On DARPA’s side, this was inspired by the famous Grand Challenge and Urban Challenge, in which teams built autonomous cars that had to drive themselves safely through a desert landscape and then a city.

The autonomous-car challenges have obvious value, both for the military and in ordinary civilian life. But it’s hard to say the same for the balloon-hunting challenge. Granted, the balloon-hunting prize was much smaller, but it’s still hard to avoid the impression that the balloon hunt was more of a publicity stunt than a spur to research. We already knew that the Internet lets people organize themselves into effective groups at a distance. We already knew that people like a scavenger hunt, especially if you offer significant cash prizes. And we already knew that you can pay Internet strangers to do jobs for you. But how are we going to apply what we learned in the balloon hunt?

The autonomous-car challenge has value because it asks the teams to build something that will eventually have practical use. Someday we will all have autonomous cars, and they will have major implications for our transportation infrastructure. The autonomous-car challenge helped to bring that day closer. But will the day ever come when all, or even many, of us will want to pay large teams of people to find things for us?

(There’s more to be said about the general approach of offering challenge prizes as an alternative to traditional research funding, but that’s a topic for another day.)

Wu on Zittrain's Future of the Internet

Related to my previous post about the future of open technologies, Tim Wu has a great review of Jonathan Zittrain’s book. Wu reviews the origins of the 20th century’s great media empires, which steadily consolidated once-fractious markets. He suggests that the Internet likely won’t meet the same fate. My favorite part:

In the 2000s, AOL and Time Warner took the biggest and most notorious run at trying to make the Internet more like traditional media. The merger was a bet that unifying content and distribution might yield the kind of power that Paramount and NBC gained in the 1920s. They were not alone: Microsoft in the 1990s thought that, by owning a browser (Explorer), dial-in service (MSN), and some content (Slate), it could emerge as the NBC of the Internet era. Lastly, AT&T, the same firm that built the first radio network, keeps signaling plans to assert more control over “its pipes,” or even create its own competitor to the Internet. In 2000, when AT&T first announced its plans to enter the media market, a spokesman said: “We believe it’s very important to have control of the underlying network.”

Yet so far these would-be Zukors and NBCs have crashed and burned. Unlike radio or film, the structure of the Internet stoutly resists integration. AOL tried, in the 1990s, to keep its users in a “walled garden” of AOL content, but its users wanted the whole Internet, and finally AOL gave in. To make it after the merger, AOL-Time Warner needed to build a new garden with even higher walls–some way for AOL to discriminate in favor of Time Warner content. But AOL had no real power over its users, and pretty soon it did not have many of them left.

I think the monolithic media firms of the 20th century ultimately owed their size and success to economies of scale in the communication technologies of their day. For example, a single newspaper with a million readers is a lot cheaper to produce and distribute than ten newspapers with 100,000 readers each. And so the larger film studios, newspapers, broadcast networks, and so on were able to squeeze out smaller players. Once one newspaper in a given area began reaping the benefits of scale, its competitors found it difficult to turn a profit, and many of them went out of business or were acquired at fire-sale prices.

On the Internet, distributing content is so cheap that economies of scale in distribution just don’t matter. On a per-reader basis, my personal blog certainly costs more to operate than CNN. But the cost is so small that it’s simply not a significant factor in deciding whether to continue publishing it. Even if the larger sites capture the bulk of the readership and advertising revenue, that doesn’t preclude a “long tail” of small, often amateur sites with a wide variety of different content.

Economic Growth, Censorship, and Search Engines

Economic growth depends on an ability to access relevant information. Although censorship prevents access to certain information, the direct consequences of censorship are well-known and somewhat predictable. For example, blocking access to Falun Gong literature is unlikely to harm a country’s consumer electronics industry. On the web, however, information of all types is interconnected. Blocking a web page might have an indirect impact reaching well beyond that page’s contents. To understand this impact, let’s consider how search results are affected by censorship.

Search engines keep track of what’s available on the web and suggest useful pages to users. No comprehensive list of web pages exists, so search providers check known pages for links to unknown neighbors. If a government blocks a page, all links from the page to its neighbors are lost. Unless detours exist to the page’s unknown neighbors, those neighbors become unreachable and remain unknown. These unknown pages can’t appear in search results — even if their contents are uncontroversial.
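
A toy crawler makes the point concrete. The sketch below is not how any real search engine works internally; it is just a breadth-first walk over a made-up link graph, with a hypothetical blocked page standing in for censored content.

```python
from collections import deque

def crawl(seed, links, blocked):
    """Return the pages a crawler can actually fetch and index from `seed`,
    when pages in `blocked` cannot be retrieved (so their outgoing links
    are never observed)."""
    indexed, seen, frontier = set(), {seed}, deque([seed])
    while frontier:
        page = frontier.popleft()
        if page in blocked:
            continue  # censored: never fetched, so its links are never seen
        indexed.add(page)
        for neighbor in links.get(page, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(neighbor)
    return indexed

# Hypothetical link graph: the only route to "harmless-recipes" runs
# through the page a censor decides to block.
links = {
    "portal": ["news", "blocked-page"],
    "blocked-page": ["harmless-recipes"],
}
print(crawl("portal", links, blocked=set()))             # includes harmless-recipes
print(crawl("portal", links, blocked={"blocked-page"}))  # harmless-recipes never found
```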

When presented with a query, search engines respond with relevant known pages sorted by expected usefulness. Censorship also affects this sorting process. In predicting usefulness, search engines consider both the contents of pages and the links between pages. Links here are like friendships in a stereotypical high school popularity contest: the more popular friends you have, the more popular you become. If your friend moves away, you become less popular, which makes your friends less popular by association, and so on. Even people you’ve never met might be affected.

“Popular” web pages tend to appear higher in search results. Censoring a page distorts this popularity contest and can change the order of even unrelated results. As more pages are blocked, the censored view of the web becomes increasingly distorted. As an aside, Ed notes that blocking a page removes more than just the offending material. If censors block Ed’s site due to an off-hand comment on Falun Gong, he also loses any influence he has on information security.
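
To illustrate, here is a toy link-popularity calculation in the spirit of PageRank (a simplified power iteration over a made-up five-page web, not Google's actual algorithm). Censoring one page shifts the scores even of pages that never linked to it and were never linked from it.

```python
def popularity(links, damping=0.85, iters=50):
    """Simplified PageRank-style power iteration (toy version)."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if not outs:
                continue  # ignore pages with no outgoing links in this toy version
            share = damping * rank[page] / len(outs)
            for target in outs:
                new[target] += share
        rank = new
    return rank

web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],   # the page a censor will later block
    "E": ["A"],
}
# The same web with D censored: D disappears, along with its links.
censored = {p: [q for q in outs if q != "D"] for p, outs in web.items() if p != "D"}

print(popularity(web))        # scores with D present
print(popularity(censored))   # every remaining score shifts, not just C's
```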

These effects would typically be rare and have a disproportionately small impact on popular pages. Google’s emphasis on the long tail, however, suggests that considerable value lies in providing high-quality results covering even less-popular pages. To avoid these issues, a government could grant a limited set of individuals full web access to develop tools like search engines. This approach seems likely to stifle competition and innovation.

Countries with greater censorship might produce lower-quality search engines, but Google, Yahoo, Microsoft, and others can still provide high-quality search results in those countries. These companies can access uncensored data, mitigating the indirect effects of censorship. This underscores the significance of measures like the Global Network Initiative, whose participants include Google, Yahoo, and Microsoft. Among other things, the initiative provides guidelines for participants regarding when and how information access may be restricted. The effectiveness of this particular initiative remains to be seen, but such measures may give leading search engines greater leverage to resist arbitrary censorship.

Search engines are unlikely to be the only tools adversely impacted by the indirect effects of censorship. Any tool that relies on links between information (think social networks) might be affected, and repressive states place themselves at a competitive disadvantage in developing these tools. Future developments might make these points moot: in a recent talk at the Center, Ethan Zuckerman mentioned tricks and trends that might make censorship more difficult. In the meantime, however, governments that censor information may increasingly find that they do so at their own expense.

How Fragile Is the Internet?

With Barack Obama’s election, we’re likely to see a revival of the network neutrality debate. Thus far the popular debate over the issue has produced more heat than light. On one side have been people who scoff at the very idea of network neutrality, arguing either that network neutrality is a myth or that we’d be better off without it. On the other are people who believe the open Internet is hanging on by its fingernails. These advocates believe that unless Congress passes new regulations quickly, major network providers will transform the Internet into a closed network where only their preferred content and applications are available.

One assumption that seems to be shared by both sides in the debate is that the Internet’s end-to-end architecture is fragile. At times, advocates on both sides of the debate seem to think that AT&T, Verizon, and Comcast have big levers in their network closets labeled “network neutrality” that they will set to “off” if Congress doesn’t stop them. In a new study for the Cato Institute, I argue that this assumption is unrealistic. The Internet has the open architecture it has for good technical reasons. The end-to-end principle is deeply embedded in the Internet’s architecture, and there’s no straightforward way to change it without breaking existing Internet applications.

One reason is technical. Advocates of regulation point to a technology called deep packet inspection as a major threat to the Internet’s open architecture. DPI allows network owners to look “inside” Internet packets, reconstructing the web page, email, or other information as it comes across the wire. This is an impressive technology, but it’s also important to remember its limitations. DPI is inherently reactive and brittle. It requires human engineers to precisely describe each type of traffic that is to be blocked. That means that as the Internet grows ever more complex, more and more effort would be required to keep DPI’s filters up to date. It also means that configuration problems will lead to the accidental blocking of unrelated traffic.
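
To see why, consider a cartoon of signature-based classification (this is not any real DPI product, just an illustration of the maintenance burden): every protocol to be identified needs a hand-written pattern, anything without a rule falls through unrecognized, and a sloppy rule sweeps in unrelated traffic.

```python
import re

SIGNATURES = {
    # hypothetical hand-maintained rules, one per protocol to be managed
    "bittorrent": re.compile(rb"\x13BitTorrent protocol"),  # BitTorrent handshake prefix
    "http":       re.compile(rb"^(GET|POST|HEAD) "),
}

def classify(payload: bytes) -> str:
    for name, pattern in SIGNATURES.items():
        if pattern.search(payload):
            return name
    return "unknown"  # new or encrypted protocols pass through unrecognized

print(classify(b"\x13BitTorrent protocol" + b"\x00" * 8))  # 'bittorrent'
print(classify(b"GET /index.html HTTP/1.1\r\n"))           # 'http'
print(classify(b"\x16\x03\x01\x00\xa5"))                   # 'unknown' (a TLS hello)
# Every new or updated application needs a fresh rule, and an overly broad rule
# (say, matching any packet containing "torrent") would also catch a news
# article that merely mentions BitTorrent.
```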

The more fundamental reason is economic. The Internet works as well as it does precisely because it is decentralized. No organization on Earth has the manpower that would have been required to directly manage all of the content and applications on the Internet. Networks like AOL and CompuServe that were managed that way got bogged down in bureaucracy while they were still a small fraction of the Internet’s current size. It is not plausible that bureaucracies at Comcast, AT&T, or Verizon could manage their TCP/IP networks the way AOL ran its network a decade ago.

Of course what advocates of regulation fear is precisely that these companies will try to manage their networks this way, fail, and screw the Internet up in the process. But I think this underestimates the magnitude of the disaster that would befall any network provider that tried to convert their Internet service into a proprietary network. People pay for Internet access because they find it useful. A proprietary Internet would be dramatically less useful than an open one because network providers would inevitably block an enormous number of useful applications and websites. A network provider that deliberately broke a significant fraction of the content or applications on its network would find many fewer customers willing to pay for it. Customers that could switch to a competitor would. Some others would simply cancel their home Internet service and rely instead on Internet access at work, school, libraries, etc. And many customers that had previously taken higher-speed Internet service would downgrade to basic service. In short, even in an environment of limited competition, reducing the value of one’s product is rarely a good business strategy.

This isn’t to say that ISPs will never violate network neutrality. A few have done so already. The most significant was Comcast’s interference with the BitTorrent protocol last year. I think there’s plenty to criticize about what Comcast did. But there’s a big difference between interfering with one networking protocol and the kind of comprehensive filtering that network neutrality advocates fear. And it’s worth noting that even Comcast’s modest interference with network neutrality provoked a ferocious response from customers, the press, and the political process. The Comcast/BitTorrent story certainly isn’t going to make other ISPs think that more aggressive violations of network neutrality would be a good business strategy.

So it seems to me that new regulations are unnecessary to protect network neutrality. They are likely to be counterproductive as well. As Ed has argued, defining network neutrality precisely is surprisingly difficult, and enacting a ban without a clear definition is a recipe for problems. In addition, there’s a real danger of what economists call regulatory capture—that industry incumbents will find ways to turn regulatory authority to their advantage. As I document in my study, this is what happened with 20th-century regulation of the railroad, airline, and telephone industries. Congress should proceed carefully, lest regulations designed to protect consumers from telecom industry incumbents wind up protecting incumbents from competition instead.