December 15, 2024

Chinese Internet Censorship: See It For Yourself

You probably know already that the Chinese government censors Internet traffic. But you might not have known that you can experience this censorship yourself. Here’s how:

(1) Open up another browser window or tab, so you can browse without losing this page.

(2) In the other window, browse to baidu.com. This is a search engine located in China.

(3) Search for an innocuous term such as “freedom to tinker”. You’ll see a list of search results, sent back by Baidu’s servers in China.

(4) Now return to the main page of baidu.com, and search for “Falun Gong”. [Falun Gong is a dissident religious group that is banned in China.]

(5) At this point your browser will report an error — it might say that the connection was interrupted or that the page could not be loaded. What really happened is that the Great Firewall of China saw your Internet packets, containing the forbidden term “Falun Gong”, and responded by disrupting your connection to Baidu.

(6) Now try to go back to the Baidu home page. You’ll find that this connection is disrupted too. Just a minute ago, you could visit the Baidu page with no trouble, but now you’re blocked. The Great Firewall is now cutting you off from Baidu, because you searched for Falun Gong.

(7) After a few minutes, you’ll be allowed to connect to Baidu again, and you can do more experiments.

(Reportedly, users in China see different behavior. When they search for “Falun Gong” on Baidu, the connection isn’t blocked. Instead, they see “sanitized” search results, containing only pages that criticize Falun Gong.)

If you do try more experiments, feel free to report your results in the comments.
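If you'd rather script the experiment than click through it by hand, here is a minimal sketch in Python. It assumes the third-party requests library is installed, that Baidu's search endpoint is /s with a "wd" query parameter, and that the firewall's keyword filtering applies to this request; since that filtering works by inspecting traffic it can read, the request is made over plain HTTP. The exact error you see when a connection is disrupted (reset, timeout, and so on) may differ, and the firewall's behavior changes over time.

    # Rough sketch of the experiment above. Assumes the "requests"
    # library (pip install requests), that Baidu's search endpoint is
    # /s with a "wd" query parameter, and that the firewall can inspect
    # this plain-HTTP request. Run it from outside China.
    import time
    import requests

    SEARCH_URL = "http://www.baidu.com/s"

    def try_search(term):
        """Return True if the search completes, False if the connection is disrupted."""
        try:
            r = requests.get(SEARCH_URL, params={"wd": term}, timeout=10)
            print(f"search for {term!r}: HTTP {r.status_code}, {len(r.content)} bytes")
            return True
        except requests.exceptions.RequestException as exc:
            print(f"search for {term!r}: connection disrupted ({type(exc).__name__})")
            return False

    try_search("freedom to tinker")   # step 3: innocuous term, should succeed
    try_search("Falun Gong")          # steps 4-5: forbidden term, connection disrupted
    try_search("freedom to tinker")   # step 6: collateral blocking, this fails too

    time.sleep(5 * 60)                # step 7: wait a few minutes...
    try_search("freedom to tinker")   # ...and the block should have expired

If you run it repeatedly, remember that each forbidden search re-triggers the temporary block described in step (6).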

Watching Google's Gatekeepers

Google’s legal team has extraordinary power to decide which videos can be seen by audiences around the world, according to Jeffrey Rosen’s piece, “Google’s Gatekeepers,” in yesterday’s New York Times Magazine. Google, of course, owns YouTube, which gives it the technical ability to block particular videos — though so many videos are submitted that it’s impractical to review them all in advance.

Some takedown requests are easy — content that is offensive and illegal (almost) everywhere will come down immediately once a complaint is received and processed. But Rosen focuses on more difficult cases, where a government asks YouTube to take down a video that expresses dissent or is otherwise inconvenient for that government. Sometimes these videos violate local laws, but more often their legal status is murky and in any case the laws in question may be contrary to widely accepted free speech principles.

Rosen worries that too much power to decide what can be seen is being concentrated in the hands of one company. He acknowledges that Google has behaved reasonably so far, but he worries about what might happen in the future.

I understand his point, but it’s hard to see an alternative that would be better in practice. If Google, as the owner of YouTube, is not going to have this power, then the power will have to be given to somebody else. Any nominations? I don’t have any.

What we’re left with, then, is Google making the decisions. But this doesn’t mean all of us are out in the cold, without influence. As consumers of Google’s services, we have a certain amount of leverage. And this is not just hypothetical — Google’s “don’t be evil” reputation contributes greatly to the value of its brand. The moment people think Google is misbehaving is the moment they’ll consider taking their business elsewhere.

As concerned members of the public — concerned customers, from Google’s viewpoint — there are things we can do to help keep Google honest. First, we can insist on transparency: that Google reveal what it is blocking and why. Rosen describes some transparency mechanisms that are in place, such as Google’s use of the Chilling Effects website.

Second, when we use Google’s services, we can try to minimize our switching costs, so that moving to an alternative service is a realistic possibility. The less we’re locked in to Google’s services, the less we’ll feel forced to keep using them even if the company’s behavior changes. And of course we should think carefully about switching costs in all our technology decisions, even when larger policy issues aren’t at stake.

Finally, we can make sure that Google knows we care about free speech, and about its corporate behavior generally. This means criticizing the company when it slips up, and praising it when it does well. Most of all, it means debating its decisions — which Rosen’s article helpfully invites us to do.

Economic Growth, Censorship, and Search Engines

Economic growth depends on the ability to access relevant information. Censorship prevents access to certain information, but its direct consequences are well known and somewhat predictable. For example, blocking access to Falun Gong literature is unlikely to harm a country’s consumer electronics industry. On the web, however, information of all types is interconnected. Blocking a web page can have an indirect impact reaching well beyond that page’s contents. To understand this impact, let’s consider how search results are affected by censorship.

Search engines keep track of what’s available on the web and suggest useful pages to users. No comprehensive list of web pages exists, so search providers check known pages for links to unknown neighbors. If a government blocks a page, all links from the page to its neighbors are lost. Unless detours exist to the page’s unknown neighbors, those neighbors become unreachable and remain unknown. These unknown pages can’t appear in search results — even if their contents are uncontroversial.
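To make that concrete, here is a toy sketch in Python. The link graph is invented and the breadth-first crawl is a stand-in for a real crawler, but it shows the mechanism: pages are discovered only by following links from pages already known, so blocking one page can hide its otherwise-uncontroversial neighbors.

    # Toy link graph (invented) and a breadth-first crawl over it.
    from collections import deque

    links = {
        "seed": ["news", "banned_page"],
        "news": ["sports"],
        "banned_page": ["recipes"],   # an innocuous page reachable only via the banned one
        "sports": [],
        "recipes": [],
    }

    def crawl(start, blocked=frozenset()):
        """Discover pages by following links, skipping anything the censor blocks."""
        seen, queue = set(), deque([start])
        while queue:
            page = queue.popleft()
            if page in seen or page in blocked:
                continue
            seen.add(page)
            queue.extend(links.get(page, []))
        return seen

    print(crawl("seed"))                           # all five pages are discovered
    print(crawl("seed", blocked={"banned_page"}))  # "recipes" vanishes as well

In the second run, the recipes page never appears even though nothing about it is blocked; the only path the crawler knew about ran through the censored page.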

When presented with a query, search engines respond with relevant known pages sorted by expected usefulness. Censorship also affects this sorting process. In predicting usefulness, search engines consider both the contents of pages and the links between pages. Links here are like friendships in a stereotypical high school popularity contest: the more popular friends you have, the more popular you become. If your friend moves away, you become less popular, which makes your friends less popular by association, and so on. Even people you’ve never met might be affected.

“Popular” web pages tend to appear higher in search results. Censoring a page distorts this popularity contest and can change the order of even unrelated results. As more pages are blocked, the censored view of the web becomes increasingly distorted. As an aside, Ed notes that blocking a page removes more than just the offending material. If censors block Ed’s site because of an offhand comment about Falun Gong, he also loses whatever influence his writing on information security had with readers behind the firewall.
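Here is a similarly toy illustration of the ranking effect, using a simplified PageRank-style score rather than any search engine’s real algorithm; the graph and page names are made up. Censoring one page shifts the relative scores of the remaining pages, including one the censored page never linked to at all.

    # Simplified PageRank-style scoring on an invented link graph.
    def pagerank(links, damping=0.85, iterations=50):
        """Plain power iteration; dangling pages spread their score evenly."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new = {p: (1 - damping) / n for p in pages}
            for p in pages:
                out = [q for q in links[p] if q in links]
                if out:
                    for q in out:
                        new[q] += damping * rank[p] / len(out)
                else:
                    for q in pages:
                        new[q] += damping * rank[p] / n
            rank = new
        return rank

    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "banned": ["A", "B"],   # the page the censor blocks
    }

    full = pagerank(links)
    censored = pagerank({p: t for p, t in links.items() if p != "banned"})

    def shares(rank):
        """Each surviving page's share of the total score, so the two runs are comparable."""
        total = rank["A"] + rank["B"] + rank["C"]
        return {p: round(rank[p] / total, 3) for p in ("A", "B", "C")}

    print("with the banned page:", shares(full))
    print("with it censored:    ", shares(censored))

Page C has no direct connection to the censored page, yet its relative score still changes because its neighbors’ scores do, which is the friend-of-a-friend effect from the analogy above.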

These effects would typically be rare, and they would fall disproportionately on less-popular pages rather than popular ones. Google’s emphasis on the long tail, however, suggests that considerable value lies in providing high-quality results even for those less-popular pages. To avoid these issues, a government could grant a limited set of individuals full web access so they can build tools like search engines, but this approach seems likely to stifle competition and innovation.

Countries with greater censorship might produce lower-quality search engines, but Google, Yahoo, Microsoft, and others can still provide high-quality search results in those countries because they can access uncensored data, mitigating the indirect effects of censorship. This underscores the significance of measures like the Global Network Initiative, whose participants include Google, Yahoo, and Microsoft. Among other things, the initiative provides guidelines for participants on when and how information access may be restricted. The effectiveness of this particular initiative remains to be seen, but such measures may give leading search engines greater leverage to resist arbitrary censorship.

Search engines are unlikely to be the only tools adversely impacted by the indirect effects of censorship. Any tool that relies on links between information (think social networks) might be affected, and repressive states place themselves at a competitive disadvantage in developing these tools. Future developments might make these points moot: in a recent talk at the Center, Ethan Zuckerman mentioned tricks and trends that might make censorship more difficult. In the meantime, however, governments that censor information may increasingly find that they do so at their own expense.

"Censorship" Bill Lifts Ban on Speech

The House has now joined the Senate in passing the Family Movie Act; the Act is almost sure to be signed into law soon by the President. (The Act is bundled with some unrelated provisions into a multi-part bill called the Family Entertainment and Copyright Act. Here I’ll focus only on Section 201, called the Family Movie Act, or “FMA”.)

Some people who haven’t read the FMA, or haven’t thought carefully enough about what it says, decry it as censorship. In fact, it is best understood as an anti-censorship proposal.

The Register, under the headline “Congress legalizes DVD Censorship,” summarizes the FMA as follows:

It will soon become legal to alter a motion picture so long as all the sex, profanity, and violence have been edited out, thanks to a bill called the Family Movie Act…

Let’s look at what the FMA actually says:

[The following is not an infringement of copyright:]

the making imperceptible, by or at the direction of a member of a private household, of limited portions of audio or video content of a motion picture, during a performance in or transmitted to that household for private home viewing, from an authorized copy of the motion picture, or the creation or provision of a computer program or other technology that enables such making imperceptible and that is designed and marketed to be used, at the direction of a member of a private household, for such making imperceptible, if no fixed copy of the altered version of the motion picture is created by such computer program or other technology.

There is nothing here (or elsewhere in the FMA) that says you can only skip the dirty bits. The FMA says that you can skip any portions of the movie you like, as long as the portions you skip are “limited”. You can skip the clean parts if you want, as long as they make up only a limited portion, which may be the case for some movies. If the motion picture has commercials in it, you can skip the commercials. If you don’t like the soccer scenes in “Bend It Like Beckham”, you can watch the movie without them.

The soccer-free version of “Bend It Like Beckham” is speech. The FMA allows that speech to occur by preventing a copyright owner from suing to block it. And the FMA does this in an ideal way, ensuring that the copyright owner of the original work will be paid for the use of their work. That’s the purpose of the “from an authorized copy” and “no fixed copy” language – to ensure that a valid copy of the original work is needed in order to view the new, modified work.

Let’s review. The FMA prevents no speech. The FMA allows more speech. The FMA prevents private parties from suing to stop speech they don’t like. The FMA is not censorship. The FMA prevents censorship.

China Blocks Altavista

The Great Firewall of China is now blocking Altavista too.