March 28, 2024

Online Porn Issue Not Going Away

Adam Thierer at Technology Liberation Front offers a long and interesting discussion of the online porn wars, in the form of a review of two articles by Jeffrey Rosen and Larry Lessig. I’ve been meaning to write about online porn regulation for a while, and Thierer’s post seems like a good excuse to address that topic now.

Recent years have seen a series of laws aimed at restricting minors’ access to porn, such as the Communications Decency Act (CDA) and the Child Online Protection Act (COPA), which have been the subject of several important court decisions. These cases have driven a blip of interest in, and commentary on, online porn regulation.

The argument of Rosen’s article is captured in its title: “The End of Obscenity.” Rosen argues that it’s only a matter of time before the very notion of obscenity – a word which here means “porn too icky to receive First Amendment protection” – is abandoned. He makes a two-part argument for this proposition. First, he argues that the Miller test – the obscenity-detection rule decreed by the Supreme Court in the 1970s – is no longer tenable. Second, he argues that porn is becoming socially acceptable. Neither claim is as strong as Rosen suggests.

The Miller test says that material is obscene if it meets all three of these criteria: (1) the average person, applying contemporary community standards, would find it is designed to appeal to the prurient interest; (2) it depicts [icky sexual stuff]; and (3) taken as a whole, it lacks serious literary, artistic, scientific, or political value.

Rosen argues that the “community standards” language, which was originally intended to account for differences in standards between, say, Las Vegas and Provo, no longer makes sense now that the Internet makes the porn market international. How is an online porn purveyor to know whether he is violating community standards somewhere? The result, Rosen argues, must be that the most censorious community in the U.S. will impose its standards on everybody else.

The implication of Rosen’s argument is that, for the purposes of porn distribution, the whole Internet, or indeed the whole nation, is essentially a single community. Applying the standards of the national community would seem to solve this problem – and the rest of Rosen’s essay supports the notion that national standards are converging anyway.

The other problem with the Miller standard is that it’s hopelessly vague. This seems unavoidable with any standard that divides obscene from non-obscene material. As long as there is a legal and political consensus for drawing such a line, it will be drawn somewhere; so at best we might replace the Miller line with a slightly clearer one.

Which brings us to the second, and more provocative, part of Rosen’s essay, in which he argues that community standards are shifting to make porn acceptable, so that the very notion of obscenity is becoming a dinosaur. There is something to this argument – the market for online porn does seem to be growing – but I think Rosen goes too far. It’s one thing to say that Americans spend $10 billion annually on online porn, but it’s another thing entirely to say that a consensus is developing that all porn should be legal. For one thing, I would guess that the vast majority of that $10 billion is spent on material that is allowed under the Miller test, and the use of already-legal material does not in itself indicate a consensus for legalizing more material.

But the biggest flaw in Rosen’s argument is that the laws at issue in this debate, such as the CDA and COPA, are about restricting access to porn by children. And there’s just no way that the porn-tolerant consensus that Rosen predicts will extend to giving kids uncontrolled access to porn.

It looks like we’re stuck with more or less the current situation – limits on porn access by kids, implemented by ugly, messy law and/or technology – for the foreseeable future. What, if anything, can we do to mitigate this mess? I’ll address that question, and the Lessig essay, later in the week.

Comments

  1. The Least Objectionable Content Labeling System

    Today I’ll wrap up Vice Week here at Freedom to Tinker with an entry on porn filtering. On Monday I agreed with the conventional wisdom that online porn regulation is a mess. On Tuesday I wrote about what my wife and I do in our home to control underag…

  2. Very little money is spent protecting children from downloading porn, while enormous sums are spent on family assistance, at all levels of government. The spending levels of the two are utterly incomparable.

  3. I have always found it deeply disturbing that we can, as a nation, spend so much to protect children from sexual abuse but can tolerate that so many children are homeless, abused, beaten, starved, and without basic care.

    I would like to see some of the monies that go to protecting children from downloading porn redirected into protecting families from living in the street.

    Especially since homelessness, with aggressive federal housing and intervention programs, was nearly beaten in the 60s and 70s, but was massively increased through active policy choices in the 80s.

    I mean – seeing people have sex, or living a life of deprivation on the streets: which of these should we invest in preventing? Which is worse?

  4. I won’t take a strong position on the porn issue itself, but I am strongly against allowing libraries or public networks to censor the material viewable by American adults.

  5. Censorship and Internet Porn

    Ed Felten is blogging today on obscenity laws and the possibility of regulating online pornography, and in particular access to it by chil…

  6. Seriously, most primetime television fails to meet the Miller test. Parts 1 and 3 are self-evident, and part 2 is just a semantic matter (what counts as a “depiction”).

  7. Isn’t protecting children from seeing porn primarily a job for the parents?

    I forget who said it, but “the internet sees censorship as damage, and routes around it” still seems to apply.

    IMHO, any attempt short of a police state will fail to contain online porn; there’s too much money to be made.

    People who think that you can make children (or adults) virtuous by removing temptation have a surprise coming.

    Just like the people who thought that withholding sex-ed and making contraceptives unavailable would stop minors from experimenting with sex – resulting in the US having the highest rate of teenage childbirth in the developed world (twice as high as in the runner-up, the UK).

  8. FYI, a small correction to your article, from http://www.wireniusreport.net/overview.html

    “An Overview of Nitke v. Ashcroft

    The theory underlying the case is based on an inconsistency between the definition of “obscenity” as speech outside of the protection of the First Amendment and the Court’s recent articulation of its understanding of the rules of free speech as applicable to the Internet. The Communications Decency Act of 1996 (known generally as the “CDA”), 47 U.S.C.§ 223(a), et seq., was passed as part of a comprehensive act regulating telecommunications, and governed both “indecent” materials and “obscene” materials on the Internet. The CDA effectively bans such materials, and allows for federal criminal enforcement of the ban. The “indecency” provisions were struck down by the Supreme Court in Reno v. American Civil Liberties Union, 521 U.S. 844 (1997). In Reno, the Supreme Court did not, contrary to popular understanding, completely gut the CDA. Rather, the Court found that “obscene speech … can be banned totally because it enjoys no First Amendment protection.” This led it to “sever” (exempt from the finding of unconstitutionality) the CDA provision banning obscenity on the Internet from the remainder of the statute. This finding had the effect of leaving the obscenity section of the CDA in effect. 521 U.S. at 883. The Court relied upon its own prior approval of a ban on obscenity set forth in Miller v. California, 413 U.S. 15, 18 (1973). In Miller, obscenity was defined as materials which (1) excite the prurient interest of the audience; (2) are patently offensive under local community standards; and (3) are lacking in serious literary, artistic or political social value (the so-called “SLAPS” test).”

  9. People interested in issues of Internet censorship and community standards might want to read my now-released work for the Nitke v. Ashcroft case:

    Nitke v. Ashcroft : Seth Finkelstein expert witness report
    http://sethf.com/nitke/ashcroft.php

    “I. Opinion of Witness with Basis and Reasons Therefore

    A provider of content via the Internet cannot reasonably be expected to know the location of readers, if the context is one in which location would lead to a denial of the ability to read the content.”