Today I’ll wrap up Vice Week here at Freedom to Tinker with an entry on porn labeling. On Monday I agreed with the conventional wisdom that online porn regulation is a mess. On Tuesday I wrote about what my wife and I do in our home to control underage access to inappropriate material. Today, I’ll suggest a public approach to online porn that might possibly do a little bit of good. And as Seth Finkelstein (a.k.a. Eeyore, a.k.a. The Voice of Experience) would probably say, a little bit of good is the best one can hope for on this issue. My approach is similar to one that Larry Lessig sketched in a recent piece in Wired.
My proposal is to implement a voluntary labeling scheme for Web content. It’s voluntary, because we can’t force overseas sites to comply, so we might as well just ask people politely to participate. Labeling schemes tend not to be adopted if the labels are complicated, or if the scheme requires all sites to be labeled. So I’ll propose the simplest possible labels, in a scheme where the vast majority of sites need no labels at all.
The idea is to create a label, which I’ll call “adultsonly” (Lessig calls it “porn” but I think that’s imprecise). Putting the adultsonly tag on a page indicates that the publisher requests that the page be shown only to adults. And that’s all it means. There’s no official rule about when material should be labeled, and no spectrum of labels. It’s just the publisher’s judgment as to whether the material should be shown to kids. You could label an entire page by adding to it an adultsonly meta-tag; or you could label a portion of a page by surrounding it with “adultsonly” and “/adultsonly” tags. This would be easy to implement, and it would be backward compatible since browsers ignore tags that they don’t understand. Browsers could include a kids-mode that would hide all adultsonly material.
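To make the mechanics concrete, here is a minimal sketch of what that markup might look like. The adultsonly tag and meta-tag names are hypothetical labels from the proposal above, not part of any existing standard, and no current browser recognizes them:

```html
<!-- Whole-page labeling: a hypothetical adultsonly meta-tag in the head. -->
<head>
  <meta name="adultsonly" content="true">
</head>

<!-- Partial-page labeling: wrap only the material the publisher asks
     to be shown to adults. Browsers that don't recognize the tag
     ignore it and render its contents normally; a browser's kids-mode
     could hide everything inside it. -->
<p>General-audience text needs no label at all.</p>
<adultsonly>
  <p>Material the publisher requests be shown only to adults.</p>
</adultsonly>
```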
But where, you ask, is the incentive for web site publishers to label their racy material as adultsonly? The answer is that we create that incentive by decreeing that although material published on the open Internet is normally deemed as having been made available to kids, any material labeled as adultsonly will be deemed as having been made available only to adults. So by labeling its content, a publisher can ensure that the content’s First Amendment status is determined by the standard obscenity-for-adults test, rather than the less permissive obscenity-for-kids test. (I’m assuming that such tests will exist and their nature will be determined by immovable politico-legal forces.)
This is a labeling scheme that even a strict libertarian might be able to love. It’s simple and strictly voluntary, and it doesn’t put the government in the business of establishing fancy taxonomies of harmful content (beyond the basic test for obscenity, which is in practice unchangeable anyway). It’s more permissive of speech than the current system, at least if that speech is labeled. This is, I think, the least objectionable content labeling system possible.
I like Ed’s idea of a tag that can be used for a portion of text – instead of blocking the entire webpage. The tag should have an “alt” attribute.
Or there should be a related tag called kidsonly, which would be used in conjunction with the adultsonly tag.
This would allow designers to customize their sites to be kid friendly. Great for content that is “adult” but not pornographic.
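A rough sketch of how that pairing might look, using purely hypothetical tag and attribute names (nothing here is defined by any existing HTML standard):

```html
<!-- Hypothetical "alt" attribute: text a kids-mode browser could
     display in place of the hidden material. -->
<adultsonly alt="This section is available to adult readers only.">
  <p>Material intended for adults.</p>
</adultsonly>

<!-- Hypothetical companion tag: shown only when the browser is in
     kids-mode, so designers can offer a kid-friendly substitute. -->
<kidsonly>
  <p>A kid-friendly summary or alternative goes here.</p>
</kidsonly>
```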
I can’t help wondering if we’re starting with the wrong assumptions. In my experience adults assume their communication is going to be received by adults unless they are explicitly aware that a child may be present. Watch a group of young parents conversing and you’ll see how much of an effort they have to make to censor their topics and language, and how often they slip up.
I don’t believe that anyone who publishes on the Internet is any different. Whether you are a commercial porn supplier, advertising a business or writing a travelogue, you will tend to make assumptions about your audience based on your intended target group – and that will often exclude children.
Given this assumption I would rather suggest a “kidsafe” opt-in in preference to an “adultonly” opt-out. Any publisher who has taken the time to think of the possibility of underage readers and to consider their content in that light can label their pages “kidsafe”. The legal implications of a kidsafe opt-in are that any publisher who labels their content kidsafe needs to be able to defend that position.
Although the “real world” tends to lean towards labelling content that isn’t child friendly, there are certain fundamental differences in Internet publishing that cannot be overlooked; in particular the accessibility of the medium (allowing anyone to publish content) and the lack of legal awareness of your average publisher (compared with, say, the book publishing industry). This distinction suggests a special need to protect Internet publishers.
Such protection, in my opinion, is best accomplished by a legislated assumption that the Internet is a medium of free speech and that no information communicated on the Web is trustworthy (accurate or suitable for a particular group or purpose) unless explicitly labelled as such.
It will be a sad day when you need to wrap your site or statements in a disclaimer (or label) to ensure it is protected under free speech laws. But then, this may already have happened.
I think you’re overlooking an important class of content that would fall through the cracks in a tagging proposal: sexual education sites for teenagers (e.g. sites that try to educate gay teens so they don’t commit suicide). They obviously can’t use tags because their content isn’t intended for adults, but at the same time there are enough places and people hostile to sexual education that they can face considerable legal risk. Note that, ironically, these sites are *already* blocked in many high schools by filtering systems. I think a tagging proposal for adult content would only further isolate these sorts of sites and make it harder for them to continue their public service.
Mike Liveright Writes:
“The compromise that I would suggest is that any site labeled AdultOnly would be liable only to documented “national” restrictions, not local ones…
As I understand it, one of the major problems with pornography laws is that they refer to community standards, so a site can be sued by a locality if it does not satisfy that locality’s standards.”
Making the site subject only to a national standard would not help. The way the Ashcroft Justice Department currently prosecutes pornography cases is by jurisdiction shopping: it files in the least tolerant jurisdiction.
There is no national standard for the “community standards” that define “obscenity,” so it is up to the jury in whatever federal court jurisdiction the Justice Department files the case to make that determination. Thus, merely tying an Adult flag to a national standard does not protect against intolerant local community standards.
The federal government is currently prosecuting people for producing consensual adult videos and selling them to adults. The PBS documentary “American Porn” provides an excellent overview of this renewed federal agenda.
The compromise that I would suggest is that any site labeled AdultOnly would be liable only to documented “national” restrictions, not local ones…
As I understand it, one of the major problems with pornography laws is that they refer to community standards, so a site can be sued by a locality if it does not satisfy that locality’s standards.
If we had a law stating that a site carrying an AdultOnly id, or perhaps living in the ????.sex domain, could be filtered out by individuals as they prefer, then such a site would be liable only to documented national standards. This would provide a “reason” for a site to label itself.
That’s exactly how I’d expect a browser mandate to work — here are flags (AdultsOnly, say) that the browser must detect and respond to. If the webcasting provisions of the WIPO Broadcasters Treaty pass, I expect that the flags that browsers are mandated to respond to would expand to include NoCopy, NoPrint, NoSave.
As bad as these restrictions might be, the real harm arises with the requirement that browser implementations be tamper-resistant (“robust” as it is called in DRM circles) — which is to say, not free software. We already have a Broadcast Flag mandate that bans open source and free software from touching a DTV signal or the saved programs it carries, and requires component vendors (e.g. HDD and video-card manufacturers) to implement hardware spoilers to prevent reverse-engineering by free software authors.
If there were a browser mandate to support detection and response to ANY flag, that mandate would likely include a “robustness” rule that would preclude free software.
I’m not sure how a browser mandate would work. There is no chance that it would be legal to require that all browsers refuse to view adult material. Those kinds of laws have been repeatedly overturned. The greater danger is that (American) sites would be required to self-label if they had indecent content.
By analogy with the V chip, the most I’d expect on the browser side is a mandate that it support filtering, so parents could set limits on the permitted labels of visited sites. I don’t think it makes much sense to impose a robustness requirement in that context.
I’ve heard that TV support lines never get calls from parents wanting to know how to program the V chip to limit their kids’ viewing. It’s always parents saying that their kids were playing with the set and changed the V chip settings, so now the parents can’t watch their own shows. They’re calling to find out how to turn the chip off.
I follow that, Ed — my question was, “If we create a porn flag, why shouldn’t we expect a porn flag mandate to follow it?”
Such a mandate might say that all browsers have to detect and respond to the flag if asked to (I’m reasonably OK with this so far), but the danger comes in with the “robustness” requirement that might follow this — browsers must detect and respond to the flag AND must resist attempts to modify them so that they don’t detect and respond to the flag.
The existence of content flags seems to me to cry out for a mandate. Maybe it’s just because I’ve been fighting the Broadcast Flag so long, but I’m very nervous that this is setting out an architectural foundation for supporting control mechanisms (like the 8 bits in ATSC) and that the outcome might be that a control mechanism emerges.
PICS is a standard way of expressing content ratings, but doesn’t define any ratings vocabularies.
For those, see RSACI/ICRA and SafeSurf.
As several other people have already mentioned, this will only work for responsible porn purveyors. There are whole classes of porn sites this won’t ever work with — the typo sites, the expired-domain extortionists, etc.
Since no one else mentioned it, I thought I would point out W3C PICS (see also REC-PICS-labels) which attempted to solve this problem (among others). It’s pretty simple to embed this in HTML documents, e.g. see PICS meta tag examples. I believe there is even vendor support (e.g. Internet Explorer) for this W3C Recommendation, e.g. see the “Ratings” panel of the “Preferences” dialog in IE5/Mac.
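For anyone curious what that embedding looks like, a PICS label goes in an http-equiv meta tag roughly as sketched below. The rating-service URL, target URL, and the n/s/v/l values follow the old RSACi vocabulary and are illustrative only:

```html
<!-- A PICS-1.1 label in the document head. The service URL, target URL,
     and rating values (nudity, sex, violence, language) are examples. -->
<meta http-equiv="PICS-Label" content='(PICS-1.1
  "http://www.rsac.org/ratingsv01.html"
  l gen true
  for "http://www.example.com/"
  r (n 0 s 0 v 0 l 0))'>
```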
Quick story re porn sites:
A friend of mine runs an Evangelical youth group in Post Falls, Idaho. His youth group is involved with a regular Christian youth conference called “Teen Extreme” (the “extreme” refers to extreme sports – the event apparently involves BMX biking), whose web-site is located at teenextreme.org. Naturally, there was a porn site with a similar name (I believe it was teenextreme.net). Anyway, a lot of kids looking for this Christian youth conference were getting to this porn site by mistake. The organizers of the event contacted the owners of the porn site and explained the issue to them. The owners not only voluntarily moved their site, but actually gave their domain to the Teen Extreme conference people, free of charge.
Now, the point of the story is that, as Professor Felten points out, people in the porn industry generally understand that American culture doesn’t have a big problem with them, as long as they don’t try to draw in underage kids, and don’t show content to adults who don’t want to see it. As long as everyone who is looking at the site is of age and is there on purpose, no one (except possibly John Ashcroft, who is a fascist anyway) is going to come after them. As a result, it is definitely in the interest of the “corporate” porn sites, which want to be able to operate legally in the United States, not to upset people by showing content to minors (or having URLs similar to those of Christian youth events). For that reason I think Professor Felten’s scheme would work for these porn sellers.
However, as Jay Gischer pointed out, there is another class of porn web-site: those who are illegal and know they are illegal, and create popups and so forth so those poor schmucks using web-browsers without popup blocking (*cough*Microsoft*cough*) (yes, I know, IE post SP2 does have popup blocking) are always seeing explicit images against their will. This won’t do anything to solve that problem, and I don’t know what will, except lawsuits (which is too bad).
Ed Felten,
I think I understand your premise. Unfortunately, we have seen how Hollywood has worked with this kind of labeling. The NC-17 label was intended as an adults-only label, not necessarily a pornographic one, but it has become a hot topic. Rather than merely being a designator of materials not intended for children, this label has become a litmus test for newspapers, theaters, video rental chains and e-commerce providers (*cough*PayPal*cough*) who will automatically reject any content with the NC-17 label. Thus, the NC-17 label is just a scarlet letter of a lighter shade.
Likewise, I think your adult label would meet a similar fate. And your proposal still wouldn’t stop the already illegal porn spam in my e-mail box.
I think doing the opposite, labeling some sites kid-safe, is the better opt-in route.
Note that the “NSFW” acronym (Not Safe For Work) is used already in a similar way to label links on many link blogs (BoingBoing, MetaFilter, and many others), but as human-readable text rather than machine-readable tags. I’ve also seen “SFW” to label things that might otherwise be assumed to be unsafe, but I don’t think that’s applicable in the case of machine-readable tags.
PrivacyWatch,
I understand your concern that the label might be taken as an admission that the labeled material is somehow harmful. That’s why I defined the label in the way I did. It’s not a “porn” label but an “adultsonly” label; and the criterion for attaching the label to material is based not on attributes of the material itself, but merely on the publisher’s desire that the material not be shown to children. By defining things this way, we allow an accused party to argue that, even though they were not compelled to label their material, they chose to do so because of their extreme sensitivity to parental values.
Note also that censorious organizations will be invited to use the label liberally on their own sites to protect the delicate sensibilities of their readers, thereby providing examples of labeled material that isn’t obscene.
To make this plan work, the label must be defined and used in a purely voluntary fashion, so that it can take on the desired meaning by custom. The plan doesn’t work if the label is pre-defined by the government as a “this material is obscene” flag.
Cory,
I think the difference between our positions boils down to our starting assumptions. My assumption is that we’re going to have laws banning distribution of obscene material, and that we’re going to have even more restrictive laws banning distribution of obscene-for-kids material to kids.
To put it another way, I’m assuming that the law will create a category of “adults only” material that can be distributed to adults (because it is not obscene-for-adults) but can’t be distributed to kids (because it’s banned and is obscene-for-kids). Given that assumption, I think it’s better to have a simple labeling scheme than to not have one, because not having one will require publishers to withhold adults-only material for fear of being prosecuted for distributing it to kids.
It seems to me that quite a lot of porn sites these days (in my, ahem, limited experience) have a thing on their front page saying “If you’re under 18 click here [and get sent to Disney.com or somewhere innocent], or if you’re 18 and over click here [to get to the hot chix]”, so I can see them voluntarily adding an over-18 flag with no worries at all.
While this is true, there’s another class of porn purveyors who will go to great lengths to interrupt otherwise innocent websurfing with ads for their website. They will put hundreds of non-porn related words in their home pages, just to make these pages show up on web search engines.
These guys are pernicious weeds, the crabgrass in your lawn that you can never be rid of. They would never agree to label themselves.
It seems to me that quite a lot of porn sites these days (in my, ahem, limited experience) have a thing on their front page saying “If you’re under 18 click here [and get sent to Disney.com or somewhere innocent], or if you’re 18 and over click here [to get to the hot chix]”, so I can see them voluntarily adding an over-18 flag with no worries at all. After all, they don’t want to get prosecuted, and there’s not enough money to be made off kids to make it worth the hassle of not taking precautions.
Another objection to a porn flag mandate is that John Ashcroft is currently pursuing an anti-porn agenda and is prosecuting people for “indecency” (à la Larry Flynt) as we speak. Ashcroft (the man who was so offended by the naked breast of Lady Justice that he spent $10,000 in taxpayer funds to put a drape over the decades-old sculpture in the Justice Department building) has promised that anyone distributing porn could be the next target. So far, he is going after hardcore porn. (Basically, any hardcore porn qualifies for prosecution by Ashcroft. Hardcore may loosely be classified as having any of the three E’s: erection, entry or emission. Though I’d bet Ashcroft is willing to go after softcore next; I doubt he approves of any nakedness in any context.)
The porn flag will be a great way to pick the next target. The presence of the porn flag can be used in prosecutions as proof that the website knew it had “illegal” porn.
I think I see the point of this, but my concern is that if “architecture is politics” then a porn-flag lays out the architecture for a porn-flag mandate.
I can imagine that the ATSC people argued that their hands were clean of any mandates that arose from the 8 bits reserved for “content redistribution marking.” They could have been used by rightsholders to signal that they *wished* that their materials would not be redistributed, but they were just as amenable to forming the basis for the Broadcast Flag mandate.
Indeed, when we object to DRM markers in standards and specifications like the DVB Forum’s international Broadcast Flag, the advocates say that they are merely enabling the machine-readable description of a rightsholder’s wishes — whether that description forms the basis of a mandate is not a foregone conclusion on the basis of a description alone.
So while I take your point here, I remain skeptical. If we create an architectural foundation for control, don’t we invite control itself?