September 25, 2020

Cal-Induce Bill Morphs Into Filtering Mandate

A bill in the California state senate (SB 96), previously dubbed the “Cal-Induce Act,” has now morphed via amendment into a requirement that copyright and porn filters be included in many network software programs.

Here’s the heart of the bill:

Any person or entity that [sells, advertises, or distributes] peer-to-peer file sharing software that enables its user to electronically disseminate commercial recordings or audiovisual works via the Internet or any other digital network, and who fails to incorporate available filtering technology into that software to prevent use of that software to commit an unlawful act with respect to a commercial recording or audiovisual work, or a violation of [state obscenity or computer intrusion statutes] is punishable … by a fine not exceeding [$2500], imprisonment … for a period not to exceed one year, or by both …

This section shall not apply to the following:
(A) Computer operating system or Internet browser software.
(B) An electronic mail service or Internet service provider.
(C) Transmissions via a [home network] or [LAN]. [Note: The bill uses an odd definition of “LAN” that would exclude almost all of the real LANs I know. – EF]

As used in this section, “peer to peer file sharing software” means software … the primary purpose of which … is to enable the user to connect his or her computer to a network of other computers on which the users of these computers have made available recordings or audiovisual works for electronic dissemination to other users who are connected to the network. When a transaction is complete, the user has an identical copy of the file on his or her computer and may also then disseminate the file to other users connected to the network.

The main change from the previous version of the bill is the requirement to include filtering technologies; the previous version had required instead that the person “take reasonable care in preventing” bad uses of the software. This part of the bill is odd in several ways.

First, if the system in question uses a client-server architecture (as in the original Napster system), the bill applies only to the client-side software, since only the client software meets the bill’s definition of P2P. Since the bill requires that a filter be incorporated into the P2P software, a provider could not protect itself by doing server-side filtering, even if that filtering were perfectly effective. This bill doesn’t just mandate filtering, it mandates client-side filtering.

Second, the bill apparently requires anyone who advertises or distributes P2P software to incorporate filters into it. This seems a bit odd: advertisers and distributors don’t normally control the design of the products they handle, and third-party advertisers and distributors typically aren’t even allowed to inspect a product’s design.

Third, the “primary purpose” language is pretty hard to apply. A program’s author may have one purpose in mind; a distributor may have another purpose in mind; and users may have a variety of purposes in using the software. Of course, the software itself can’t properly be said to have a purpose, other than doing what it is programmed to do. Most P2P software is programmed to distribute whatever files its users ask it to distribute. Is purpose to be inferred from the intent of the designer, or from the design of the software itself, or from the actual use of the software by users? Each of these alternatives leads to problems of one sort or another.

Note also the clever construction of the P2P definition, which requires only that the primary purpose be to connect the user to a network where some other people are offering files to share. It does not seem to require that the primary purpose of the network be to share files, or that the primary purpose of the software be to share files, but only that the software connects the user to a network where some people are sharing files. Note also that the purpose language refers only to the transfer of audio or video files, not to the infringing transfer of such files; so even a system that did only authorized transfers would seem to be covered by the definition. Finally, note that the bill apparently requires the filters to apply to all uses of the software in question, not just uses that involve networking or file transfer.

Fourth, it’s not clear what the bill says about situations where there is no workable filtering software, or where the only available filtering software is seriously flawed. Is there an obligation to install some filtering software, even if it doesn’t work very well, and even if it makes the P2P software unusable in practice? The bill’s language seems to assume that there is available filtering software that is known to work well, which is not necessarily the case.

The new version of the bill also adds enumerated exceptions for operating system or web browser software, email services, ISPs, home networks, and LANs (though the bill’s quirky definition of “LAN” would exclude most LANs I know of). As usual, it’s not a good sign when you have to create explicit exceptions for commonly used products like these. The definition still seems likely to ensnare new legitimate communication technologies.

(Thanks to Morgan Woodson (creator of an amusing Induce Act Hearing mashup) for bringing this to my attention.)


  1. I liked the “identical copy” requirement, especially since the bill targets “audiovisual” files. Any aspiring not-a-P2P-system would be exempt from the law if it altered just one bit in any file copied.
    Well, every file where it could determine one bit that could be changed without breaking the file… A file with a degree of entropy… Like audio or video. (Or anything with a metadata field like title/artist.)
    If it’s not identical, it’s not P2P. If it’s not P2P, it’s not covered.
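    The one-bit alteration described above really is trivial; here is a minimal Python sketch (the function name and sample data are illustrative, not from any real system) that flips a single low-order bit in a copy of a file’s bytes so the result is no longer bit-for-bit “identical”:

    ```python
    # Hypothetical sketch of the "not identical" loophole described above:
    # flip one low-order bit in the copied data so the copy is no longer
    # bit-for-bit identical to the original.

    def copy_with_one_bit_flipped(data: bytes) -> bytes:
        """Return a copy of `data` with the last byte's low bit inverted."""
        if not data:
            return data
        out = bytearray(data)
        out[-1] ^= 0x01  # for audio/video, this could land in padding or metadata
        return bytes(out)

    original = b"...audiovisual payload..."
    copy = copy_with_one_bit_flipped(original)
    assert copy != original            # not an "identical copy"
    assert len(copy) == len(original)  # same size, essentially the same file
    ```

    For real audio or video formats, the flipped bit would have to be chosen so the file still plays (e.g. in a metadata tag), but the point stands: a one-line transform defeats the bill’s “identical copy” definition.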

  2. Walt R. says:

    Here is another example of the California government prostituting itself to special interests. What happened to “government by the people, for the people”? It was stolen and sold to special interests.

    Walt R.

  3. Gerard points out one loophole. But there is a veritable sieve of holes.
    1) Do not make an identical copy (Gerard)
    2) Write a P2P kernel module
    3) Incorporate the client as an extension to an “Internet browser”
    4) Split any “audiovisual” file into 20 pieces (“but does not include an excerpt consisting of less than substantially all of a recording or audiovisual work”)
    5) Set up a virtual network that routes packets between connected hosts using IPSec or mrouted or IP-IP tunnels (“Internet Service Provider”)

    As Prof. Felten points out, the bill is also so expansive as to cover commercial electronic film distributors or USENET.

    It is the worst of both worlds.
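    Loophole (4) above is similarly mechanical. A hypothetical sketch (function names are illustrative) that splits a work into 20 pieces, so that no single transfer is “substantially all” of it, and then reassembles them:

    ```python
    # Hypothetical sketch of loophole (4): split a work into 20 pieces so
    # that no single transferred piece is "substantially all" of the work,
    # then trivially reassemble it on the receiving end.

    def split_into_pieces(data: bytes, n: int = 20) -> list:
        """Split `data` into at most `n` roughly equal pieces."""
        size = -(-len(data) // n)  # ceiling division
        return [data[i:i + size] for i in range(0, len(data), size)]

    def reassemble(pieces: list) -> bytes:
        return b"".join(pieces)

    work = bytes(range(256)) * 100     # stand-in for an audiovisual file
    pieces = split_into_pieces(work)
    assert len(pieces) == 20
    assert reassemble(pieces) == work  # receiver restores the original
    ```

    Each piece is only a small excerpt under the bill’s quoted definition, yet the receiver recovers the complete work with one concatenation.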

  4. “the primary purpose of which … is to enable the user to connect his or her computer to a network of other computers on which the users of these computers have made available recordings or audiovisual works for electronic dissemination to other users who are connected to the network.”

    They seem to be confusing the whole network thing again, too, which gives a handy get-out clause: the network that P2P software connects to is (in a broad sense) the Internet. As the primary purpose of the Internet is *not* sharing “recordings or audiovisual works” (assuming by that they mean MP3s, TV shows, etc.), it’s all good, and no one needs to bother with ridiculous and unworkable filtering…

    (And on the other hand a TV network is almost precisely their definition of a “P2P network” — it’s designed to allow one user (the TV station) to disseminate audiovisual works.)

  5. Apparently not discussed yet is the fact that this would ban Instant Messenger software.

    Both IM and “file sharing” P2P have their roots in IRC.

    Of course there are loopholes around it, but it would then only apply to Californian companies.

    Also: “Any person or entity that sells, offers for sale, advertises, distributes, disseminates, provides, or otherwise makes available”
    …is the text of the bill. It means that users wouldn’t be liable, only companies that make or distribute it.

    All the “offending” companies need to do is move out of California. That’s the biggest loophole of all.

  6. All this law will do is force hosting companies out of California. Any similar law by the federal government would do the exact same thing at the national level. Just more tech-sector job losses, if you ask me.

  7. Matt St. Peter says:

    A classic end-around argument.

    A key plank in the RIAA case against Sharman is that it possesses filtering controls to keep child porn off its network.

    As soon as you mandate that porn filtering be put into place, you arm the RIAA with the indisputable fact that Sharman can and does control their network. Then the argument that they can and should filter intellectual property of the RIAA follows.

    Believe me, the moral argument is a lot easier to make for the active filtering of porn [think of the children!] than for the much more nebulous active filtering of copyrighted material.

    Incremental gains = victory. Pay close attention.