According to a BBC story, Greek police have stepped up the pace of arrests in enforcing a new Greek law banning all computer games. Many Internet cafes have been shut down.
[Link credit: disLEXia]
Today on Fritz’s Hit List: Big Mouth Billy Bass.
That’s right, your favorite wall-hanging, singing, dancing, animatronic fish qualifies for regulation as a “digital media device” under the Hollings CBDTPA. If the CBDTPA passes, any new Billy Bass will have to incorporate government-approved copy restriction technology.
Fight piracy – regulate singing fish novelties!
What is Fritz’s Hit List?
Most readers have probably heard me, or someone like me, say that the Hollings CBDTPA has far-reaching effects – that it would regulate virtually all digital devices, including many that have nothing at all to do with copyright infringement. Though this argument is right, it is too abstract to capture the full absurdity of the CBDTPA’s scope.
To foster reasoned debate on this topic, I’m inaugurating a new daily feature here at freedom-to-tinker.com, called “Fritz’s Hit List.” Each entry will give an actual example of a device that would meet the CBDTPA’s definition of “digital media device” and would thereby fall under the heavy hand of CBDTPA regulation.
I’ll post a new example every weekday for as long as I can keep it up. Please email me if you want to suggest an example. (I have plenty of good ones in the queue already, but your suggestions may be better than mine.)
Rep. Billy Tauzin is circulating a draft of a bill that would restrict digital technology. One effect of the bill would be to mandate “broadcast flag” technology.
The bill has not yet been introduced.
The term “trusted system” is often used in discussing Digital Rights/Restrictions Management (DRM). Somehow the “trusted” part is supposed to make us feel better about the technology. Yet often the things that make the system “trusted” are precisely the things we should worry about.
The meaning of “trusted” has morphed at least twice over the years.
“Trusted system” was originally used by the U.S. Department of Defense (DoD). To DoD, a “trusted system” was any system whose security you were obliged to rely upon. “Trusted” didn’t say anything about how secure the system was; all it said was that you needed to worry about the system’s level of security. “Trusted” meant that you had placed your trust in the system, whether or not that trust was ill-advised.
Since trusted systems had more need for security, DoD established security criteria that any system would (theoretically) have to meet before being used as a trusted system. Vendors began to label their systems as “trusted” if those systems met the DoD criteria (and sometimes if the vendor hoped they would). So the meaning of “trusted” morphed, from “something you have to rely upon” to “something you are safe to rely upon.”
In the 1990s, “trusted” morphed again. Somebody (perhaps Mark Stefik) realized that they could make DRM sound more palatable by calling it “trusted.” Where “trusted” had previously meant that the system’s owner could rely on the system’s behavior, it now came to mean that somebody else could rely on its behavior. Often it meant that somebody else could force the system to behave contrary to its owner’s wishes.
Today “trusted” seems to mean that somebody has some kind of control over the system. The key questions to ask are who has control, and what kind of control they have. Depending on the answers to those questions, a “trusted” system might be either good or bad.
Larry Lessig and I had a brief blog-discussion last week about the meaning of the end-to-end principle(s), and how end-to-end applies to DRM. The discussion continued off-line, and we ended up in pretty close agreement. Here is my version of what we agree on:
(1) End-to-end is not a single principle, but a cluster of related principles. Some are engineering principles, and others are policy/economic principles. It is good to be clear about what version of end-to-end you are using.
(2) The MPAA/Hollings approach does harm by forcing all computers to implement certain functions, even though those functions are not needed by all law-abiding network users. This violates the engineering end-to-end principle that says functions should not be required everywhere in the network unless all users need them (a small sketch illustrating this idea follows the list).
(3) The MPAA/Hollings approach does even more harm by forbidding a great many non-infringing functions from being implemented at all. This offends both the engineering and the policy versions of the end-to-end principle, all of which favor giving end users flexibility in how they use the network.
(4) DRM is generally a bad idea, but some DRM systems are worse than others.
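To make the engineering version of the principle in point (2) concrete, here is a minimal sketch of my own (not part of the discussion with Larry): an integrity check lives at the two endpoints that actually care about it, while nodes in the middle just forward bytes and are not required to implement anything extra. The function names and the relay chain are hypothetical, purely for illustration.

```python
import hashlib

def send(payload: bytes) -> tuple[bytes, str]:
    # The sending endpoint attaches a checksum it computes itself.
    return payload, hashlib.sha256(payload).hexdigest()

def relay(payload: bytes) -> bytes:
    # An intermediate node is not required to understand or verify
    # the checksum; it simply forwards the data unchanged.
    return payload

def receive(payload: bytes, checksum: str) -> bytes:
    # The receiving endpoint, which actually needs correctness,
    # verifies the checksum end to end.
    if hashlib.sha256(payload).hexdigest() != checksum:
        raise ValueError("payload corrupted in transit")
    return payload

if __name__ == "__main__":
    data, digest = send(b"hello")
    data = relay(relay(data))     # any number of "dumb" relays in between
    print(receive(data, digest))  # b'hello'
```

The point of the sketch is that only the endpoints implement the checking function; a mandate in the style of the CBDTPA would instead force every node, including the relays, to carry functionality that most of them never need.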