Archives for October 2002

How Much Progress?

Dan Gillmor quotes Ray Kurzweil as saying that:

The rate of change … is accelerating exponentially. We are “doubling the paradigm shift rate” on a constant basis. This century will be the equivalent to 20,000 years of progress at today’s rate, and people don’t appreciate the implications of this.

I have to admit that this 20,000-years-of-progress claim sounded roughly plausible to me at first. Ted Shelton had the same reaction. But even a little bit of number-crunching shows that Kurzweil must be wildly wrong.

I’m not precisely sure what Kurzweil means by “progress,” but in light of the talk about paradigm shifts, it seems reasonable to assume that “progress” has something to do with the advancement of human knowledge, understanding, or well-being.

Kurzweil says that progress advances exponentially, which seems to be a reasonable assumption. But how fast does the exponential rise? Kurzweil’s “20,000 years” claim turns out, through the magic of logarithms, to imply a 7% annual growth rate, that is, 7% more progress each year than the year before, with the increases compounding over time. That translates to a doubling in human progress every ten years.
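The "magic of logarithms" here is just solving for the growth rate. A quick back-of-the-envelope check (my own sketch, not Kurzweil's math) models cumulative progress over the century as the integral of e^(rt), measured in units of one year of progress at today's rate, and solves (e^(100r) - 1)/r = 20,000 for r by bisection:

```python
import math

def total_progress(r, years=100):
    # Cumulative progress over `years`, in units of
    # one year of progress at today's rate, if the
    # instantaneous rate grows like e^(r*t).
    return (math.exp(r * years) - 1) / r

# Bisect for the rate r that makes a century equal
# 20,000 years of progress at today's rate.
lo, hi = 0.01, 0.2
while hi - lo > 1e-10:
    mid = (lo + hi) / 2
    if total_progress(mid) < 20000:
        lo = mid
    else:
        hi = mid

r = (lo + hi) / 2
doubling_time = math.log(2) / r
print(f"implied annual growth rate: {r:.1%}")
print(f"implied doubling time: {doubling_time:.1f} years")
```

Running this gives a rate a bit above 7% and a doubling time just under ten years, matching the figures in the text.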

That just can’t be right. For one thing, it implies that the amount of human progress between 1,000,000 B.C. and 1992 A.D. is equal to the amount of progress between 1992 and 2002. By any reasonable definition of human progress, things can’t be advancing nearly as fast as Kurzweil claims.
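The all-of-history-in-one-decade implication follows directly from the doubling: in a geometric series 1 + 2 + 4 + … + 2^(n-1), the sum of every term is one less than the next term, so each new decade would contain as much progress as everything that came before it. A small illustration (my own, with an arbitrary 30-decade horizon):

```python
# If progress doubles every decade, each decade's progress is 2^k
# (in arbitrary units). The most recent decade then matches all
# prior decades combined, since 1 + 2 + ... + 2^(n-1) = 2^n - 1.
decades = 30  # any horizon long enough to show the pattern
per_decade = [2 ** k for k in range(decades)]
before_last = sum(per_decade[:-1])
last = per_decade[-1]
print(last, before_last)  # the two totals differ by exactly 1
```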

It’s surprising that a guy as smart as Kurzweil made this kind of mistake. In retrospect, I’m surprised that the claim sounded plausible to me and Gillmor and Shelton. I guess people are not very good at thinking about exponentials.

Too Stupid to Look the Other Way

David Weinberger explains the value of “leeway,” or small decisions not to enforce the rules in cases where enforcement wouldn’t be reasonable.

Imagine that your mother were visiting your apartment, and she got sick, so you let her stay overnight because she wasn’t well enough to travel home. If this happened, no reasonable landlord would enforce a no-overnight-guests rule against you. Weinberger says:

Leeway is the only way we manage to live together: We ignore what isn’t our business. We cut one another some slack. We forgive one another when we transgress.

By bending the rules we’re not violating fairness. The equal and blind application of rules is a bureaucracy’s idea of fairness. Judiciously granting leeway is what fairness is all about. Fairness comes in dealing with the exceptions.

And there will always be exceptions because rules are imposed on an unruly reality. The analog world is continuous. It has no edges and barely has corners. Rules at best work pretty well. That’s why in the analog world we have a variety of judges, arbiters, and referees to settle issues fairly when smudgy reality outstrips clear rules.

The problem, Weinberger says, is that computers don’t give leeway. Would the computer toss your sick mother out on the street, or cancel your lease because you let her stay?

Of course, you can always change the rules to add exceptions, such as a sick-mother allowance. Doing this would cover some cases, but you would be left with a more complex set of rules that was still enforced inflexibly. You can change the rules, but you can’t teach a computer to give leeway.

Weinberger goes on:

Which brings us to “digital rights management” which implements in code a digital view of rights. Yes, vendors and users should have a wide variety of agreements possible, but the nature of those agreements is necessarily digital….

If we build software that enables us to “negotiate” usage rules with content providers, the rules can be as favorable as we’d like but their enforcement will necessarily be strict, literal and unforgiving. Binary, not human.

DRM raises very difficult leeway issues. Fair use is an officially sanctioned leeway mechanism, designed to prevent enforcement of certain rules when the particular circumstances would make enforcement unwise. Fair use is just the kind of subtle and context-dependent leeway mechanism that computers can’t handle.

Weinberger’s message can be summed up in a quote attributed to him by Jon Udell:

That’s the problem with DRM. Computers are too stupid to look the other way.

Wiley’s Super-Worm

Brandon Wiley writes about the possibility of a “super-worm” that would use sophisticated methods to infect a large fraction of Internet hosts, and to maintain and evolve the infection over time. This is scary stuff. I have two comments to add.

First, the worst case is probably even worse than Wiley suggests. His paper may only scratch the surface of what a really sophisticated bad guy could do.

Second, Wiley’s paper points out the double-edged nature of basic security technology. The methods we use to protect ourselves against attacks – encryption, redundancy, decentralization, code patching – are the same methods that Wiley’s bad guy would use to protect himself against our counterattacks. To counterattack, we would need to understand the flaws in these methods, and to know how to attack them. If we ban or stigmatize discussion of these flaws, we put ourselves at risk.