In the spring of 2012, I attended the memorial service for John McCarthy, a founding father of computer science, at an auditorium on the Stanford campus. Among the great and good anecdotes told about this great and good man was the mention of how McCarthy, around 1961, invented time-sharing, which, as was pointed out, is what is now called cloud computing. The attendees at the memorial service gave small rueful laughs of recognition; other incarnations of the same idea have cropped up repeatedly since the 1960s, among them the client/server architectures of the 1990s, as well as The Long Now Foundation's Danny Hillis's notion of computing as a utility you pull in from the wall.
In 1987, when I wanted my then-boss to pay for a kind of gray-market Internet access (I had hunted down a Net route-around, since in those explicitly non-commercial days only academics and government were supposed to have Net access), he harrumphed and said, "I don't want to pay for your hanging out online and flirting." Even then, wasting time on screens was a known Thing.
Before there were the Arab Spring and Twitter, there were Fidonet and Serbia. Before there was Facebook, there were Usenet, CompuServe, PLATO, DECnet, and Minitel. 1960s-era FCC Commissioner Nicholas Johnson often warned about the loss of privacy once big databases would be tracking and storing information about everyone, and sharing that information with one another. He was also concerned about the loss of privacy once video would make it possible to monitor people in public places.
For as long as there have been electronic computation and communications, humans have done what they have always done; their desires and drives are eternal. People want to communicate and flirt; businesses want to find competitive advantages; governments want to keep track of internal and external threats. Technology and its implementations may change, but people do not.