
Archives for 2009

Being Acquitted Versus Being Searched (YANAL)

With this post, I’m launching a new, (very) occasional series I’m calling YANAL, for “You Are Not A Lawyer.” In this series, I will try to disabuse computer scientists and other technically minded people of some commonly held misconceptions about the law (and the legal system).

I start with something from criminal law. As you probably already know, in the American criminal law system, as in most others, a jury must find a defendant guilty “beyond a reasonable doubt” to convict. “Beyond a reasonable doubt” is a famously high standard, and many guilty people are free today only because the evidence against them does not meet this standard.

When techies think about criminal law, and in particular crimes committed online, they tend to fixate on this legal standard, dreaming up ways people can use technology to inject doubt into the evidence to avoid being convicted. I can’t count how many conversations I have had with techies about things like the “open wireless access point defense,” the “trojaned computer defense,” the “NAT-ted firewall defense,” and the “dynamic IP address defense.” Many people have talked excitedly to me about tools like TrackMeNot or more exotic methods which promise, at least in part, to inject jail-springing reasonable doubt onto a hard drive or into a network.

People who place stock in these theories and tools are neglecting an important drawback. There is another set of legal standards–the standards governing search and seizure–that you should worry about long before you ever get to “beyond a reasonable doubt”. Omitting a lot of detail, the police, even without going to a judge first, can obtain your name, address, and credit card number from your ISP if they can show the information is relevant to a criminal investigation. They can obtain transaction logs (think Apache or sendmail logs) after convincing a judge the evidence is “relevant and material to an ongoing criminal investigation.” If they have probable cause–another famous, but often misunderstood, standard–they can read all of your stored email, rifle through your bedroom dresser drawers, and image your hard drive. If they jump through a few other hoops, they can wiretap your telephone. Some of these standards aren’t easy to meet, but all of them are well below the “beyond a reasonable doubt” standard for guilt.

So by the time you’ve had your Perry Mason moment in front of the jurors, somehow convincing them that because you don’t enable WiFi authentication, your neighbor could have sent the death threat, your life will have been turned upside down in many ways: The police will have searched your home and seized all of your computers. They will have examined all of the files on your hard drives and read all of the messages in your inboxes. (And if you have a shred of kiddie porn stored anywhere, the alleged death threat will be the least of your worries. I know, I know, the virus on your computer raises doubt that the kiddie porn is yours!) They will have arrested you and possibly incarcerated you pending trial. Guys with guns will have interviewed you and many of your friends, co-workers, and neighbors.

In addition, you will have been assigned an overworked public defender who has no time for far-fetched technological defenses and prefers you take a plea bargain, or you will have paid thousands of dollars to a private attorney who knows less than the public defender about technology, but who is “excited to learn” on your dime. Maybe, maybe, maybe after all of this, your lawyer convinces the judge or the jury. You’re free! Congratulations?

Police and prosecutors face many legal standards, most of which are far easier to satisfy than “beyond a reasonable doubt,” and most of which they will have met long before anyone examines your open access point or notices a virus infection. By meeting any of these standards, they can seriously disrupt your life, even if they never end up putting you away.

New USACM Policy Recommendations on Open Government

USACM is the Washington policy committee of the Association for Computing Machinery, the professional association that represents computer scientists and computing practitioners. Today, USACM released Policy Recommendations on Open Government. The recommendations offer simple, clear advice to help Congress and the new administration make government initiatives—like the pending recovery bill—transparent to citizens.

The leading recommendation is that data be published in formats that “promote analysis and reuse of the data”—in other words, machine-readable formats that give citizens, rather than only government, the chance to decide how the data will be analyzed and presented. Regular Freedom to Tinker readers may recall that we have made this argument here before: The proposed Recovery.gov should offer machine-readable data, rather than only government-issue “presentations” of it. Ed and I both took part in the working group that drafted these new recommendations, and we’re pleased to be able to share them with you now, while the issue is in the spotlight.

Today’s statement puts the weight of America’s computing professionals behind the push for machine-readable government data. It also sends a clear signal to the Executive Branch, and to Congress, that America’s computing professionals stand ready to help realize the full potential of new information technologies in government.

Here are the recommendations in full:

  • Data published by the government should be in formats and approaches that promote analysis and reuse of that data.
  • Data republished by the government that has been received or stored in a machine-readable format (such as online regulatory filings) should preserve the machine-readability of that data.
  • Information should be posted so as to also be accessible to citizens with limitations and disabilities.
  • Citizens should be able to download complete datasets of regulatory, legislative or other information, or appropriately chosen subsets of that information, when it is published by government.
  • Citizens should be able to directly access government-published datasets using standard methods such as queries via an API (Application Programming Interface).
  • Government bodies publishing data online should always seek to publish using data formats that do not include executable content.
  • Published content should be digitally signed or include attestation of publication/creation date, authenticity, and integrity.
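
To make the last recommendation a bit more concrete, here is a minimal sketch, in JavaScript (Node.js), of what signing a machine-readable dataset might look like. The dataset, field names, and key handling are invented for illustration; a real publishing pipeline would distribute the agency’s public key through a trusted channel rather than generating a key pair on the fly.

    // Illustrative sketch only: a publisher signs a machine-readable
    // (JSON) dataset so consumers can verify authenticity and integrity.
    // All names and values here are hypothetical.
    const crypto = require('crypto');

    // A machine-readable dataset, per the first recommendation.
    const dataset = JSON.stringify({
      published: '2009-02-10',
      records: [{ agency: 'EXAMPLE', amount: 1000000 }],
    });

    // The publishing agency would hold the private key; citizens would
    // fetch the public key from a trusted location. (Generated here
    // only to keep the demo self-contained.)
    const { publicKey, privateKey } = crypto.generateKeyPairSync('ed25519');

    // Publisher: sign the exact bytes being published.
    const signature = crypto.sign(null, Buffer.from(dataset), privateKey);

    // Consumer: verify that the downloaded bytes match the signature.
    const ok = crypto.verify(null, Buffer.from(dataset), publicKey, signature);
    console.log('signature valid:', ok); // prints: signature valid: true

Signing the exact published bytes, rather than a re-serialized copy, matters: any re-encoding of the JSON would change the digest and the verification would fail.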

Obama's CTO: two positions?

Paul Blumenthal over at the Sunlight Foundation Blog points to a new report from the Congressional Research Service: “A Federal Chief Technology Officer in the Obama Administration: Options and Issues for Consideration”.

The report does a good job of analyzing both the existing positions in the federal government whose roles overlap with some of the potential responsibilities of an “Obama CTO” and the questions Congress would want to consider if such a position were established by statute rather than by executive order.

The crux of the current issue, for me, is summed up well by this quote from the CRS report’s conclusion:

Although the campaign position paper and transition website provide explicit information on at least some of the duties of a CTO, they do not provide information on a CTO’s organizational placement, structure, or relationship to existing offices. In addition, neither the paper nor website states whether the president intends to establish this position/office by executive order or whether he would seek legislation to create a statutory foundation for its duties and authorities.

The various issues in the mix here lead me to one conclusion: the “Obama CTO” role will be very different from that of a traditional chief technology officer. There seem to be at least two positions involved: one visionary and one fixer. That is, one person to push the envelope, in a grounded-but-futurist style, on what is possible, and one person to navigate the myriad agencies and bureaucratic processes involved in getting things done.

As for the first position, I’d like to say a futurist would be a good idea; however, futurists don’t like being tethered so closely to current reality. A better idea, I think, is a senior academic with broad connections and a deep interest in, and understanding of, emerging technologies. The culture of academia, when it works well, can produce individuals who make connections quickly, know how to evaluate complex ideas, and are good at filling the gaps between what is known and not known about a particular proposal. I’m thinking of a Felten or a Lessig here.

As for the fixer, this desperately needs to be someone with experience negotiating complex endeavors between conflicting government fiefdoms. Vivek Kundra, the CTO for the District of Columbia, struck me as exactly this kind of person when he came to visit last semester here at Princeton’s CITP. When Kundra’s name came up as one of two shortlisted candidates for “Obama CTO”, I was a bit skeptical as I wasn’t convinced he had the appropriate visionary qualities. However, as part of a team, I think he’d be invaluable.

It’s possible that the other shortlisted candidate, Cisco’s Padmasree Warrior, has enough of the visionary element to make up the other side of the team, but I doubt she has (what I consider to be) the requisite governmental fixer qualities.

So, why not two positions? Does anyone have both these qualities? Do people agree that these are the right qualities?

As to how it would be structured, it’s almost as if it should be a spider position, a reference to a position in soccer that isn’t tethered to a fixed role. That is, the officeholder should be free from some of the encumbrances that make government information technology innovation so difficult.

Please participate in research project — requires only one click

As part of a research project on web browser security, we are currently taking a “census” of browser installations. We hope you’ll agree to participate.

If you do participate, a small snippet of JavaScript will collect your browser’s settings and send them to our server. We will record a cryptographic hash of those settings in our research database. We will also store a non-unique cookie (saying only that you participated) in your browser. We will do all of this immediately if you click this link.

(If you want to see in advance the JavaScript code we run on participants’ machines, you can read it here.)
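
For readers curious about the general shape of such a snippet, here is an illustrative sketch. It is not our actual census script (that is what the link above is for): the particular settings gathered, the choice to hash in the browser rather than on the server, and the server URL are all hypothetical choices made for this example.

    // Illustrative sketch only -- not the actual census script (see the
    // link above). The specific settings, the client-side hashing, and
    // the server URL are hypothetical choices for this example.
    async function participate() {
      // A few settings visible to any page's JavaScript.
      const settings = JSON.stringify({
        userAgent: navigator.userAgent,
        language: navigator.language,
        screen: [screen.width, screen.height, screen.colorDepth],
        timezoneOffset: new Date().getTimezoneOffset(),
      });

      // Hash the settings with the Web Crypto API so that only a
      // digest, not the raw values, leaves the browser.
      const bytes = new TextEncoder().encode(settings);
      const digest = await crypto.subtle.digest('SHA-256', bytes);
      const hex = Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, '0'))
        .join('');

      // Send the hash to the census server (placeholder URL).
      await fetch('https://census.example.edu/submit', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ hash: hex }),
      });

      // A non-unique cookie: the same value for every participant,
      // saying only that this browser took part.
      document.cookie = 'census=participated; max-age=31536000; path=/';
    }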

[I revised this entry to be more clear about what we are doing. — Ed]

New Site Tests Crowd-Sourced Transparency

Some of my colleagues here at CITP have written about the importance of open data formats for promoting government transparency and achieving government accountability. Another leading thinker in this area is my friend Jerry Brito, a George Mason University scholar who contributed a post here at Freedom to Tinker last year. Jerry wrote one of the first papers on the importance of mashups using government data. Now, Jerry and a few collaborators have put his ideas into action by building a site called Stimulus Watch that will facilitate crowd-sourced analysis of the hundreds of billions of dollars of deficit spending that President Obama has made a centerpiece of his economic agenda.

Jerry and his collaborators parsed a report containing more than 10,000 “shovel ready” spending proposals from the nation’s mayors. Many of these proposals will likely be funded if Congress approves Obama’s spending bill. Using the site, ordinary Americans across the country can review the proposals in their own metropolitan areas and provide feedback on which proposals deserve the highest priority. As the site grows in popularity, it may prove extremely valuable for federal officials deciding where to allocate money. And if there are turkeys like the “Bridge to Nowhere” among the mayors’ requests, the site will allow citizens to quickly identify and publicize these proposals and perhaps shame government officials into canceling them.