Archives for February 2009

New USACM Policy Recommendations on Open Government

USACM is the Washington policy committee of the Association for Computing Machinery, the professional association that represents computer scientists and computing practitioners. Today, USACM released Policy Recommendations on Open Government. The recommendations offer simple, clear advice to help Congress and the new administration make government initiatives—like the pending recovery bill—transparent to citizens.

The leading recommendation is that data be published in formats that “promote analysis and reuse of the data”—in other words, machine-readable formats that give citizens, rather than only government, the chance to decide how the data will be analyzed and presented. Regular Freedom to Tinker readers may recall that we have made this argument here before: The proposed Recovery.gov should offer machine-readable data, rather than only government-issue “presentations” of it. Ed and I both took part in the working group that drafted these new recommendations, and we’re pleased to be able to share them with you now, while the issue is in the spotlight.

Today’s statement puts the weight of America’s computing professionals behind the push for machine-readable government data. It also sends a clear signal to the Executive Branch, and to Congress, that America’s computing professionals stand ready to help realize the full potential of new information technologies in government.

Here are the recommendations in full:

  • Data published by the government should be in formats and approaches that promote analysis and reuse of that data.
  • Data republished by the government that has been received or stored in a machine-readable format (such as online regulatory filings) should preserve the machine-readability of that data.
  • Information should be posted so as to also be accessible to citizens with limitations and disabilities.
  • Citizens should be able to download complete datasets of regulatory, legislative or other information, or appropriately chosen subsets of that information, when it is published by government.
  • Citizens should be able to directly access government-published datasets using standard methods such as queries via an API (Application Programming Interface).
  • Government bodies publishing data online should always seek to publish using data formats that do not include executable content.
  • Published content should be digitally signed or include attestation of publication/creation date, authenticity, and integrity.

Obama's CTO: two positions?

Paul Blumenthal over at the Sunlight Foundation Blog points to a new report from the Congressional Research Service: “A Federal Chief Technology Officer in the Obama Administration: Options and Issues for Consideration”.

This report does a good job of analyzing both the existing federal positions whose roles overlap with some of the potential responsibilities of an “Obama CTO” and the questions Congress would want to consider if the position is established by statute rather than by executive order.

The crux of the current issue, for me, is summed up well by this quote from the CRS report’s conclusion:

Although the campaign position paper and transition website provide explicit information on at least some of the duties of a CTO, they do not provide information on a CTO’s organizational placement, structure, or relationship to existing offices. In addition, neither the paper nor website states whether the president intends to establish this position/office by executive order or whether he would seek legislation to create a statutory foundation for its duties and authorities.

The various issues in the mix here lead me to one conclusion: an “Obama CTO” position will be very different from a traditional chief technology officer role. There seem to be at least two positions involved: one visionary and one fixer. That is, one person to push the envelope, in a grounded-but-futurist style, on what is possible, and one person to navigate the myriad agencies and bureaucratic constraints to get things done.

As for the first position, I’d like to say a futurist would be a good idea. However, futurists don’t like to be tethered so much to current reality. A better idea, I think, is a senior academic with broad connections and a deep interest in, and understanding of, emerging technologies. The culture of academia, when it works well, produces individuals who make connections quickly, know how to evaluate complex ideas, and are good at filling the gaps between what is known and not known about a particular proposal. I’m thinking of a Felten, a Lessig, etc. here.

As for the fixer, this desperately needs to be someone with experience negotiating complex endeavors between conflicting government fiefdoms. Vivek Kundra, the CTO for the District of Columbia, struck me as exactly this kind of person when he came to visit last semester here at Princeton’s CITP. When Kundra’s name came up as one of two shortlisted candidates for “Obama CTO”, I was a bit skeptical as I wasn’t convinced he had the appropriate visionary qualities. However, as part of a team, I think he’d be invaluable.

It’s possible that the other shortlisted candidate, Cisco’s Padmasree Warrior, has enough of the visionary element to make up the other side of the team; I doubt, however, that she has (what I consider to be) the requisite governmental fixer qualities.

So, why not two positions? Does anyone have both these qualities? Do people agree that these are the right qualities?

As to how it would be structured, it’s almost as if it should be a libero position, a reference to the position in soccer that isn’t tethered to a fixed role. That is, the officeholder should be free from some of the encumbrances that make government information technology innovation so difficult.

Please participate in research project — requires only one click

As part of a research project on web browser security we are currently taking a “census” of browser installations. We hope you’ll agree to participate.

If you do participate, a small snippet of JavaScript will collect your browser’s settings and send them to our server. We will record a cryptographic hash of those settings in our research database. We will also store a non-unique cookie (saying only that you participated) in your browser. We will do all of this immediately if you click this link.

(If you want to see in advance the Javascript code we run on participants’ machines, you can read it here.)

[I revised this entry to be more clear about what we are doing. — Ed]

New Site Tests Crowd-Sourced Transparency

Some of my colleagues here at CITP have written about the importance of open data formats for promoting government transparency and achieving government accountability. Another leading thinker in this area is my friend Jerry Brito, a George Mason University scholar who contributed a post here at Freedom to Tinker last year. Jerry wrote one of the first papers on the importance of mashups using government data. Now, Jerry and a few collaborators have put his ideas into action by building a site called Stimulus Watch that will facilitate crowd-sourced analysis of the hundreds of billions of dollars of deficit spending that President Obama has made a centerpiece of his economic agenda.

Jerry and his collaborators parsed a report containing more than 10,000 “shovel ready” spending proposals from the nation’s mayors. Many of these proposals will likely be funded if Congress approves Obama’s spending bill. Using the site, ordinary Americans across the country can review the proposals in their own metropolitan areas and provide feedback on which proposals deserve the highest priority. As the site grows in popularity, it may prove extremely valuable for federal officials deciding where to allocate money. And if there are turkeys like the “Bridge to Nowhere” among the mayors’ requests, the site will allow citizens to quickly identify and publicize these proposals and perhaps shame government officials into canceling them.