
A Review of Oral Arguments in McBurney v. Young: State FOIA and States' Rights

Yesterday, I attended oral arguments in the Supreme Court case of McBurney v. Young, which I have previously written about. The case involves two different petitioners who were denied access to state records under a Virginia “freedom of information” law that limits such access to Virginia residents only. McBurney is a former Virginia resident who […]

Supreme Court to Hear State Freedom of Information Act Case "McBurney v. Young"

On Friday, the Supreme Court granted certiorari in McBurney v. Young. This case formally concerns the “Privileges and Immunities Clause” of the Constitution. It raises questions about what access rights citizens have to government records and about who counts as a journalist. Oral argument will likely be scheduled for 2013. Mark McBurney is a citizen […]

Opening Government: On the Limits of FOIA and the Metaphor of Transparency

At a recent symposium (“Piracy and the Politics of Policing: Legislating and Enforcing Copyright Law”) sponsored by the Cardozo Arts and Entertainment Law Journal, I was invited to respond to an excellent paper by David Levine on secrecy, national security, and the denial of public access to documents from the Anti-Counterfeiting Trade Agreement (ACTA) negotiation […]

The New Ambiguity of "Open Government"

David Robinson and I have just released a draft paper—The New Ambiguity of “Open Government”—that describes, and tries to help solve, a key problem in recent discussions around online transparency. As the paper explains, the phrase “open government” has become ambiguous in a way that makes life harder for both advocates and policymakers, by combining the politics of transparency with the technologies of open data. We propose using new terminology that is politically neutral: the word adaptable to describe desirable features of data (and the word inert to describe their absence), separately from descriptions of the governments that use these technologies.

Clearer language will serve everyone well, and we hope this paper will spark a conversation among those who focus on civic transparency and innovation. Thanks to Justin Grimes and Josh Tauberer for their helpful insights and discussions as we drafted this paper.

Download the full paper here.

Abstract:

“Open government” used to carry a hard political edge: it referred to politically sensitive disclosures of government information. The phrase was first used in the 1950s, in the debates leading up to passage of the Freedom of Information Act. But over the last few years, that traditional meaning has blurred, and has shifted toward technology.

Open technologies involve sharing data over the Internet, and all kinds of governments can use them, for all kinds of reasons. Recent public policies have stretched the label “open government” to reach any public sector use of these technologies. Thus, “open government data” might refer to data that makes the government as a whole more open (that is, more transparent), but might equally well refer to politically neutral public sector disclosures that are easy to reuse, even if they have nothing to do with public accountability. Today a regime can call itself “open” if it builds the right kind of web site—even if it does not become more accountable or transparent. This shift in vocabulary makes it harder for policymakers and activists to articulate clear priorities and make cogent demands.

This essay proposes a more useful way for participants on all sides to frame the debate: We separate the politics of open government from the technologies of open data. Technology can make public information more adaptable, empowering third parties to contribute in exciting new ways across many aspects of civic life. But technological enhancements will not resolve debates about the best priorities for civic life, and enhancements to government services are no substitute for public accountability.

What We Lose if We Lose Data.gov

In its latest 2011 budget proposal, Congress makes deep cuts to the Electronic Government Fund. This fund supports the continued development and upkeep of several key open government websites, including Data.gov, USASpending.gov and the IT Dashboard. An earlier proposal would have cut the funding from $34 million to $2 million this year, although the current proposal would allocate $17 million to the fund.

Reports say that major cuts to the e-government fund would force OMB to shut down these transparency sites. This would strike a significant blow to the open government movement, and I think it’s important to emphasize exactly why shuttering a site like Data.gov would be so detrimental to transparency.

On its face, Data.gov is a useful catalog. It helps people find the datasets that government has made available to the public. But the catalog is really a convenience that doesn’t necessarily need to be provided by the government itself. Since the vast majority of datasets are hosted on individual agency servers—not directly by Data.gov—private developers could potentially replicate the catalog with only a small amount of effort. So even if Data.gov goes offline, nearly all of the data would still exist online, and a private developer could rebuild a version of the catalog, maybe with even better features and interfaces.
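To make that concrete, here is a minimal sketch of what a rebuilt third-party catalog could look like. It assumes a developer has already harvested dataset metadata from agency sites; the field names, example records, and URLs below are hypothetical illustrations, not Data.gov’s actual schema or entries.

    # Minimal sketch of a third-party catalog built from harvested metadata.
    # The records below are hypothetical examples, not actual Data.gov entries.
    from collections import defaultdict

    # Metadata records a developer might have harvested from agency sites.
    records = [
        {"title": "Hospital Inspection Results",
         "agency": "Department of Health and Human Services",
         "keywords": ["health", "hospitals", "inspections"],
         "url": "https://example.gov/data/hospital-inspections.csv"},
        {"title": "Airline On-Time Performance",
         "agency": "Department of Transportation",
         "keywords": ["aviation", "delays"],
         "url": "https://example.gov/data/on-time.csv"},
    ]

    # Build an inverted index from keyword -> matching records.
    index = defaultdict(list)
    for record in records:
        for word in record["keywords"] + record["title"].lower().split():
            index[word.lower()].append(record)

    def search(term):
        """Return the datasets whose title or keywords mention the term."""
        return index.get(term.lower(), [])

    if __name__ == "__main__":
        for hit in search("health"):
            print(hit["title"], "-", hit["url"])

A real replacement would of course need a crawler and a far richer interface, but the point stands: the public-facing catalog is the easy part to recreate.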

But Data.gov also plays a crucial behind-the-scenes role, setting standards for open data and helping individual departments and agencies live up to those standards. Data.gov establishes a standard, cross-agency process for publishing raw datasets. The program gives agencies clear guidance on the mechanics and requirements for releasing each new dataset online.

There’s a Data.gov manual that formally documents and teaches this process. Each agency has a lead Data.gov point-of-contact, who’s responsible for identifying publishable datasets and for ensuring that when data is published, it meets information quality guidelines. Each dataset needs to be published with a well-defined set of common metadata fields, so that it can be organized and searched. Moreover, thanks to Data.gov, all the data is funneled through at least five stages of intermediate review—including national security and privacy reviews—before final approval and publication. That process isn’t quick, but it does help ensure that key goals are satisfied.
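As a rough illustration of what publishing against a common metadata standard involves, here is a sketch of the kind of pre-publication check an agency data steward might run before routing a dataset into review. The required field names and the example record are hypothetical stand-ins, not the actual schema documented in the Data.gov manual.

    # Sketch of a pre-publication metadata check. The required fields below
    # are hypothetical stand-ins for Data.gov's actual common metadata schema.
    REQUIRED_FIELDS = {"title", "description", "agency", "keywords",
                       "release_date", "access_url", "format"}

    def missing_fields(record):
        """Return the required metadata fields that are absent or empty."""
        return sorted(f for f in REQUIRED_FIELDS if not record.get(f))

    candidate = {
        "title": "Example Inspection Results",          # hypothetical dataset
        "description": "Quarterly inspection results in CSV form.",
        "agency": "Example Agency",
        "keywords": ["inspections", "safety"],
        "access_url": "https://example.gov/data/inspections.csv",
        # "release_date" and "format" are intentionally left out
    }

    problems = missing_fields(candidate)
    if problems:
        print("Not ready for publication; missing:", ", ".join(problems))
    else:
        print("Metadata complete; route to review.")

The value of a check like this is less the code than the shared expectation it encodes: every agency publishes records that can be organized and searched the same way.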

When agency staff have data they want to publish, they use a special part of the Data.gov website, which outside users never see, called the Data Management System (DMS). This back-end administrative interface allows agency points-of-contact to efficiently coordinate publishing activities agency-wide, and it gives individual data stewards a way to easily upload, view and maintain their own datasets.

My main concern is that this invaluable but underappreciated infrastructure will be lost if these systems are de-funded. The individual roles and responsibilities, the informal norms and pressures, and perhaps even the tacit authority to put new datasets online would likely also disappear. The loss of this structure would probably mean that far less data gets put online in the future, and the datasets that are published in an ad hoc way would likely lack the uniformity and quality that the current process creates.

Releasing a new dataset online is already a difficult task for many agencies. While the current standards and processes may be far from perfect, Data.gov provides agencies with a firm footing on which they can base their transparency efforts. I don’t know how much funding is necessary to maintain these back-end processes, but whatever Congress decides, it should budget sufficient funds—and direct that they be used—to preserve these critical tools.