Archives for 2011

Will the NJ Attorney General investigate the NJ Attorney General?

Part 3 of 4
In my recent posts I wrote about my discovery that a County employee had apparently tampered with evidence in a computer that the NJ Superior Court had ordered the County to present for examination. After I described this discovery to the Court (Judge David E. Krell), a County employee admitted deleting files. Judge Krell was very concerned about this possible spoliation of evidence. In his Order signed September 9, 2011, he wrote,

“AND IT IS FURTHER ORDERED that the court recommends that the New Jersey Attorney General (New Jersey Department of Law and Public Safety), Division of Criminal Justice, undertake an investigation of … the deletion of files on August 16, 2011, from the Board’s laptop computer … by the County’s computer technician who is responsible for servicing the Board’s computers.”

During the hearing on September 1, Plaintiffs’ attorneys pointed out that the New Jersey Attorney General’s office had been co-counsel for the Defendants in Zirkle v. Henry, which means that lawyers from the AG’s office had very possibly advised the County employees before and after the evidence was erased. They argued that this would put Judge Krell in the position of asking the Attorney General’s office to investigate itself, and asked the Court to appoint a Special Master instead.

Judge Krell explained why he was not inclined to do that. He said, “My understanding is Criminal Justice is totally separate from the Civil part of [the Attorney General’s] office.” That is, during the hearing the Judge stated his belief that the Division of Criminal Justice is sufficiently independent from the Division of Law (both are part of the NJ Department of Law and Public Safety) that it can properly investigate the possibility of criminal tampering with evidence in which attorneys from the Division of Law might have had a role.

I hope Judge Krell is right about that.

Crowdsourcing State Secrets

Those who regularly listen to Fresh Air may have heard a recent interview with journalist Dana Priest about the dramatic expansion of the intelligence community over the past ten years. She mentioned how the government had paid contractors several times what its own intelligence officials would be paid to perform the same analysis tasks, and how unwieldy the massive network of contractors had become (to the point where even deciding who gets top secret clearance had been contracted out). At the same time, in this age of Wikileaks and #Antisec, leaks and break-ins are becoming all the more common. It’s only a matter of time before thousands of military intelligence reports show up on Pastebin.

However, what if we didn’t have to pay this mass of analysts? What if we stopped worrying so much about leaks and embraced them? What if we could bring in anyone who wanted to analyze the insane amount of information by simply dumping large amounts of the raw data to a publicly-accessible location? What if we crowdsourced intelligence analysis?

Granted, we wouldn’t be able to just dump everything, as some items (such as “al-Qaeda’s number 5 may be house X in Waziristan, according to informant Y who lives in Taliban-controlled territory”) would be damaging if released. But (at least according to the interview) many of the items which are classified as top secret actually wouldn’t cause “exceptionally grave damage.” As for information in such documents that is particularly sensitive but could still benefit from analysis, we could simply use pseudonyms and keep the pseudonym-to-real-name mapping top secret.
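
To make the idea concrete, here is a minimal Python sketch of how documents could be pseudonymized before release while the mapping itself stays classified. The identifiers and the mapping are invented for illustration only:

    import re

    # Hypothetical mapping from sensitive identifiers to pseudonyms.
    # Released documents would contain only the right-hand column;
    # this table itself would remain top secret.
    PSEUDONYMS = {
        "informant Y": "SOURCE-17",
        "house X in Waziristan": "LOCATION-42",
    }

    def pseudonymize(text, mapping):
        """Replace each sensitive identifier with its pseudonym."""
        for real_name, alias in mapping.items():
            text = re.sub(re.escape(real_name), alias, text, flags=re.IGNORECASE)
        return text

    report = "Al-Qaeda's number 5 may be at house X in Waziristan, according to informant Y."
    print(pseudonymize(report, PSEUDONYMS))
    # -> "Al-Qaeda's number 5 may be at LOCATION-42, according to SOURCE-17."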

Adversaries would almost certainly attempt to inject false analyses. This simply becomes an instance of the Byzantine generals problem, but with a twist: because the mainstream media is always looking for the next sensational story, it would be performing much of the analysis. Because this creates a common goal between the public and the news outlets, there would be some level of trust that other (potentially adversarial) actors would not necessarily have.

In an era when the talking heads in Washington and the media want to cut everything from the tiny National Endowment for the Arts to gigantic Social Security, the last thing we need is to pay people to do work that many would do for free. Applying open government principles to data that do not necessarily need to be kept secret could go a long way toward reducing the part of government that most politicians are unwilling to touch.

Did NJ election officials fail to respect court order to improve security of elections?

Part 2 of 4
The Gusciora case was filed in 2004 by the Rutgers Constitutional Litigation Clinic on behalf of Reed Gusciora and other public-interest plaintiffs. The Plaintiffs sought to end the use of paperless direct-recording electronic voting machines, which are very vulnerable to fraud and manipulation via replacement of their software. The defendant was the Governor of New Jersey, and as governors came and went it was variously titled Gusciora v. McGreevey, Gusciora v. Corzine, and Gusciora v. Christie.

In 2010 Judge Linda Feinberg issued an Opinion. She did not ban the machines, but ordered the State to implement several kinds of security measures: some to improve the security of the computers on which ballots are programmed (and results are tabulated), and some to improve the security of the computers inside the voting machines themselves.

The Plaintiffs had shown evidence that ballot-programming computers (the so-called “WinEDS laptops”) in Union County had been used to surf the Internet even on election day in 2008. This, combined with many other security vulnerabilities in the configuration of Microsoft Windows, left the computers open to intrusion by outsiders, who could then interfere with and manipulate the programming of ballots before their installation on the voting machines, or manipulate the aggregation of results after the elections. Judge Feinberg also heard testimony that so-called “Hardening Guidelines”, which had previously been prepared by Sequoia Voting Systems at the request of the State of California, would help close some of these vulnerabilities. Basically, one wipes the hard drive clean on the “WinEDS laptop”, installs a fresh copy of Microsoft Windows, runs a script to shut down Internet access and generally tighten the Windows security configuration, and finally installs a fresh copy of the WinEDS ballot software. The Court also heard testimony (from me) that installing these Guidelines requires experience in Windows system administration, and would likely be beyond the capability of some election administrators.
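
As an illustration only (this is not the Sequoia Hardening Guidelines themselves), the following Python sketch shows the kind of automated spot-check an auditor might run on a ballot-programming laptop: try to open an outbound connection (which a hardened machine should not be able to do), and capture the Windows firewall configuration for the record. The external address is an arbitrary public DNS server chosen for the example:

    import socket
    import subprocess

    def internet_reachable(timeout=3.0):
        """Try to open an outbound connection; a properly hardened
        ballot-programming laptop should not be able to do this."""
        try:
            socket.create_connection(("8.8.8.8", 53), timeout=timeout).close()
            return True
        except OSError:
            return False

    def firewall_report():
        """Record the Windows firewall configuration for the audit trail,
        using the standard 'netsh advfirewall' command."""
        result = subprocess.run(
            ["netsh", "advfirewall", "show", "allprofiles"],
            capture_output=True, text=True, check=False,
        )
        return result.stdout

    if __name__ == "__main__":
        print("Outbound Internet reachable:", internet_reachable())
        print(firewall_report())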

Among the several steps the Court ordered in 2010 was the installation of these Hardening Guidelines on every WinEDS ballot-programming computer used in public elections, within 120 days.

Two years after I testified in the Gusciora case, I served as an expert witness in a different case, Zirkle v. Henry, in a different Court, before Judge David Krell. I wanted to determine whether an anomaly in the June 2011 Cumberland County primary election could have been caused by an intruder from the Internet, or whether such intrusion could reasonably be ruled out. Thus, the question of whether Cumberland County’s WinEDS laptop was in compliance with Judge Feinberg’s Order became relevant. That is, had the Hardening Guidelines been installed before the ballot programming was done for the election in question? If so, what would the event logs say about the use of that machine as the ballot cartridges were programmed?

One of the components of the Hardening Guidelines is to turn on certain Event Logs in the Windows operating system. So, during my examination of the WinEDS laptop on August 17, I opened the Windows Event Viewer and photographed screen-shots of the logs. To my surprise, the logs commenced on the afternoon of August 16, 2011, the day before my examination. At the very least, someone had wiped the logs clean; possibly, on August 16 someone had wiped the entire hard drive clean while installing the Hardening Guidelines. In either case, evidence in a pending court case–files on a computer that the State of New Jersey and County of Cumberland had been ordered to produce for examination–was erased. I’m told that evidence-tampering is a crime. In an affidavit dated August 24, Jason Cossaboon, a Computer Systems Analyst employed by Cumberland County, stated that he erased the event logs on August 16.
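
For readers who want to see what such a check looks like, here is a small Python sketch (an illustration, not the procedure I used) that asks Windows for the oldest surviving record in each event log, via the built-in wevtutil utility. If a log has been cleared, the oldest record's timestamp will be suspiciously recent:

    import subprocess

    def oldest_event(log_name):
        """Return the oldest surviving record in a Windows event log,
        using the built-in 'wevtutil' utility (oldest-first, one record)."""
        result = subprocess.run(
            ["wevtutil", "qe", log_name, "/c:1", "/rd:false", "/f:text"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        for log in ("System", "Security", "Application"):
            print("--- oldest record in", log, "---")
            print(oldest_event(log))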

Robert Giles, Director of the New Jersey Division of Elections, was present during my examination on August 17. Mr. Giles submitted to Judge David Krell an affidavit dated August 25 describing the steps he had taken to achieve compliance with Judge Feinberg’s Order. He writes, “The Sequoia hardening manual was sent, by email, to the various county election offices on March 29, 2010. To my knowledge, the hardening process was completed by the affected counties by the required deadline of June 1, 2010.” Mr. Giles does not say anything about how he acquired the “knowledge” that the process was completed.

Mr. Giles was present in Judge Feinberg’s courtroom in 2009 when I testified that the Hardening Guidelines are not simple to install and would typically require someone with technical training or experience. And yet he then pretended to discharge the State’s duty of compliance with Judge Feinberg’s Order by simply sending a mass e-mail to county election officials. Judge Feinberg herself said that sending an e-mail was not enough; a year later, Mr. Giles has done nothing more. In my opinion, this is disrespectful to the Court, and to the voters of New Jersey.

NJ election cover-up

Part 1 of 4
During the June 2011 New Jersey primary election, something went wrong in Cumberland County, which uses Sequoia AVC Advantage direct-recording electronic voting computers. From this we learned several things:

  1. New Jersey court-ordered election-security measures have not been effectively implemented.
  2. There is reason to believe that New Jersey election officials have destroyed evidence in a pending court case, perhaps to cover up noncompliance with these measures or irregularities in this election. There is enough evidence of a cover-up that a Superior Court judge has referred the matter to the State prosecutor’s office.
  3. Like any DRE voting machine, the AVC Advantage is vulnerable to software-based vote stealing by replacing the internal vote-counting firmware. That kind of fraud probably did not occur in this case. But even without replacing the internal firmware, the AVC Advantage voting machine is vulnerable to the accidental or deliberate swapping of vote-totals between candidates. It is clear that the machine misreported votes in this election, and both technical and procedural safeguards proved ineffective to fully correct the error. (A simplified illustration of how such a swap can arise appears just after this list.)
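
The following Python sketch is purely illustrative (it is not the AVC Advantage's actual firmware or ballot format). It shows how a mismatch between the ballot positions used when votes are recorded and the candidate names used when results are reported can silently swap two candidates' totals, with no visible error:

    # Votes are recorded by ballot position, not by candidate name.
    recorded_totals = {1: 120, 2: 345}   # position -> vote count

    # Ballot definition used when the cartridges were programmed.
    programmed_ballot = {1: "Candidate A", 2: "Candidate B"}

    # Ballot definition used when results are reported. If the two
    # definitions disagree (by accident or by design), the totals are
    # attributed to the wrong candidates.
    reporting_ballot = {1: "Candidate B", 2: "Candidate A"}

    def report(totals, ballot):
        """Attach candidate names to the recorded position totals."""
        return {ballot[pos]: count for pos, count in totals.items()}

    print("Intended result:", report(recorded_totals, programmed_ballot))
    print("Reported result:", report(recorded_totals, reporting_ballot))
    # Intended result: {'Candidate A': 120, 'Candidate B': 345}
    # Reported result: {'Candidate B': 120, 'Candidate A': 345}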

Cumberland County is in the extreme southern part of New Jersey, a three-hour drive south of New York. In follow-up posts I’ll explain these three conclusions. In the remainder of this post, I’ll quote verbatim from the Honorable David E. Krell, the Superior Court judge in Cumberland County. This is his summary of the case, taken from the trial transcript of September 1, 2011, in the matter of Zirkle v. Henry.


DigiNotar Hack Highlights the Critical Failures of our SSL Web Security Model

This past week, the Dutch company DigiNotar admitted that their servers were hacked in June of 2011. DigiNotar is no ordinary company, and this was no ordinary hack. DigiNotar is one of the “certificate authorities” that has been entrusted by web browsers to certify to users that they are securely connecting to web sites. Without this certainty, users could have their communications intercepted by any nefarious entity that managed to insert itself in the network between the user and the web site they seek to reach.
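
To see what this certification looks like from the client side, here is a small Python sketch (illustrative only) that retrieves a site's certificate and reports who issued it. A browser accepts the connection as long as the issuer chains to any one of the many certificate authorities it trusts, which is why a single rogue CA endangers every domain:

    import socket
    import ssl

    def certificate_issuer(hostname, port=443):
        """Fetch the server's certificate over TLS and report its subject
        and issuer, using Python's standard ssl module."""
        context = ssl.create_default_context()
        with socket.create_connection((hostname, port), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
        return {
            "subject": dict(item[0] for item in cert["subject"]),
            "issuer": dict(item[0] for item in cert["issuer"]),
        }

    if __name__ == "__main__":
        info = certificate_issuer("www.google.com")
        print("subject:", info["subject"])
        print("issuer: ", info["issuer"])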

It appears that DigiNotar did not deserve to be trusted with the responsibility of issuing SSL certificates, because their systems allowed an outside hacker to break in and issue himself certificates for any web site domain he wished. He did so for dozens of domain names, including *.google.com and www.cia.gov. Anyone with possession of these certificates and control over the network path between you and the outside world could, for example, view all of your traffic to Gmail. The attacker in this case seems to be the same person who similarly compromised certificate-issuing servers for the company Comodo back in March. He has posted a new manifesto, and he claims to have compromised four other certificate authorities. All signs point to the conclusion that this person is an Iranian national who supports the current regime, or is a member of the regime itself.

The Comodo breach was deeply troubling, and the DigiNotar compromise is far worse. First, this new break-in affected all of DigiNotar’s core certificate servers, as opposed to Comodo’s more contained breach. Second, it gave the attacker the ability to issue not only baseline “domain validated” certificates but also higher-security “extended validation” certificates and even special certificates used by the Dutch government to secure itself (see the Dutch government’s fact sheet on the incident). However, this damage was by no means limited to the Netherlands, because any certificate authority can issue certificates for any domain. The third difference when compared to the Comodo breach is that we have actual evidence of these certificates being deployed against users in the real world. In this case, it appears that they were used widely against Iranian users on many different Iranian internet service providers. Finally, and perhaps most damning for DigiNotar, the break-in was not detected for a whole month, and was then not disclosed to the public for almost two more months (see the timeline at the end of this incident report by Fox-IT). The public’s security was put at risk and browser vendors were prevented from implementing fixes because they were kept in the dark. Indeed, DigiNotar seems to have intended never to disclose the problem, and was only forced to do so after a perceptive Iranian Google user noticed that their connections were being hijacked.

The most frightening thing about this episode is not just that a particular certificate authority allowed a hacker to critically compromise its operations, or that the company did not disclose this to the affected public. More fundamentally, it reminds us that our web security model is prone to failure across the board. As I noted at the time of the Comodo breach:

I recently spoke on the subject at USENIX Security 2011 as part of the panel “SSL/TLS Certificates: Threat or Menace?” (video and audio here if you scroll down to Friday at 11:00 a.m., and slides here.)