
Archives for February 2008

Comcast’s Disappointing Defense

Last week, Comcast offered a defense in the FCC proceeding challenging the technical limitations it had placed on BitTorrent traffic in its network. (Back in October, I wrote twice about Comcast’s actions.)

The key battle line is whether Comcast is just managing its network reasonably in the face of routine network congestion, as it claims, or whether it is singling out certain kinds of traffic for unnecessary discrimination, as its critics claim. The FCC process has generated lots of verbiage, which I can’t hope to discuss, or even summarize, in this post.

I do want to call out one aspect of Comcast’s filing: the flimsiness of its technical argument.

Here’s one example (pp. 14-15).

As Congresswoman Mary Bono Mack recently explained:

The service providers are watching more and more of their network monopolized by P2P bandwidth hogs who command a disproportionate amount of their network resources. . . . You might be asking yourself, why don’t the broadband service providers invest more into their networks and add more capacity? For the record, broadband service providers are investing in their networks, but simply adding more bandwidth does not solve [the P2P problem]. The reason for this is P2P applications are designed to consume as much bandwidth as is available, thus more capacity only results in more consumption.

(emphasis in original). The flaws in this argument start with the fact that the italicized segment is wrong. P2P protocols aren’t designed to consume all available bandwidth. They’re not sparing with bandwidth, but they don’t use it gratuitously, and once a transfer finishes they stop asking for more.
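A toy model (mine, not anything from the actual protocol) illustrates why a P2P client’s demand is bounded by the content size: adding capacity makes the transfer finish sooner, but total consumption stays fixed.

```python
# Toy model of a BitTorrent-style download (illustrative only; this is
# not the real protocol). The client requests pieces only while some are
# still missing, so total bandwidth consumed is bounded by the content
# size -- it does not grow just because more capacity is available.

def simulate_download(total_pieces: int, link_capacity: int) -> list:
    """Return per-tick bandwidth use (pieces fetched each tick)."""
    remaining = total_pieces
    usage = []
    while remaining > 0:
        # Fetch as much as the link allows, but never more than remains.
        fetched = min(link_capacity, remaining)
        usage.append(fetched)
        remaining -= fetched
    return usage

# Doubling capacity finishes sooner; total consumption stays at 100.
slow = simulate_download(total_pieces=100, link_capacity=10)
fast = simulate_download(total_pieces=100, link_capacity=20)
assert sum(slow) == sum(fast) == 100
assert len(fast) < len(slow)
```

In other words, extra capacity changes *when* the bits flow, not *how many* bits flow, which is exactly what the quoted argument gets wrong.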

But even leaving aside the merits of the argument, what’s most remarkable here is that Comcast’s technical description of BitTorrent cites as evidence not a textbook, nor a standards document, nor a paper from the research literature, nor a paper by the designer of BitTorrent, nor a document from the BitTorrent company, nor the statement of any expert, but a speech by a member of Congress. Congressmembers know many things, but they’re not exactly the first group you would turn to for information about how network protocols work.

This is not the only odd source that Comcast cites. Later (p. 28) they claim that the forged TCP Reset packets that they send shouldn’t be called “forged”. For this proposition they cite some guy named George Ou who blogs at ZDNet. They give no reason why we should believe Mr. Ou on this point. My point isn’t to attack Mr. Ou, who for all I know might actually have some relevant expertise. My point is that if this is the most authoritative citation Comcast can find, then their argument doesn’t look very solid. (And, indeed, it seems pretty uncontroversial to call these particular packets “forged”, given that they mislead the recipient about (1) which IP address sent the packet, and (2) why the packet was sent.)
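For concreteness, here is a rough sketch of what such a forged RST looks like on the wire. All addresses and ports are hypothetical, the checksums are left zero, and nothing is actually sent; the point is simply that the packet’s source-address field carries the peer’s address rather than the true sender’s, which is what makes “forged” the natural word.

```python
import socket
import struct

# A forged RST spoofs the *other endpoint's* address as its source, so
# the receiver believes its peer aborted the connection. This sketch only
# builds the bytes; addresses, ports, and the sequence number are made up.

def forged_rst(spoofed_src: str, dst: str, sport: int, dport: int,
               seq: int) -> bytes:
    """Build a minimal IPv4 header + TCP header with only RST set."""
    ip_header = struct.pack(
        "!BBHHHBBH4s4s",
        0x45, 0,            # version 4, header length 5 words; no TOS
        40,                 # total length: 20-byte IP + 20-byte TCP
        0, 0,               # identification, flags/fragment offset
        64, 6,              # TTL, protocol 6 = TCP
        0,                  # header checksum (left zero in this sketch)
        socket.inet_aton(spoofed_src),  # source address the victim trusts
        socket.inet_aton(dst),
    )
    tcp_header = struct.pack(
        "!HHLLBBHHH",
        sport, dport,
        seq, 0,             # sequence number, acknowledgment number
        5 << 4,             # data offset: 5 words, no options
        0x04,               # flags byte: RST only
        0, 0, 0,            # window, checksum (zero here), urgent pointer
    )
    return ip_header + tcp_header

pkt = forged_rst("10.0.0.2", "10.0.0.1", 6881, 51413, 12345)
# The flags byte is the 14th byte of the TCP header (offset 20 + 13).
assert pkt[33] == 0x04
```

The recipient of such a packet has no way to tell, from the packet itself, that it came from a middlebox rather than from the peer whose address appears in it.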

Comcast is a big company with plenty of resources. It’s a bit depressing that they would file arguments like this with the FCC, an agency smart enough to tell the difference. Is this really the standard of technical argumentation in FCC proceedings?

The continuing saga of Sarasota’s lost votes

At a hearing today before a subcommittee of Congress’s Committee on House Administration, the U.S. Government Accountability Office (GAO) reported on the results of their technical investigation into the exceptional undervote rate in the November 2006 election for Florida’s 13th Congressional District.

David Dill and I wrote a long paper about shortcomings in previous investigations, so I’m not going to present a detailed review of the history of this case. [Disclosure: Dill and I were both expert witnesses on behalf of Jennings and the other plaintiffs in the Jennings v. Buchanan case. Writing this blog post, I’m only speaking on my own. I do not speak on behalf of Christine Jennings or anybody else involved with the campaign.]

Heavily abridged history: One in seven ballots recorded on Sarasota’s ES&S iVotronic systems showed no vote in the Congressional race. The margin of victory was radically smaller than this. If you do a statistical projection from the votes that were cast onto the blank votes, you inevitably end up with a different candidate seated in Congress.
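The flavor of such a projection can be shown with a few lines of arithmetic. The round numbers below are mine, chosen to be Sarasota-sized, not figures from the case record; the actual filings worked from precinct-level data.

```python
# Illustrative undervote projection with hypothetical numbers: a
# few-hundred-vote margin, ~18,000 blank ballots, and a modest lean
# toward the trailing candidate among the affected voters.

def project_blanks(margin_for_leader: int, blank_votes: int,
                   trailer_share: float) -> int:
    """Leader's projected margin after allocating blank votes by the
    observed two-party split (trailer_share = fraction won by the
    trailing candidate among comparable voters).
    Negative result => the outcome flips."""
    leader_gain = blank_votes * (1 - trailer_share)
    trailer_gain = blank_votes * trailer_share
    return round(margin_for_leader + leader_gain - trailer_gain)

# Even a 53/47 lean among 18,000 blanks dwarfs a 369-vote margin.
print(project_blanks(margin_for_leader=369, blank_votes=18000,
                     trailer_share=0.53))  # negative: outcome flips
```

This is why the number of blanks matters so much: when the pool of missing votes is fifty times the margin, almost any plausible allocation of them changes the winner.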

While I’m not a lawyer, my understanding of Florida election law is that the summary screen, displayed before the voter casts a vote, is what really matters. If the summary screen showed no vote in the race and the voter missed it before casting the ballot, then that’s tough luck for them. If, however, the proper thing was displayed on the summary screen and things went wrong afterward, then there would be a legal basis under Florida law to reverse the election.

Florida’s court system never got far enough to make this call. The judge refused to even allow the plaintiffs access to the machines in order to conduct their own investigation. Consequently, Jennings took her case directly to Congress, which has the power to seat its own members. The last time this particular mechanism was used to overturn an election was in 1985. It’s unclear exactly what standard Congress must use when making a decision like this. Should they use Florida’s standard? Should they impose their own standard? Good question.

Okay, then. On to the GAO’s report. GAO did three tests:

  1. They sampled the machines to verify that the firmware inside them was the firmware that was supposed to be there. They also “witnessed” the source code being compiled and yielding the same thing as the firmware being used. Nothing surprising was found.
  2. They cast a number of test ballots. Everything worked.
  3. They deliberately miscalibrated some iVotronic systems in a variety of different ways and cast some more test votes. They found the machines were “difficult to use”, but that the summary screens were accurate with respect to the voter’s selections.
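The firmware check in the first test amounts to comparing cryptographic fingerprints: build the image from the escrowed source, hash it, and hash what was actually read off each sampled machine. A minimal sketch (the image bytes and machine IDs here are made up):

```python
import hashlib

# Sketch of a firmware-verification check: compare the hash of the image
# built from escrowed source against the image read off each sampled
# machine. All data below is a stand-in, not anything from the GAO tests.

def fingerprint(image: bytes) -> str:
    return hashlib.sha256(image).hexdigest()

def verify_machines(reference_image: bytes, sampled_images: dict) -> list:
    """Return IDs of machines whose firmware differs from the build."""
    expected = fingerprint(reference_image)
    return [machine_id for machine_id, image in sampled_images.items()
            if fingerprint(image) != expected]

build = b"\x7fELF...firmware built from escrowed source"   # stand-in bytes
samples = {"machine-01": build,
           "machine-02": build,
           "machine-03": b"\x7fELF...something else"}       # mismatch
print(verify_machines(build, samples))  # flags only machine-03
```

Note what this does and doesn’t establish: it shows the sampled machines ran the expected code, but it says nothing about how that code behaved for real voters, which is why the missing human-factors testing matters.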

What they didn’t do:

  • They didn’t conduct any controlled human-subjects tests in which participants cast simulated votes. Such a test, while difficult and expensive to perform, would let us quantify the extent to which voters are confused by different aspects of the voting system’s user interface.
  • They didn’t examine any of the warehoused machines for evidence of miscalibration. They speculate that grossly miscalibrated machines would have been detected in the field and would have been either recalibrated or taken out of service. They suggest that two such machines were, in fact, taken out of service.
  • They didn’t go through any of ES&S’s internal change logs or trouble tickets. If ES&S knows more, internally, about what may have caused this problem, they’re not saying and GAO was unable to learn more.
  • For the tests that they did conduct, GAO didn’t describe the setup and execution in enough detail for us to reasonably critique whether the tests were done properly.

GAO’s conclusions are actually rather mild. All they’re saying is that they have some confidence that the machines in the field were running the correct software, and that the software doesn’t seem to induce failures. GAO has no opinion on whether poor human factors played a role, nor do they offer any opinion on what the legal implications of poor human factors would be in terms of who should have won the race. Absent any sort of “smoking gun” (and, yes, 18,000 undervotes apparently didn’t make quite enough smoke on their own), it would seem unlikely that the Committee on House Administration would vote to overturn the election.

Meanwhile, you can expect ES&S and others to use the GAO report as some sort of vindication of the iVotronic in particular, or of paperless DRE voting systems in general. Don’t buy it. Even if Sarasota’s extreme undervote rate wasn’t itself sufficient to throw out this specific election result, it still represents compelling evidence that the voting system, as a whole, substantially failed to capture the intent of Sarasota’s voters. Finally, the extreme effort invested by Sarasota County, the State of Florida, and the GAO demonstrates the fundamental problem with the current generation of paperless DRE voting systems: when problems occur, it’s exceptionally difficult to diagnose them. There simply isn’t enough information left behind to determine what really happened during the election.

Other articles on today’s news: CNet News, Bradenton Herald, Sarasota Herald-Tribune, NetworkWorld, Miami Herald (AP wire story), VoteTrustUSA

UPDATE (2/12): Ted Selker (MIT Media Lab) has a press release online that describes human factors experiments with a Flash-based mock-up of the Sarasota CD-13 ballot. They appear to have found undervote rates of comparable magnitude to those observed in Sarasota. A press release is very different from a proper technical report, much less a conference or journal publication, so it’s inappropriate to look to this press release as “proof” of any sort of “ballot blindness” effect.

Google Objects to Microhoo: Pot Calling Kettle Black?

Last week Microsoft offered to buy Yahoo at a big premium over Yahoo’s current stock price; and Google complained vehemently that Microsoft’s purchase of Yahoo would reduce competition. There’s been tons of commentary about this. Here’s mine.

The first question to ask is why Microsoft made such a high offer for Yahoo. One possibility is that Microsoft thinks the market had drastically undervalued Yahoo, making it a good investment even at a big markup. This seems unlikely.

A more plausible theory is that Microsoft thinks Yahoo is a lot more valuable when combined with Microsoft than it would be on its own. Why might this be? There are two plausible theories.

The synergy theory says that combining Yahoo’s businesses with Microsoft’s businesses creates lots of extra value, that is that the whole is much more profitable than the parts would be separately.

The market structure theory says that Microsoft benefits from Yahoo’s presence in the market (as a counterweight to Google), and that Microsoft, worried that Yahoo’s market position was starting to slip, acted to prop up Yahoo by giving it credible access to capital and strong management. In this theory, Microsoft cares less (or not at all) about actually combining the businesses, and wants mostly to keep Google from capturing Yahoo’s market share.

My guess is that both theories have some merit – that Microsoft’s offer is both offensive (seeking synergies) and defensive (maintaining market structure).

Google objected almost immediately that a Microsoft-Yahoo merger would reduce competition to the extent that government should intervene to block the merger or restrict the conduct of the merged entity. The commentary on Google’s complaint has focused on two points. First, at least in some markets, two-way competition between Microhoo and Google might be more vigorous than the current three-way competition between a dominant Google and two rivals. Second, even assuming that the antitrust authorities ultimately reject Google’s argument and allow the merger to proceed, government scrutiny will delay the merger and distract Microsoft and Yahoo, thereby helping Google.

Complaining has downsides for Google too – a government skeptical of acquisitions by dominant high-tech companies could easily turn around and cause antitrust headaches for Google down the road.

So why is Google complaining, despite this risk? The most intriguing possibility is that Google is working the refs. Athletes and coaches often complain to the referee about a call, knowing that the ref won’t change the call, but hoping to generate some sympathy that will pay off next time a close call has to be made. Suppose Google complains, and the government rejects its complaint. Next time Google makes an acquisition and the government starts asking questions, Google can argue that if the government didn’t do anything about the Microhoo merger, then it should lay off Google too.

It’s fun to toss around these Machiavellian theories, but I doubt Google actually thought all this through before it reacted. Whatever the explanation, now that it has reacted, it’s stuck with the consequences of its reaction – just as Microsoft is stuck, for better or worse, with its offer to buy Yahoo.