2010 Predictions Scorecard

We’re running a little behind this year, but as we do every year, we’ll review the predictions we made for 2010. Below you’ll find our predictions from 2010 in italics, and the results in ordinary type. Please notify us in the comments if we missed anything.

(1) DRM technology will still fail to prevent widespread infringement. In a related development, pigs will still fail to fly.

We win again! There are many examples, but one we specifically predicted came true: HDCP was cracked. Guess what our first prediction for 2011 will be? Verdict: Right.

(2) Federated DRM systems, such as DECE and KeyChest, will not catch on.

Work on DECE (now renamed UltraViolet) continues to roll forward, with what appears to be broad industry support. It remains to be seen whether UltraViolet devices will actually work well, but the format seems to have at least “caught on” among industry players. We haven’t been following this market too closely, but given that KeyChest seems mostly to be mentioned as an also-ran in UltraViolet stories, its chances don’t look as good. Verdict: Mostly wrong.

(3) Content providers will crack down on online sites that host unlicensed re-streaming of live sports programming. DMCA takedown notices will be followed by a lawsuit claiming actual knowledge of infringing materials and direct financial benefits.

Like their non-live brethren, live streaming sites like Justin.tv have received numerous DMCA takedown notices for copyrighted content. At the time of this prediction, we were unaware of the lawsuit against Ustream by a boxing promotional company, which began in August 2009. Nonetheless, the trend has continued. In the UK, there was an active game of cat-and-mouse between sports organizations and illegal live restreaming sites for football (ahem: soccer) and cricket, sports that make much of their revenue from selling tickets to live matches. In some cases, pubs were temporarily closed when their licenses were suspended in the face of complaints from content providers. In the US, Zuffa, the parent company of the mixed martial arts production company Ultimate Fighting Championship, sued when a patron at a Boston bar connected his laptop to one of the bar’s TVs to stream a UFC fight from an illicit site (Zuffa is claiming $640k in damages). In July, Zuffa subpoenaed the IP addresses of people uploading its content. And last week UFC sued Justin.tv directly for contributory and vicarious infringement, inducement, and other claims (RECAP docket). Verdict: Mostly right.

(4) Major newspaper content will continue to be available online for free (with ads) despite cheerleading for paywalls by Rupert Murdoch and others.

Early last year, the New York Times announced its intention to introduce a paywall in January 2011. That plan still seems to be on track, but the wall didn’t actually go up in 2010. The story is the same at the Philly Inquirer, which is considering a paywall but hasn’t put one in place. The Wall Street Journal was behind a paywall already. Other major papers, including the Los Angeles Times, the Washington Post, and USA Today, seem to be paywall-free. The one major paper we could find that did go behind a paywall is the Times of London, which did so in July, with predictably poor results. Verdict: Mostly right.

(5) The Supreme Court will strike down pure business model patents in its Bilski opinion. The Court will establish a new test for patentability, rather than accepting the Federal Circuit’s test. The Court won’t go so far as to ban software patents, but the implications of the ruling for software patents will be unclear and will generate much debate.

The Supreme Court struck down the specific patent at issue in the case, but it declined to invalidate business method patents more generally. It also failed to articulate a clear new test. The decision did generate plenty of debate, but that went without saying. Verdict: Wrong.

(6) Patent reform legislation won’t pass in 2010. Calls for Congress to resolve the post-Bilski uncertainty will contribute to the delay.

Another prediction that works every year. Verdict: Right.

(7) After the upcoming rulings in Quon (Supreme Court), Comprehensive Drug Testing (Ninth Circuit or Supreme Court) and Warshak (Sixth Circuit), 2010 will be remembered as the year the courts finally extended the full protection of the Fourth Amendment to the Internet.

The Supreme Court decided Quon on relatively narrow grounds, deferring the broader Fourth Amendment questions about electronic privacy, and the Ninth Circuit in Comprehensive Drug Testing dismissed the lower court's privacy-protective guidelines for electronic searches. However, the big privacy decision of the year was Warshak, in which the Sixth Circuit ruled strongly in favor of the privacy of remotely stored e-mail. Paul Ohm said of the decision: “It may someday be seen as a watershed moment in the extension of our Constitutional rights to the Internet.” Verdict: Mostly right.

(8) Fresh evidence will come to light of the extent of law enforcement access to mobile phone location-data, intensifying the debate about the status of mobile location data under the Fourth Amendment and electronic surveillance statutes. Civil libertarians will call for stronger oversight, but nothing will come of it by year’s end.

Even though we didn’t learn anything significantly new about the extent of government access to mobile location data, the debate around “cell-site” tracking privacy certainly intensified in Congress, in the courts, and in the public eye. The issue gained significant public attention through a trio of pro-privacy victories in the federal courts, and Congress held a hearing on ECPA reform that focused specifically on location-based services. Despite the efforts of the Digital Due Process Coalition, no bills were introduced in Congress to reform and clarify electronic surveillance statutes. Verdict: Mostly right.

(9) The FTC will continue to threaten to do much more to punish online privacy violations, but it won’t do much to make good on the threats.

As a student of the FTC’s Chief Technologist, I’m not touching this one with a ten-foot pole.

(10) The new Apple tablet will be gorgeous but expensive. It will be a huge hit only if it offers some kind of advance in the basic human interface, such as a really effective full-sized on-screen keyboard.

Gorgeous? Check. Expensive? Check. Huge hit? Check. Advance in the basic human interface? The Reality Distortion Field forces me to say “yes.” Verdict: Mostly right.

(11) The disadvantages of iTunes-style walled garden app stores will become increasingly evident. Apple will consider relaxing its restrictions on iPhone apps, but in the end will offer only rhetoric, not real change.

Apple’s iPhone faced increasingly strong competition from Google’s rival Android platform, and it’s possible this could be attributed to Google’s more liberal policies for allowing apps to run on Android devices. Still, iPhones and iPads continued to sell briskly, and we’re not aware of any major problems arising from Apple’s closed business model. Verdict: Wrong.

(12) Internet Explorer’s usage share will fall below 50 percent for the first time in a decade, spurred by continued growth of Firefox, Chrome, and Safari.

There’s no generally accepted yardstick for browser usage share, because there are so many different ways to measure it. But Wikipedia has helpfully aggregated browser usage share statistics. All five metrics listed there show Internet Explorer’s usage share falling by between 5 and 10 percentage points over the last year, with current values between 41 and 61 percent. The mean of these statistics is 49.5 percent, and the median is 46.94 percent. Verdict: Right.

(13) Amazon and other online retailers will be forced to collect state sales tax in all 50 states. This will have little impact on the growth of their business, as they will continue to undercut local bricks-and-mortar stores on prices, but it will remove their incentive to build warehouses in odd places just to avoid having to collect sales tax.

State legislators continue to introduce proposals to tax out-of-state retailers, but Amazon has fought hard against these proposals, and so far the company has largely kept them at bay. Verdict: Wrong.

(14) Mobile carriers will continue locking consumers in to long-term service contracts despite the best efforts of Google and the handset manufacturers to sell unlocked phones.

Google’s experiment selling the Nexus One directly to consumers via the web ended in failure after about four months. T-Mobile, traditionally the least restrictive of the national wireless carriers, recently made it harder for consumers to find its no-contract “Even More Plus” plans. It’s still possible to get an unlocked phone if you really want one, but you have to pay a hefty premium, and few consumers are bothering. Verdict: Right.

(15) Palm will die, or be absorbed by Research In Motion or Microsoft.

This prediction was almost right. Palm’s Web OS didn’t catch on, and in April the company was acquired by a large technology firm. However, that firm was HP, not RIM or Microsoft. Verdict: Half right.

(16) In July, when all the iPhone 3G early adopters are coming off their two-year lock-in with AT&T, there will be a frenzy of Android and other smartphone devices competing for AT&T’s customers. Apple, no doubt offering yet another version of the iPhone at the time, will be forced to cut its prices, but will hang onto its centralized app store. Android will be the big winner in this battle, in terms of gained market share, but there will be all kinds of fragmentation, with different carriers offering slightly different and incompatible variants on Android.

Almost everything we predicted here happened. The one questionable prediction is the price cut, but we’re going to say that this counts. Verdict: Right.

(17) Hackers will quickly sort out how to install their own Android builds on locked-down Android phones from all the major vendors, leading to threatened or actual lawsuits but no successful legal action taken.

The XDA Developers Forum continues to be the locus for this type of Android hacking, and this year it did not disappoint. The Droid X was rooted and the Droid 2 was rooted, along with many other Android phones. The much-anticipated T-Mobile G2 came with a new hardware-based lock-down mechanism. HTC wasn’t initially forthcoming with its legally mandated publication of the modifications to the Linux source code that implemented this mechanism, but it relented after a Freedom to Tinker post generated some heat. The crack took about a month, and now G2 owners are able to install their own Android builds. Verdict: Right.

(18) Twitter will peak and begin its decline as a human-to-human communication medium.

We’re not sure how to measure this prediction, but Twitter recently raised another $200 million in venture capital and its users sent 25 billion tweets in 2010. That doesn’t look like decline to us. Verdict: Wrong.

(19) A politician or a candidate will commit a high-profile “macaca”-like moment via Twitter.

We can’t think of any good examples of high-profile cases that severely affected a politician’s prospects in the 2010 elections the way the “macaca” comment affected George Allen’s 2006 Senate campaign. However, there were a number of lower-profile gaffes, including Sarah Palin’s call for peaceful Muslims to “refudiate” the “Ground Zero Mosque” (the New Oxford American Dictionary named refudiate its word of the year); then-Senator Chris Dodd’s staff mis-tweeting inappropriate comments; and a glitch that caused monitoring software at the U.S. embassy in Beijing to tweet that the air quality one day was “crazy bad”. Verdict: Mostly wrong.

(20) Facebook customers will become increasingly disenchanted with the company, but won’t leave in large numbers because they’ll have too much information locked up in the site.

In May 2010, Facebook once again changed its privacy policy to make more Facebook user information available to more people. On two occasions, Facebook has faced criticism for leaking user data to advertisers. But the site doesn’t seem to have declined in popularity. Verdict: Right.

(21) The fashionable anti-Internet argument of 2010 will be that the Net has passed its prime, supplanting the (equally bogus) 2009 fad argument that the Internet is bad for literacy.

Wired declared the web dead back in August. Is that the same thing as saying the Net has passed its prime? Bogus arguments all sound the same to us. Verdict: Mostly right.

(22) One year after the release of the Obama Administration’s Open Government Directive, the effort will be seen as a measured success. Agencies will show eagerness to embrace data transparency but will find the mechanics of releasing datasets to be long and difficult. Privacy (how to deal with personal information available in public data) will be one major hurdle.

Many people are calling this open government’s “beta period.” Federal agencies took the landmark step in January by releasing their first “high-value” datasets on Data.gov, but some advocates say these datasets are not “high value” enough. Agencies also published their plans for open government—some were better than others—and implementation of these promises has indeed been incremental. Privacy has been an issue in many cases, but it’s often difficult to know the reasons why an agency decides not to release a dataset. Verdict: Mostly right.

(23) The Open Government agenda will be the bright spot in the Administration’s tech policy, which will otherwise be seen as a business-as-usual continuation of past policies.

As we noted above, the Obama administration has had a pretty good record on open government issues. Probably the most controversial tech policy change has been the FCC’s adoption of new network neutrality rules. These weren’t exactly a continuation of Bush administration policies, but they also didn’t go as far as many activist groups wanted. And we can’t think of any other major tech policy changes. Verdict: Mostly right.

Our score: 7 right, 8 mostly right, 1 half right, 2 mostly wrong, 4 wrong.

Two Stories about the Comcast/Level 3 Dispute (Part 2)

In my last post I told a story about the Level 3/Comcast dispute that portrays Comcast in a favorable light. Now here’s another story that casts Comcast as the villain.

Story 2: Comcast Abuses Its Market Power

As Steve explained, Level 3 is an “Internet Backbone Provider.” Level 3 has traditionally been considered a tier 1 provider, which means that it exchanges traffic with other tier 1 providers without money changing hands, and bills everyone else for connectivity. Comcast, as a non-tier 1 provider, has traditionally paid Level 3 to carry its traffic to places Comcast’s own network doesn’t reach directly.

Steve is right that the backbone market is highly competitive. I think it’s worth unpacking in a bit more detail why that is. Let’s suppose that a Comcast user wants to download a webpage from Yahoo!, and that both Comcast and Yahoo! are customers of Level 3. Yahoo! sends its bits to Level 3, which passes them along to Comcast. And traditionally, Level 3 would bill both Yahoo! and Comcast for the service of moving data between them.

It might seem like Level 3 has a lot of leverage in a situation like this, so it’s worth considering what would happen if Level 3 tried to jack up its prices. There are reportedly around a dozen other tier 1 providers that exchange traffic with Level 3 on a settlement-free basis. This means that if Level 3 overcharges Comcast for transit, Comcast can go to one of Level 3’s competitors, such as Global Crossing, and pay that competitor to carry Comcast’s traffic to Level 3’s network. And since Global Crossing and Level 3 are peers, Level 3 gets nothing for handing off traffic to Global Crossing that’s ultimately bound for Comcast’s network.
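
To make that dynamic concrete, here is a toy model of Comcast's routing decision. The per-megabit prices are invented for illustration; the point is only that Level 3's price is capped by what its settlement-free peers charge.

```python
# Toy model of the transit-bargaining dynamic described above.
# All per-Mbps prices are hypothetical.

def cheapest_route(offers):
    """Pick the lowest-cost way to exchange traffic with Level 3's customers."""
    return min(offers, key=lambda offer: offer[1])

offers = [
    ("buy transit from Level 3 directly", 5.00),
    ("buy transit from Global Crossing, a settlement-free peer of Level 3", 3.00),
]

name, price = cheapest_route(offers)
print(f"Comcast's best option: {name} (${price:.2f}/Mbps)")
# If Level 3 raises its price, min() flips to a competitor, and Level 3
# ends up carrying the same traffic over its peering links for free.
```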

A decade ago, when Internet Service Retailers (to use Steve’s terminology) were much smaller than backbone providers, that was the whole story. The retailers didn’t have the resources to build their own global networks, and their small size meant they had relatively little bargaining power against the backbone providers. So the rule was that Internet Service Retailers charged their customers for Internet access, and then passed some of that revenue along to the backbone providers that offered global connectivity. There may have been relatively little competition in the retailer market, but this didn’t have much effect on the overall structure of the Internet because no single retailer had enough market power to go toe-to-toe with the backbone providers.

Do Not Track: Not as Simple as it Sounds

Over the past few weeks, regulators have rekindled their interest in an online Do Not Track proposal in hopes of better protecting consumer privacy. FTC Chairman Jon Leibowitz told a Senate Commerce subcommittee last month that Do Not Track is “one promising area” for regulatory action and that the Commission plans to issue a report in the fall about “whether this is one viable way to proceed.” Senator Mark Pryor (D-AR), who sits on the subcommittee, is also reportedly drafting a new privacy bill that includes some version of this idea: empowering consumers with blanket opt-out powers over online tracking.

Details are sparse at this point about how a Do Not Track mechanism might actually be implemented. There are a variety of possible technical and regulatory approaches to the problem, each with its own difficulties and limitations, which I’ll discuss in this post.

An Adaptation of “Do Not Call”

Because of its name, Do Not Track draws immediate comparisons to arguably the most popular piece of consumer protection regulation ever instituted in the US—the National Do Not Call Registry. If the FTC were to take an analogous approach for online tracking, a consumer would register his device’s network identifier—its IP address—with the national registry. Online advertisers would then be prohibited from tracking devices that are identified by those IP addresses.

Of course, consumer devices rarely have persistent long-term IP addresses. Most ISPs assign IP addresses dynamically (using DHCP) and a single device might be assigned a new IP address every few minutes. Consumer devices often also share the same IP address at the same time (using NAT) so there’s no stable one-to-one mapping between IPs and devices. Things could be different with IPv6, where each device could have its own stable IP address, but the Do Not Call framework, directly applied, is not the best solution for today’s online world.
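
A minimal sketch of both failure modes, with made-up addresses and device names:

```python
# Why an IP-keyed registry misfires under NAT and DHCP (all values invented).

registry = {"198.51.100.7"}  # the IP a consumer registered as "do not track"

# NAT: two devices share the registered public address at the same time.
nat_devices = {"alice-laptop": "198.51.100.7", "bob-phone": "198.51.100.7"}
for device, ip in nat_devices.items():
    print(device, "treated as opted out:", ip in registry)
# Both devices match, whether or not their owners ever registered.

# DHCP: the registering device is later leased a new address.
new_lease = "203.0.113.42"
print("original device still opted out:", new_lease in registry)
# False -- the opt-out silently lapsed when the lease changed.
```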

The comparison is still useful though, if only to caution against the assumption that Do Not Track will be as easy, or as successful, as Do Not Call. The differences between the problems at hand and the technologies involved are substantial.

A Registry of Tracking Domains

Back in 2007, a coalition of online consumer privacy groups lobbied for the creation of a national Do Not Track List. They proposed a reverse approach: online advertisers would be required to register with the FTC all domain names used to issue persistent identifiers to user devices. The FTC would then publish this list, and it would be up to the browser to protect users from being tracked by these domains. Notice that the onus here is fully on the browser—equipped with this list—to protect the user from being uniquely identified. Meanwhile, online advertisers would still have free rein to try any method they wish to track user behavior, so long as it happens from these tracking domains.
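
To see where that onus lands, here is a minimal sketch of the browser's side of the scheme; the registry contents and domain names are invented:

```python
# Browser-side enforcement against an FTC-published list of tracking
# domains (domain names invented for illustration).

TRACKING_DOMAINS = {"ads.tracker-example.com", "metrics.example.net"}

def prepare_request(host, headers):
    """Strip cookies, one kind of persistent identifier, for listed domains."""
    if host in TRACKING_DOMAINS:
        headers = {k: v for k, v in headers.items() if k.lower() != "cookie"}
    return headers

print(prepare_request("ads.tracker-example.com",
                      {"Cookie": "uid=12345", "Accept": "*/*"}))
# {'Accept': '*/*'} -- yet fingerprinting and other side channels,
# discussed below, pass through this filter untouched.
```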

We’ve learned over the past couple of years that modern browsers, from a practical perspective, can be limited in their ability to protect the user from unique identification. The starkest example of this is the browser fingerprinting attack, popularized by the EFF earlier this year. In this attack, the tracking site runs a special script that gathers information about the browser’s configuration, which is unique enough to identify the browser instance in nearly every case. The attack takes advantage of the fact that much of the gathered information is used frequently for legitimate purposes—such as determining which plugins are available to the site—so a browser that blocks the release of this information would surely irritate the user. As these kinds of “side-channel” attacks grow in sophistication, major browser vendors might always be playing catch-up in the technical arms race, leaving most users vulnerable to some form of tracking by these domains.
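
A minimal sketch of the idea behind such an attack, with made-up attribute values:

```python
import hashlib

# Fingerprinting: hash together attributes the browser reveals for
# legitimate reasons. No cookie is ever set. (Values are made up.)
attributes = [
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.10",  # User-Agent
    "Flash 10.1; QuickTime 7.6; Java 1.6",              # plugin list
    "1920x1080x24",                                     # screen geometry
    "America/New_York",                                 # time zone
    "Arial; Helvetica; Times New Roman",                # font list
]

fingerprint = hashlib.sha256("|".join(attributes).encode()).hexdigest()
print(fingerprint[:16])
# The combination is distinctive enough to re-identify the same browser
# on later visits, even with cookies disabled.
```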

The x-notrack Header

If we believe that browsers, on their own, will be unable to fully protect users, then any effective Do Not Track proposal will need to place some restraints on server tracking behavior. Browsers could send a signal to the tracking server to indicate that the user does not want this particular interaction to be tracked. The signaling mechanism could be in the form of a standard pre-defined cookie field or, more likely, an HTTP header that marks the user’s tracking preference for each connection.

In the simplest case, the HTTP header—call it x-notrack—is a binary flag that can be turned on or off. The browser could enable x-notrack for every HTTP connection, only for connections to third-party sites, or for connections to some set of user-specified sites. Upon receiving the signal not to track, the site would be prevented, by FTC regulation, from setting any persistent identifiers on the user’s machine or using any other side-channel mechanism to uniquely identify the browser and track the interaction.
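
Sketched in Python, the exchange might look something like this. The x-notrack header is the hypothetical one proposed above, not a real standard, and the server logic is only what a compliant tracker would be obliged to do:

```python
# Client side: attach the opt-out flag to an outgoing request.
import urllib.request

req = urllib.request.Request("http://tracker.example/ad.js",
                             headers={"x-notrack": "1"})
# urllib.request.urlopen(req)  # sent on every connection, or only to
                               # third parties, per the user's preference

# Server side: a compliant tracker checks the flag before setting any
# persistent identifier.
def respond(request_headers):
    response_headers = {}
    if request_headers.get("x-notrack") != "1":
        response_headers["Set-Cookie"] = "uid=12345; Max-Age=31536000"
    return response_headers

print(respond({"x-notrack": "1"}))  # {} -- no identifier set
print(respond({}))                  # tracking cookie set as usual
```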

While this approach seems simple, it could raise a few complicated issues. One issue is bifurcation: nothing would prevent sites from offering limited content or features to users who choose to opt out of tracking. One could imagine a divided Web, where a user who turns on the x-notrack header for all HTTP connections—i.e. a blanket opt-out—would essentially turn off many of the useful features on the Web.

By being more judicious in the use of x-notrack, a user could permit silos of first-party tracking in exchange for individual feature-rich sites, while limiting widespread tracking by third parties. But many third parties offer useful services, like embedding videos or integrating social media features, and they might require that users disable x-notrack in order to access their services. Users could theoretically make a privacy choice for each third party, but such a reality seems antithetical to the motivations behind Do Not Track: to give consumers an easy mechanism to opt out of harmful online tracking in one fell swoop.
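
A sketch of what those per-party choices might look like under the hood, with invented site names; the number of entries a user would need to maintain is exactly the problem:

```python
# Per-site x-notrack policy (site names invented). Unlisted first
# parties are allowed; unlisted third parties are refused.
policy = {
    "news.example.com":   "allow",  # a first party the user trusts
    "videos.example.org": "allow",  # a third party whose embeds she wants
}

def extra_headers(host, is_third_party):
    default = "deny" if is_third_party else "allow"
    if policy.get(host, default) == "allow":
        return {}
    return {"x-notrack": "1"}

print(extra_headers("news.example.com", is_third_party=False))  # {}
print(extra_headers("ads.example.net",  is_third_party=True))   # {'x-notrack': '1'}
```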

The FTC could potentially remedy this scenario by including some provision for “tracking neutrality,” which would prohibit sites from unnecessarily discriminating against a user’s choice not to be tracked. I won’t get into the details here, but suffice it to say that crafting a narrow yet effective neutrality provision would be highly contentious.

Privacy Isn’t a Binary Choice

The underlying difficulty in designing a simple Do Not Track mechanism is the subjective nature of privacy. What one user considers harmful tracking might be completely reasonable to another. Privacy isn’t a single binary choice but rather a series of individually considered decisions, each depending on who the tracking party is, how much information can be combined, and what the user gets in return for being tracked. This makes the general concept of online Do Not Track—or any blanket opt-out regime—a fairly awkward fit. Users need simplicity, but whether simple controls can adequately capture the nuances of individual privacy preferences is an open question.

Another open question is whether browser vendors can eventually “win” the technical arms race against tracking technologies. If so, regulations might not be necessary, as innovative browsers could fully insulate users from unwanted tracking. While tracking technologies are currently winning this race, I wouldn’t call it a foregone conclusion.

The one thing we do know is this: Do Not Track is not as simple as it sounds. If regulators are serious about putting forth a proposal, and it sounds like they are, we need to start having a more robust conversation about the merits and ramifications of these issues.

Broadband Politics and Closed-Door Negotiations at the FCC

The last seven days at the FCC have been drama-filled, and that’s not something you can often say about an administrative agency. As I noted in my last post, the FCC is considering reclassifying broadband as a “common carrier” service. This would subject the access portion of the service to some additional regulations which currently do not apply, but have (to some extent) been applied in the past. Last Thursday, the FCC voted 3-2 along party lines to pursue a Notice of Inquiry about this approach and others, in order to help solidify its ability to enforce consumer protections and implement the National Broadband Plan in the wake of the Comcast decision in the DC Circuit Court. There was a great deal of politicking and rhetoric around the vote. Then, on Monday, the Wall Street Journal reported that lobbyists were engaged in closed-door meetings at the FCC, discussing possible legislative compromises that would obviate the need for reclassification. This led to public outcry from everyone who was not involved in the meetings, and allegations of misconduct by the FCC for its failure to disclose the meetings. If you sit through my description of the intricacies of reclassification, I promise to give you the juicy bits about the controversial meetings.

The Reclassification Vote and the NOI
As I explained in my previous post, the FCC faces a dilemma. The DC Circuit said it did not have the authority under Title I of the Communications Act to enforce the broadband openness principles it espoused in 2005. This cast doubt not only on the FCC’s ability to police violations of the principles but also on its ability to implement many portions of the National Broadband Plan. In the past, the Commission would have had unquestioned authority under Title II of the Act, but in a series of decisions from 2002-2007 it voluntarily “deregulated” broadband by classifying it as a Title I service. Chairman Genachowski has floated what he calls a “Third Way” approach in which broadband would no longer be classified as a Title I service, but would instead be classified under Title II with extensive “forbearance” from portions of that title.

From a legal perspective, the main question is whether the FCC has the authority to reclassify the transmission component of broadband internet service as a Title II service. This gets into intricacies of how broadband service fits into statutory definitions of “information service” (aka Title I), “telecommunications”, “telecommunications service” (aka Title II), and the like. I was going to lay these out in detail, but in the interest of getting to the juicy stuff I will simply direct you to Harold Feld’s excellent post. For the “Third Way” approach to work, the FCC’s interpretation of a “telecommunications service” will have to be articulated to include broadband internet access while not also swallowing a variety of internet services that everyone thinks should remain unregulated — sites like Facebook, content delivery networks like Akamai, and digital media providers like Netflix. However, the definition must not be so narrow that the FCC lacks jurisdiction to police the types of practices it is concerned about (for instance, providers should not be able to discriminate in their delivery of traffic simply by moving the discrimination from the network’s transport layer to its logical layer, or by partnering with an affiliated “ISP” that does the discrimination for them). I am largely persuaded by Harold’s arguments, but the AT&T lobbyists present the other side as well. One argument that I don’t see anyone making (yet) is that, presuming the transmission component is subject to Title II, the FCC would seem to have a much stronger argument for exercising ancillary jurisdiction with respect to interrelated components like non-facilities-based ISPs that rely on that transmission component.

The other legal debate involves an even more arcane discussion about whether — assuming there is a “telecommunications service” offered as part of broadband service — that “telecommunications service” is something that can be regulated separately from the other “information services” (Title I) that might be offered along with it. This includes things like an email address from your provider, DNS, Usenet, and the like. Providers have historically argued that these were inseparable from the internet access component, and the so-called “Stevens Report” of 1998 introduced the notion that the “inextricably intertwined” nature of broadband service might have the result of classifying all such services as entirely Title I “information services.” To the extent that this ever made any sense, it is far from true today. What consumers believe they are purchasing is access to the internet, and all of those other services are clearly extricable from a definitional and practical standpoint (indeed, customers can and do opt for competitors for all of them on a regular basis).

But none of these legal arguments are at the fore of the current debate, which is almost entirely political. Witness, for example, John Boehner’s claim that the “Third Way” approach was a “government takeover of the Internet,” Fred Upton’s (R-MI) claim that the approach is a “blind power grab,” modest Democratic sign-on to an industry-penned and reasoning-free opposition letter, and an attempt by Republican appropriators to block funding for the FCC unless they swore off the approach. This prompted a strong response from Democratic leaders indicating that any such effort would not see the light of day. Ultimately, the FCC voted in favor of the NOI to explore the issue. Amidst this tumult, the WSJ reported that the FCC had started closed-door meetings with industry representatives in order to discuss a possible legislative compromise.

Possible Legislation and Secret Meetings
It is not against the rules to communicate with the FCC about active proceedings. Indeed, such communications are part of a healthy policymaking process that solicits input from stakeholders. The FCC typically conducts proceedings under the “permit but disclose” regime in which all discussions pertaining to the given proceeding must be described in “ex parte” filings on the docket. Ars has a good overview of the ex parte regime. The NOI passed last week is subject to these rules.

It therefore came as a surprise that a subset of industry players were secretly meeting with the FCC to discuss possible legislation that could make the NOI irrelevant. The lapse is all the more egregious given that the FCC just conducted a proceeding on improving ex parte disclosures, in which the Chairman remarked:

“Given the complexity and importance of the issues that come before us, ex parte communications remain an essential part of our deliberative process. It is essential that industry and public stakeholders know the facts and arguments presented to us in order to express informed views.”

The Chairman’s Chief of Staff Edward Lazarus sought to explain away the obligation for ex parte disclosure, but nevertheless attached a brief disclosure letter from the meeting attendees that didn’t describe any of the details. There is perhaps a case to be made that the legislative options do not directly fall under the subject matter of the NOI, but even if this position were somehow legally justifiable, it clearly falls afoul of the policy intent of the ex parte rules. Harold Feld has a great post in which he describes his nomination for “Worsht Ex Parte Ever”. The letter attached to the Lazarus post would certainly take the title if it were a formal ex parte letter. The industry participants in the meetings deserve some criticism, but ultimately the problems can only be resolved by the FCC itself, by demanding comprehensive openness rather than perpetuating a culture of loopholes.

The public outcry continues, from both public interest groups and in the comments on the Lazarus post. If it’s true that the FCC admits internally that “they f*cked up”, they should do far more to regain the public’s trust in the integrity of the notice-and-comment process.

Update: The Lazarus post was just updated to replace the link to the brief disclosure letter with two new links to letters that describe themselves as Ex Parte letters. The first contains the exact same text as the original, and the second has a few bullet points.

Android Open Source Model Has a Short Circuit

[Update: Google subsequently worked out a mechanism that allows Cyanogen and others to distribute their mods separate from the Google Apps.]

Last year, Google entered the mobile phone market with a Linux-based mobile operating system. The company brought together device manufacturers and carriers in the Open Handset Alliance, explaining that, “Together we have developed Android™, the first complete, open, and free mobile platform.” There has been considerable engagement from the open source developer community, as well as significant uptake from consumers. Android may have even been instrumental in motivating competing open platforms like LiMo. In addition to the underlying open source operating system, Google chose to package essential (but proprietary) applications with Android-based handsets. These applications include most of the things that make the handsets useful (including basic functions to sync with the data network). This two-tier system of rights has created a minor controversy.

A group of smart open source developers created a modified version of the Android+Apps package, called Cyanogen. It incorporated many useful and performance-enhancing updates to the Android OS, and included unchanged versions of the proprietary Apps. If Cyanogen hadn’t included the Apps, the package would have been essentially useless, given that Google doesn’t appear to provide a means to install the Apps on a device that has only the basic OS. As Cyanogen gained popularity, Google decided that it could no longer stand by while the project distributed Google’s copyright-protected works. The lawyers at Google sent a Cease & Desist letter to the Cyanogen developer, which caused him to take the files off his site and spurred a backlash from the developer community.

Android represents a careful balance on the part of Google, in which the company seeks to foster open platforms but maintain control over its proprietary (but free) services. Google has stated as much, in response to the current debate. Android is an exciting alternative to the largely closed-source model that has dominated the mobile market to date. Google closely integrated their Apps with the operating system in a way that makes for a tremendously useful platform, but in doing so hampered the ability of third-party developers to fully contribute to the system. Perhaps the problem is simply that they did not choose the right location to draw the line between open vs. closed source — or free-to-distribute vs. not.

The latter distinction might offer a way out of the conundrum. Google could certainly grant blanket rights to third parties to redistribute unchanged versions of their Apps. This might compromise their ability to make certain business arrangements with carriers or handset providers in which they package the software for a fee. That may or may not be worth it from their business perspective, but they could have trouble making the claim that Android is a “complete, open, and free mobile platform” if they don’t find a way to make it work for developers.

This all takes place in the context of a larger debate over the extent to which mobile platforms should be open — voluntarily or via regulatory mandate. Google and Apple have been arguing via letters to the FCC about whether or not Apple should allow the Google Voice application in the iPhone App Store. However, it is yet to be determined whether the Commission has the jurisdiction and political will to do anything about the issue. There is a fascinating sideshow in that particular dispute, in which AT&T has made the very novel claim that Google Voice violates network neutrality (well, either that or common carriage — they’ll take whichever argument they can win). Google has replied. This is a topic for another day, but suffice it to say the clear regulatory distinctions between telephone networks, broadband, and devices have become muddied.

(Cross-posted to Managing Miracles)