July 25, 2017

AdNauseam, Google, and the Myth of the “Acceptable Ad”

Earlier this month, we (Helen Nissenbaum, Mushon Zer-Aviv, and I) released a new and improved AdNauseam 3.0. For those not familiar, AdNauseam is the adblocker that clicks every ad in an effort to obfuscate tracking profiles and inject doubt into the lucrative economic system that drives advertising-based surveillance. The 3.0 release contains some new features we’ve been excited to discuss with users and critics, but the discussion was quickly derailed when we learned that Google had banned AdNauseam from its store, where it had been available for the past year. We also learned that Google now prevents users from manually installing or updating AdNauseam on Chrome, effectively locking them out of their own saved data, all without prior notice or warning.

Whether or not you are a fan of AdNauseam’s strategy, it is disconcerting to know that Google can quietly make one’s extensions and data disappear at any moment, without so much as a warning. Today it is a privacy tool that is disabled, but tomorrow it could be your photo album, chat app, or password manager. You don’t just lose the app; you lose your stored data as well: photos, chat transcripts, passwords, etc. For developers, who, incidentally, must pay a fee to post items in the Chrome store, this should give pause. Not only can your software be banned and removed without warning, with thousands of users left in the lurch, but all comments, ratings, reviews, and statistics are deleted as well.

When we wrote Google to ask the reason for the removal, they responded that AdNauseam had breached the Web Store’s Terms of Service, stating that “An extension should have a single purpose that is clear to users”[1]. However, the sole purpose of AdNauseam seems readily apparent to us—namely to resist the non-consensual surveillance conducted by advertising networks, of which Google is a prime example. Now we can certainly understand why Google would prefer users not to install AdNauseam, as it opposes their core business model, but the Web Store’s Terms of Service do not (at least thus far) require extensions to endorse Google’s business model. Moreover, this is not the justification cited for the software’s removal.

So we are left to speculate as to the underlying cause of the takedown. Our guess is that Google’s real objection is to our newly added support for the EFF’s Do Not Track mechanism[2]. For anyone unfamiliar, this is not the ill-fated DNT of yore, but a new, machine-verifiable (and potentially legally binding) assertion on the part of websites that they will not violate the privacy of users who choose to send the DNT header. A new generation of blockers, including the EFF’s Privacy Badger and now AdNauseam, has built-in support for this mechanism, which means that by default they don’t block ads and other resources from DNT sites and, in the case of AdNauseam, don’t simulate clicks on these ads.
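To make the mechanism concrete, here is a minimal sketch (in Python, not the JavaScript the extensions themselves use) of the kind of check a DNT-aware blocker performs: fetch the policy text a site publishes at the EFF’s well-known location and compare it against known versions of the EFF policy. The hash set and the function name are illustrative placeholders, not AdNauseam’s or Privacy Badger’s actual code.

```python
import hashlib
import urllib.request

# Placeholder digests standing in for published versions of the EFF DNT
# policy text; a real blocker ships the actual known-good hashes.
KNOWN_POLICY_HASHES = {"<sha1-of-a-published-eff-dnt-policy>"}

def declares_dnt_compliance(domain: str) -> bool:
    """Return True if `domain` posts a recognized copy of the EFF
    Do Not Track policy at the well-known location."""
    url = f"https://{domain}/.well-known/dnt-policy.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            policy_text = resp.read()
    except OSError:
        return False  # no policy file: treat the site as non-committal
    return hashlib.sha1(policy_text).hexdigest() in KNOWN_POLICY_HASHES

# A blocker would consult a check like this before deciding whether to
# block (or, in AdNauseam's case, click) ads served from the domain.
```

Sites that pass such a check have made a public, verifiable commitment, which is why these blockers can leave their ads alone by default.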

So why is this so threatening to Google? Perhaps because it could represent a real means for users, advertisers, and content-providers to move away from surveillance-based advertising. If enough sites commit to Do Not Track, there will be significant financial incentive for advertisers to place ads on those sites, and these too will be bound by DNT, as the mechanism also applies to a site’s third-party partners. And this could possibly set off a chain reaction of adoption that would leave Google, which has committed to surveillance as its core business model, out in the cold.

But wait, you may be thinking, why did the EFF develop this new DNT mechanism when there is already Adblock Plus’ “Acceptable Ads” program, in which Google and other major ad networks participate?

That’s because there are crucial differences between the two. For one, “Acceptable Ads” is pay-to-play; large ad networks pay Eyeo, the company behind Adblock Plus, to whitelist their sites. But the more important reason is that the program is all about aesthetics—so-called “annoying” or “intrusive” ads—which the ad industry would like us to believe is the only problem with the current system. An entity like Google is fine with “Acceptable Ads” because they have more than enough resources to pay for whitelisting[3]. Further, they are quite willing to make their ads more aesthetically acceptable to users (after all, an annoyed user is unlikely to click)[4]. What they refuse to change (though we hope we’re wrong about this) is their commitment to surreptitious tracking on a scale never before seen. And this, of course, is what we, the EFF, and a growing number of users find truly “unacceptable” about the current advertising landscape.

 

[1]  In the one subsequent email we received, a Google representative stated that a single extension should not perform both blocking and hiding. This is difficult to accept at face value, as nearly all ad blockers (including uBlock, Adblock Plus, AdBlock, Adguard, etc., all of which are allowed in the store) also perform blocking and hiding of ads, trackers, and malware. Update (Feb 17, 2017): It has been a month since we received any message from Google, despite repeated requests for clarification, and despite the fact that they claim, in a recent Consumerist article, to be “in touch with the developer to help them resubmit their extension to get included back in the store.”

[2] This is indeed speculation. However, as mentioned in [1], the stated reason for Google’s ban of AdNauseam does not hold up to scrutiny.

[3]  In September of this year, Eyeo announced that it would partner with a UK-based ad tech startup called ComboTag to launch the “Acceptable Ads Platform”, through which it would also act as an ad exchange, selling placements for “Acceptable Ad” slots. Google, as might be expected, reacted negatively, stating that it would no longer do business with ComboTag. Some assumed that this might signal an end to Google’s participation in “Acceptable Ads” as well. However, this does not appear to be the case. Google still comprises a significant portion of the exception list on which “Acceptable Ads” is based and, as one ad industry observer put it, “Google is likely Adblock Plus’ largest, most lucrative customer.”

[4]  Google is also a member of the “Coalition for Better Ads”, an industry-wide effort which, like “Acceptable Ads”, focuses exclusively on issues of aesthetics and user experience, as opposed to surveillance and data profiling.

 

NYC to Collect GPS Data on Car Service Passengers—Good Intentions Gone Awry or Something Else?

During the holiday season, New York City, through its Taxi & Limousine Commission (the “TLC”), proposed a new rule expanding data reporting obligations for car service platform companies, including Uber and Lyft. If the rule is adopted, car services will have to report the GPS coordinates of both passenger pick-up and drop-off locations to the city government. Under NY’s Freedom of Information Law, that data in bulk will also be subject to full public release.

This proposal is either a classic case of good intentions gone awry or a clandestine effort to track millions of car service riders while riding roughshod over passenger privacy.

The stated justification for the new rule is to combat “driver fatigue” and improve car service safety. While the goal is laudable and important, the proposed data collection does not match the purpose and makes no sense. Does anyone really think GPS data measures a driver’s hours on the job or is relevant to calculating a trip’s duration? If the data collection were really designed to address driver fatigue, then the relevant data would be shift length (driver start/stop times, ride durations, possibly trip origination), not pick-up/drop-off locations.
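To underline the mismatch, here is a minimal sketch, using hypothetical shift timestamps, of the calculation the stated goal actually requires: hours on duty come from start/stop times alone, and no coordinate ever enters the computation.

```python
from datetime import datetime

def hours_on_duty(shift_start: str, shift_end: str) -> float:
    """Driver fatigue is a function of time on duty, which needs only
    start/stop timestamps; no location data is involved."""
    fmt = "%Y-%m-%d %H:%M"
    start = datetime.strptime(shift_start, fmt)
    end = datetime.strptime(shift_end, fmt)
    return (end - start).total_seconds() / 3600

# Hypothetical shift record: 13.5 hours on duty, flagged without any GPS data.
print(hours_on_duty("2017-01-05 06:30", "2017-01-05 20:00"))
```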

Reporting this GPS data to the city government, though, poses a real and serious threat to passenger privacy. The ride patterns can be mined to identify specific individuals and where they travel. In 2014, for instance, The Guardian reported that the TLC released anonymized taxi ride data that was readily reverse-engineered to identify drivers. A 2015 paper shows that mobility patterns can also be used to infer gender and ethnicity. Numerous examples—from the Netflix release of subscriber film ratings that was reverse-engineered to identify subscribers, to the re-identification of patients from supposedly anonymous health records—show that bulk data can often be linked back to specific individuals. Disturbingly, the TLC proposal makes only one innocuous reference to protecting “privacy and confidentiality”, and yet it includes neither privacy safeguards against identification of individual passengers from ride patterns nor any exemption from the NY State Freedom of Information Law.
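The 2014 taxi incident illustrates how little “anonymization” can be worth. The released records replaced medallion numbers with unsalted MD5 hashes, and because medallions are drawn from short, known formats, every candidate can be hashed and matched. The sketch below assumes one such format (a digit, a letter, two digits) purely for illustration.

```python
import hashlib
from itertools import product
from string import ascii_uppercase, digits

def build_medallion_lookup() -> dict:
    """Hash every candidate medallion and index the results, so any
    'anonymized' hash in a released dataset maps straight back to an ID."""
    lookup = {}
    for d1, letter, d2, d3 in product(digits, ascii_uppercase, digits, digits):
        medallion = f"{d1}{letter}{d2}{d3}"
        lookup[hashlib.md5(medallion.encode()).hexdigest()] = medallion
    return lookup

# Only 10 * 26 * 10 * 10 = 26,000 candidates: the entire space is searched
# instantly, which is why unsalted hashing of a small identifier space is
# not anonymization at all.
lookup = build_medallion_lookup()
print(len(lookup))  # 26000
```

Pick-up/drop-off coordinates carry the same risk in a different form: even without any identifier, a few known trips (home to office, say) can single out one passenger in the released data.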

If this weren’t worrisome enough for privacy, here’s the flashing red light. The TLC proposal mentions in passing that the data might be useful for “other enforcement actions.” But the examples given for “other enforcement actions” do not map to the data being collected. For instance, the proposal says the GPS data “will facilitate investigating passenger complaints or complaints from a pedestrian or other motorist about unsafe driving, including for incidents alleged to have occurred during or between trips, by allowing TLC to determine the location of a vehicle at a particular time.” The pick-up and drop-off locations will not work for this goal. Likewise, the proposal says that “[b]y understanding when for-hire trips to and from the airports occur TLC can better target resources to ensure that passengers are picked up at the airport only by drivers authorized to do so.” This too is a strange justification for collecting individual passenger records for every ride throughout the city! The goal would be satisfied far more effectively by seeking aggregate drop-off data for the particular areas of concern to the TLC.
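As a rough sketch of that alternative, operators could report only coarse, zone-level counts rather than row-level trip records. The grid-cell size and the trip format below are assumptions for illustration, not anything specified in the proposal.

```python
from collections import Counter

def aggregate_dropoffs(trips, cell_size=0.01):
    """Collapse row-level drop-off coordinates into counts per coarse
    grid cell; only the counts would need to be reported."""
    counts = Counter()
    for lat, lon in trips:
        cell = (round(lat / cell_size) * cell_size,
                round(lon / cell_size) * cell_size)
        counts[cell] += 1
    return counts

# Hypothetical drop-offs near JFK collapse into a single cell count.
sample = [(40.6413, -73.7781), (40.6420, -73.7790), (40.6399, -73.7765)]
print(aggregate_dropoffs(sample))
```

Counts per zone answer the airport-enforcement question directly, while individual trip endpoints never leave the operator’s systems.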

This vague enforcement language and the mismatch between the proposal and the articulated goals strongly suggest that the rule may be a smokescreen for a new mass surveillance program targeting individuals traveling within New York City. Only two years ago, the NY Police Department was caught deploying a controversial program to track cars throughout the city using E-ZPass readers on traffic lights. This proposed new rule looks like a surreptitious expansion of that program to car service passengers. The TLC rule, if adopted, would provide a surveillance data trove that makes an end run around judicial oversight, subpoenas, and warrants.

It’s time to put the brakes on the city’s collection of trip location data for car service rides.