June 24, 2024

Deconstructing Google’s excuses on tracking protection

By Jonathan Mayer and Arvind Narayanan.

Blocking cookies is bad for privacy. That’s the new disingenuous argument from Google, trying to justify why Chrome is so far behind Safari and Firefox in offering privacy protections. As researchers who have spent over a decade studying web tracking and online advertising, we want to set the record straight.

Our high-level points are:

1) Cookie blocking does not undermine web privacy. Google’s claim to the contrary is privacy gaslighting.

2) There is little trustworthy evidence on the comparative value of tracking-based advertising.

3) Google has not devised an innovative way to balance privacy and advertising; it is latching onto prior approaches that it previously disclaimed as impractical.

4) Google is attempting a punt to the web standardization process, which will at best result in years of delay.

What follows is a reproduction of excerpts from yesterday’s announcement, annotated with our comments.

> Technology that publishers and advertisers use to make advertising even more relevant to people is now being used far beyond its original design intent – to a point where some data practices don’t match up to user expectations for privacy.

Google is trying to thread a needle here, implying that some level of tracking is consistent with both the original design intent for web technology and user privacy expectations. Neither is true.

If the benchmark is original design intent, let’s be clear: cookies were not supposed to enable third-party tracking, and browsers were supposed to block third-party cookies. We know this because the authors of the original cookie technical specification said so (RFC 2109, Section 4.3.5). 
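To make the design flaw concrete, here is a minimal, hypothetical simulation (the domain names and API are illustrative, not any real tracker’s code) of how a third-party cookie links one user’s visits across unrelated sites, and why blocking it breaks that linkage:

```python
import secrets

class TrackerServer:
    """Stands in for a third-party domain, e.g. tracker.example (hypothetical)."""
    def __init__(self):
        self.profiles = {}  # cookie value -> list of sites where it was seen

    def handle_request(self, cookie, embedding_site):
        # If the browser sent no cookie, mint a fresh identifier
        # (the real-world analogue is a Set-Cookie response header).
        if cookie is None:
            cookie = secrets.token_hex(8)
        self.profiles.setdefault(cookie, []).append(embedding_site)
        return cookie

class Browser:
    def __init__(self, block_third_party=False):
        self.block_third_party = block_third_party
        self.cookie_jar = {}  # tracker domain -> stored cookie value

    def visit(self, site, tracker_domain, tracker):
        # The page on `site` embeds a resource (an ad or pixel) from the tracker.
        sent = None if self.block_third_party else self.cookie_jar.get(tracker_domain)
        received = tracker.handle_request(sent, site)
        if not self.block_third_party:
            self.cookie_jar[tracker_domain] = received

sites = ["news.example", "shop.example", "health.example"]

# With third-party cookies allowed, one identifier links all three visits:
tracker = TrackerServer()
browser = Browser()
for site in sites:
    browser.visit(site, "tracker.example", tracker)
assert list(tracker.profiles.values()) == [sites]

# With third-party cookies blocked, every visit looks like a new user:
tracker2 = TrackerServer()
blocked = Browser(block_third_party=True)
for site in sites:
    blocked.visit(site, "tracker.example", tracker2)
assert all(len(visits) == 1 for visits in tracker2.profiles.values())
```

The second half of the sketch is the whole argument for cookie blocking: without the shared cookie, the tracker cannot link the visits into a browsing profile.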

Similarly, if the benchmark is user privacy expectations, let’s be clear: study after study has demonstrated that users don’t understand and don’t want the pervasive web tracking that occurs today. 

> Recently, some other browsers have attempted to address this problem, but without an agreed upon set of standards, attempts to improve user privacy are having unintended consequences.

This is clearly a reference to Safari’s Intelligent Tracking Prevention and Firefox’s Enhanced Tracking Protection, which we think are laudable privacy features. We’ll get to the unintended consequences claim.

> First, large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.

To appreciate the absurdity of this argument, imagine the local police saying, “We see that our town has a pickpocketing problem. But if we crack down on pickpocketing, the pickpocketers will just switch to muggings. That would be even worse. Surely you don’t want that, do you?”

Concretely, there are several things wrong with Google’s argument. First, while fingerprinting is indeed a privacy invasion, that’s an argument for taking additional steps to protect users from it, rather than throwing up our hands in the air. Indeed, Apple and Mozilla have already taken steps to mitigate fingerprinting, and they are continuing to develop anti-fingerprinting protections.

Second, protecting consumer privacy is not like protecting security—just because a clever circumvention is technically possible does not mean it will be widely deployed. Firms face immense reputational and legal pressures against circumventing cookie blocking. Google’s own privacy fumble in 2012 offers a perfect illustration of our point: Google implemented a workaround for Safari’s cookie blocking; the workaround was spotted (in part by one of us), and Google had to settle enforcement actions with the Federal Trade Commission and state attorneys general. Afterward, Google didn’t double down—it completely backed away from tracking cookies for Safari users. Based on peer-reviewed research, including our own, we’re confident that fingerprinting continues to represent a small proportion of overall web tracking. And there’s no evidence of an increase in the use of fingerprinting in response to other browsers deploying cookie blocking.

Third, even if a large-scale shift to fingerprinting is inevitable (which it isn’t), cookie blocking still provides meaningful protection against third parties that stick with conventional tracking cookies. That’s better than the defeatist approach that Google is proposing.
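For readers unfamiliar with the technique Google invokes, here is a hedged sketch of the core mechanism (the attribute names and values are purely illustrative, not drawn from any actual tracker): quasi-stable device attributes are hashed into a single identifier that survives cookie clearing.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    # Canonicalize the attribute set and hash it. Any two browsers with an
    # identical configuration collide, but in practice the combination is
    # often distinctive enough to re-identify a user across sites.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Illustrative attribute values, not real measurements:
browser_a = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/68.0",
    "screen": "2560x1440x24",
    "timezone": "America/New_York",
    "fonts": "Arial,Helvetica,Liberation Serif",
}

# Clearing cookies changes none of these attributes, so the identifier
# is the same before and after:
assert fingerprint(browser_a) == fingerprint(dict(browser_a))

# A different device configuration yields a different identifier:
browser_b = dict(browser_a, screen="1920x1080x24")
assert fingerprint(browser_a) != fingerprint(browser_b)
```

The sketch also illustrates the mitigation strategy Apple and Mozilla pursue: the fewer distinguishing attributes a browser exposes (or the more it coarsens them), the more users collide on the same hash and the less useful the identifier becomes.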

This isn’t the first time that Google has used disingenuous arguments to suggest that a privacy protection will backfire. We’re calling this move privacy gaslighting, because it’s an attempt to persuade users and policymakers that an obvious privacy protection—already adopted by Google’s competitors—isn’t actually a privacy protection.

> Second, blocking cookies without another way to deliver relevant ads significantly reduces publishers’ primary means of funding, which jeopardizes the future of the vibrant web. Many publishers have been able to continue to invest in freely accessible content because they can be confident that their advertising will fund their costs. If this funding is cut, we are concerned that we will see much less accessible content for everyone. Recent studies have shown that when advertising is made less relevant by removing cookies, funding for publishers falls by 52% on average.

The overt paternalism here is disappointing. Google is taking the position that it knows better than users—if users had all the privacy they want, they wouldn’t get the free content they want more. So no privacy for users.

As for the “recent studies” that Google refers to, that would be one paragraph in one blog post presenting an internal measurement conducted by Google. There is a glaring omission of the details of the measurement that are necessary to have any sort of confidence in the claim. And as long as we’re comparing anecdotes, the international edition of the New York Times recently switched from tracking-based behavioral ads to contextual and geographic ads—and it did not experience any decrease in advertising revenue.

Independent research doesn’t support Google’s claim either: the most recent academic study suggests that tracking only adds about 4% to publisher revenue. This is a topic that merits much more research, and it’s disingenuous for Google to cherry pick its own internal measurement. And it’s important to distinguish the economic issue of whether tracking benefits advertising platforms like Google (which it unambiguously does) from the economic issue of whether tracking benefits publishers (which is unclear).

> Starting with today’s announcements, we will work with the web community to develop new standards that advance privacy, while continuing to support free access to content. Over the last couple of weeks, we’ve started sharing our preliminary ideas for a Privacy Sandbox – a secure environment for personalization that also protects user privacy. Some ideas include new approaches to ensure that ads continue to be relevant for users, but user data shared with websites and advertisers would be minimized by anonymously aggregating user information, and keeping much more user information on-device only. Our goal is to create a set of standards that is more consistent with users’ expectations of privacy.

There is nothing new about these ideas. Privacy preserving ad targeting has been an active research area for over a decade. One of us (Mayer) repeatedly pushed Google to adopt these methods during the Do Not Track negotiations (about 2011-2013). Google’s response was to consistently insist that these approaches are not technically feasible. For example: “To put it simply, client-side frequency capping does not work at scale.” We are glad that Google is now taking this direction more seriously, but a few belated think pieces aren’t much progress.
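Client-side frequency capping, the very example Google once called infeasible at scale, is conceptually simple. The sketch below is our own minimal, hypothetical illustration (not Google’s or any ad platform’s actual design): the impression counter lives on the user’s device, so no cross-site identifier ever needs to leave the browser.

```python
class OnDeviceFrequencyCap:
    """Hypothetical on-device frequency cap: state never leaves the browser."""
    def __init__(self, cap: int):
        self.cap = cap
        self.impressions = {}  # campaign id -> impression count, stored locally

    def should_show(self, campaign_id: str) -> bool:
        # Record an impression and allow the ad only while under the cap.
        count = self.impressions.get(campaign_id, 0)
        if count >= self.cap:
            return False
        self.impressions[campaign_id] = count + 1
        return True

capper = OnDeviceFrequencyCap(cap=3)
shown = [capper.should_show("acme-shoes") for _ in range(5)]
assert shown == [True, True, True, False, False]
```

The hard parts Google alluded to are real (syncing caps across a user’s devices, resisting fraud), but they are engineering challenges for the ad platform, not reasons the browser must broadcast a tracking identifier.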

We are also disappointed that the announcement implicitly defines privacy as confidentiality. It ignores that, for some users, the privacy concern is behavioral ad targeting—not the web tracking that enables it. If an ad uses deeply personal information to appeal to emotional vulnerabilities or exploits psychological tendencies to generate a purchase, then that is a form of privacy violation—regardless of the technical details. 

> We are following the web standards process and seeking industry feedback on our initial ideas for the Privacy Sandbox. While Chrome can take action quickly in some areas (for instance, restrictions on fingerprinting) developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time. They require significant thought, debate, and input from many stakeholders, and generally take multiple years.

Apple and Mozilla have tracking protection enabled, by default, today. And Apple is already testing privacy-preserving ad measurement. Meanwhile, Google is talking about a multi-year process for a watered-down form of privacy protection. And even that is uncertain—advertising platforms dragged out the Do Not Track standardization process for over six years, without any meaningful output. If history is any indication, launching a standards process is an effective way for Google to appear to be doing something on web privacy, but without actually delivering. 

In closing, we want to emphasize that the Chrome team is full of smart engineers passionate about protecting their users, and it has done incredible work on web security. But it is unlikely that Google can provide meaningful web privacy while protecting its business interests, and Chrome continues to fall far behind Safari and Firefox. We find this passage from Shoshana Zuboff’s The Age of Surveillance Capitalism to be apt:

“Demanding privacy from surveillance capitalists or lobbying for an end to commercial surveillance on the internet is like asking old Henry Ford to make each Model T by hand. It’s like asking a giraffe to shorten its neck, or a cow to give up chewing. These demands are existential threats that violate the basic mechanisms of the entity’s survival.”

It is disappointing—but regrettably unsurprising—that the Chrome team is cloaking Google’s business priorities in disingenuous technical arguments.

Thanks to Ryan Amos, Kevin Borgolte, and Elena Lucherini for providing comments on a draft.


  1. You mentioned
    “As for the ‘recent studies’ that Google refers to, that would be one paragraph in one blog post presenting an internal measurement conducted by Google. There is a glaring omission of the details of the measurement that are necessary to have any sort of confidence in the claim.”
    This is not objective. Google’s blog post cited a technical report, which includes details supporting the measured effect of tracking-based ads. Google also acknowledged the independent study you mention, which reaches the contradictory conclusion that “tracking only adds about 4% to publisher revenue,” and commented:
    “We believe that the difference in results may be partially attributed to the fact that their analysis was performed on a single publisher (in contrast to the larger scale of previous studies), and in part due to the inherent challenges associated with the nature of observational studies… Researchers at Google are in communication with the authors to better understand their methodology and the difference in results.”

  2. Alan Perkins says

    > If an ad uses deeply personal information to appeal to emotional vulnerabilities or exploits psychological tendencies to generate a purchase, then that is a form of privacy violation — regardless of the technical details.

    Most ads appeal to emotional “vulnerabilities” or exploit psychological “tendencies” to generate a purchase, whether or not there are technical details involved. E.g. billboards do exactly the same. They simply work on the assumption that they’ll be relevant to some rather than all of the audience. The ad itself needs to be legal, no matter what the media.

    Once you accept advertising is OK then relevance becomes the issue – do you as the viewer want to see relevant or irrelevant ads? Do publishers want to show their readers relevant or irrelevant ads? Do advertisers want to pay for their ads to be seen by people who are or aren’t interested in them?

    If emotional vulnerabilities or psychological tendencies are the concern then one ought to look to what’s being advertised. Cigarettes, alcohol, drugs, gambling, diet supplements and many other products and services are prohibited in Google Ads, which is in part because of the perceived vulnerabilities of the customer. In the main, Google uses targeted ads to sell things like insurance, when people need insurance, or travel, when people are ready to take a vacation. I don’t see much of a problem with that, certainly not on the scale you are suggesting. It definitely needs watching and they definitely need to be held to account but, when it comes to ad networks, Google is one of the best behaved, not the worst!

  3. Ad van Loon says

    For marketing purposes there really is no need to process personal data. What is essential for marketers is that they can provide meaningful offers to members of their target audience(s). The current thinking is that, in order to do so, they need to collect personal data on each and every member of those target groups. However, what they really need is information about the preferences of those persons. They can invite members of their target audience(s) to connect to them anonymously (for example, by scanning a QR code). Despite the anonymity of the connection, a communication channel is created, and those who choose to connect can be asked to make their preferences known (without sharing their personal data). The advantage of this alternative approach is that people remain in control of their personal data, and that they will only receive offers in which they are actually interested, for only as long as they are interested in those offers. For marketers, the advantage is that they can be certain that their marketing messages are 100% desired, 100% relevant and reach 100% of the target group. Because of this they can expect a higher conversion rate. And, as no personal data is collected, the approach is in full conformity with the EU General Data Protection Regulation.

  4. Brett Glass says

    You’re fired! Your university and department receive funds from Google, and so this truthful article is likely to cause you to be dismissed or encouraged to leave. But such is the cost of intellectual honesty. You’ll find better jobs elsewhere.

  5. Mike Linksvayer says

    > And it’s important to distinguish the economic issue of whether tracking benefits advertising platforms like Google (which it unambiguously does) from the economic issue of whether tracking benefits publishers (which is unclear).

    Could someone explain how tracking *unambiguously* benefits ad platforms other than via benefits to advertisers and hence to publishers (via higher ad rates)?

    Commenter Roberto above gives two reasons, which amount to one reason: advertisers are convinced that tracking is valuable, even if it isn’t. Is there evidence of this? If it is true, how is it unambiguous that ad platforms benefit but not publishers? Because the supply of publishers/content is highly elastic, so anything that increases ad rates unambiguously benefits platforms, with no or unclear benefits to publishers? That seems somewhat plausible to me, but for “unambiguous” I’d need to see some evidence. Is there research showing this?

    (Above is mostly a question of idle curiosity, as my personal opinion is that even if it were true that tracking benefits publishers, I would want to suppress tracking, because I want content made to sell ads to disappear; the existence and availability of such content is a bad. Taxing ads would also help.)

  6. Thanks for calling attention to Google’s proposal and the point for responses to it. My main reservation is the tendency to moosh together a) Google’s transparent business model – nearly entirely reliant on user data aggregation and repackaging for sale, using Chrome as both a direct data collector and vehicle for Google sites and services, with b) the huge proliferation of tracking cookies (and other tactics like “fingerprinting”) by third-party agents and services themselves, which all major browsers permit, though some are curtailing some techniques to some extent. In a lot of cases, those third-party services are only offered conditional on permitting the privacy intrusion.

    A secondary consideration has to be that some users are privacy-concerned, some users are indifferent or even eager to sell personal data for perceived pay-offs, and some users are simply unaware – children, the elderly or otherwise functionally challenged. Generally, the EU’s GDPR style of mandatory opt-in – with adjustments – seems to be the most practical approach to advising and soliciting competent users. Any ordinarily competent end-user using Chrome without bothering to respond to Google’s flashing pop ups reminding them about privacy settings, and without installing third-party extensions to clean or whitelist cookies selectively, knows what they could be doing and are not doing.

  7. One additional note I wanted to add: Google warns about the threat of “fingerprinting” if cookies are blocked. However, they offer no solutions to the threat of “fingerprinting” when they talk about their vaporware Privacy Sandbox.

    Also, I don’t think one needs Google to help target advertising. If you run a website that focuses on cooking, you don’t need an algorithm to tell you to advertise pots, pans, cookbooks, etc.

  8. Thank you so much for this great and easy read!
    An insightful yet short comment on the Google and tracking case.

    I just tripped over this half sentence; I guess there might be an “any” missing in front of the “more”?
    ‘they wouldn’t get the free content they want more.’

  9. Yuhong Bao says

    Will you consider covering my essay/overview on Google:

  10. Albert Daweson says

    Privacy is important and thank you for your dedication to push the frontier of privacy forward.

    You make a strong argument, yet in the end you take a completely oppositional stance towards Google’s arguments and don’t give an inch to their POV. So this feels like Google bashing that fails to enter into a potentially crucial monetization discussion about the health of the web (problem? no problem here!).

    This seems to be the crucial sentence from your piece:

    “And it’s important to distinguish the economic issue of whether tracking benefits advertising platforms like Google (which it unambiguously does) from the economic issue of whether tracking benefits publishers (which is unclear).”

    The conclusion of this line of thinking is that the only interest in tracking is for Google to surveil the user. Consider this thought experiment:

    If Google is not able to deliver better monetizing advertising to publishers using that data, what value does Google gain collecting additional data from the user? Don’t you think Google could add many billions to its market capitalization if it gave up on all the regulatory risks and brand threats associated with its personalized ads products? Why wouldn’t they just gather less data? In this climate, that actually seems like the right thing to do if your assertions are true. If you are going to conjecture that this data only has value to Google, you should also explain how exactly Google benefits from it because without a good thesis for that, Google’s behavior looks irrational.

    Google definitely has a lot of data and deep perspective on the dynamics of this environment since it uniquely sees both user and advertiser behavior. They can’t/won’t share it externally much beyond what they have. You obviously should be vigilant and expect self-serving arguments from them. But if you care about the web as an open ecosystem, you may also want to be more open-minded about their arguments. Apple doesn’t care a whit about the health of the web (it would prefer if everyone used an app, for which they control distribution). Mozilla is irrelevant and just trying to hang on (and ultimately most of their funding comes from Google anyway).

    Media on the web may actually be in a precarious position. What’s your solution?

    • Tracking cookies do what to preserve “the web as an open ecosystem” exactly? I can’t think of how they make the web open or an ecosystem.

    • My two cents: it is entirely possible that these practices benefit Google even though they do not benefit the publishers much. First, it suffices that Google is perceived as the best; it can monetize that perception even if the benefit to its customers is small or nonexistent. Second, the competition here is over razor-thin gains in user clicks. It is entirely possible that these practices contribute only marginally to the relevant metrics, but that margin is exactly what is needed to overcome Google’s competitors. In that situation Google would gain a great deal even if the practices were only marginally useful to publishers.

    • > Media on the web may actually be in a precarious position. What’s your solution?

      This was already answered in the article: there is no evidence that advertisers have a hard requirement for tracking-based advertising. They do not need it for television, newspapers, or much of internet advertising. The New York Times disabled tracking-based advertising in the EU and did not see significant negative outcomes.

    • The trouble is there *was* a healthy web prior to the surveillance web, so there is no doubt about its possibility.

    • Apple doesn’t care a whit about the health of the web? Can you source that claim, please? Apple open sourced its Safari browser engine, WebKit, which itself was based on the open source KHTML engine. I think they care about the health of the web very much, as no one would claim apps are the best way to consume all of the web content in existence. No one. Sounds like a strawman argument to me.

  11. Johnny Ryan says


    • Your reasoning is fallacious: you compare cookies and fingerprints to pickpocketing and muggings. While the latter two are equivalent, the former are not. You can’t just “take further steps to block fingerprinting”: cookies are a single standard, while fingerprinting encompasses a range of obscure techniques for implementing tracking, and blocking all of them would be very difficult.

      I agree that third-party cookies should be blocked, but that only removes one tracking technique. Click-streams, local storage, image data, location, screen resolution, and other specific data points are and will continue to be collected.

      • Miles, I think you’re reading too much into this simple analogy. It all boils down to “the existence of a second problem doesn’t justify not fighting the first problem,” and on that specific point both pairs are equivalent.