Google’s Threatening Argument against Blocking Third-Party Cookies

Because Google cares so much about your privacy, they have announced a new standards initiative to increase privacy on the web.  They call it the Privacy Sandbox.  You will be shocked to learn that it is a mixed bag, containing proposals that are deeply problematic for privacy.  There’s a good discussion of some of the pros and cons at EFF.  Here I just want to remark on a very bad, quite self-serving argument made in their proposal: that blocking third-party cookies is bad for privacy.

Third-party cookies are basically small pieces of data that get stored in your browser by a party other than the site you are visiting.  So you visit bighats.com and there is a cookie from, say, Google that allows them to learn what you’re up to there.  This helps people like, um, Google, place ads that target your behavior.  Recently, browsers such as Firefox have started blocking third-party cookies by default.  Google thinks this is bad.  Of course it is bad for them, but the surprising part of the argument is that they maintain it is bad for privacy.  They say:
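To make the mechanism concrete, here is a minimal sketch of the kind of `Set-Cookie` header a tracker’s server might send when a page on bighats.com loads one of its resources.  The cookie name `trk_id` and the domain `tracker.example` are made up for illustration; the `SameSite=None; Secure` attributes are what modern browsers require before they will send a cookie in cross-site requests at all.

```javascript
// Hypothetical: the kind of response header a third-party tracking server
// might send so its cookie rides along on future cross-site requests.
// "trk_id" and "tracker.example" are illustrative names, not a real service.
function buildTrackingCookieHeader(userId) {
  return [
    `trk_id=${userId}`,
    "Domain=tracker.example",
    "Path=/",
    "Max-Age=31536000",  // persist for roughly a year
    "SameSite=None",     // required for the cookie to be sent cross-site
    "Secure",            // SameSite=None cookies must also be Secure
  ].join("; ");
}

console.log(buildTrackingCookieHeader("abc123"));
// trk_id=abc123; Domain=tracker.example; Path=/; Max-Age=31536000; SameSite=None; Secure
```

Once that header is set, every page embedding a resource from tracker.example sends the same `trk_id` back, which is exactly what lets the third party stitch your visits together — and exactly what default blocking prevents.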

…large scale blocking of cookies undermine people’s privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.

Basically, the argument is that you shouldn’t protect yourself from cookies because then companies will track you by more dastardly means.  While there is an argument here, there is also the hint of a threat.  Surely Google doesn’t mean to say that if you don’t let them track you this way they’ll track you in a way that gives you less choice, so shut up and stop blocking?  Nah. They wouldn’t say that.  Even though they are the world’s biggest user of this sort of information and therefore the biggest threat if things get worse privacy-wise, surely they’re just making an argument, not a threat.

Even if we give them the benefit of the doubt, the fact that this argument has been made by dictators and oppressors throughout history should give us pause.  The form: you’d better accept Bad because otherwise Worse!  (“Don’t remove the star of David, because then we’ll have to really make sure we know where you are.”)  The obvious response is that we should have the option to protect ourselves both from the Bad and the Worse!  Of course, if the Worse really is inevitable without the Bad, the argument might be persuasive, but it clearly isn’t inevitable that we will be fingerprinted if we don’t accept third-party cookies.  If a company with Google’s influence decided that’s not how things should go, I doubt things would go that way.  In addition, there are ways to foil fingerprinting.  Not only can one block fingerprinting scripts, but I suspect it’s not difficult to generate masks that present a false fingerprint.  This doesn’t seem to be an insoluble problem.
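The masking idea can be sketched too: if the browser mixes a per-site (or per-session) secret into the attributes it reports, the derived identifier differs from site to site and can no longer link visits together.  This is a toy illustration under my own assumptions — the hash, the attribute names, and the per-site-key strategy are mine, not how any shipping browser implements its anti-fingerprinting.

```javascript
// Same simple deterministic hash as before (illustrative choice only).
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// Hypothetical masking scheme: salt the observable attributes with a
// secret that differs per site, so each tracker sees an unrelated ID.
function maskedFingerprint(attrs, perSiteKey) {
  const s = Object.entries(attrs).sort().map(([k, v]) => `${k}=${v}`).join("|");
  return fnv1a(`${perSiteKey}:${s}`);
}

const attrs = {
  userAgent: "Mozilla/5.0 (X11; Linux x86_64)",
  fonts: "Arial,Helvetica",
};

// Same browser, two different sites: the tracker gets two identifiers
// that cannot be matched across websites.
console.log(maskedFingerprint(attrs, "bighats.com"));
console.log(maskedFingerprint(attrs, "example.org"));
```

The design point is that the mask must be stable per site (so legitimate uses still work within one site) but secret and different across sites, which breaks exactly the cross-site matching the fingerprint exists to do.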

Our response to a mass move to more coercive forms of tracking shouldn’t be to volunteer to be tracked, but to raise hell about those more coercive forms, demanding policy protections or simply boycotting websites that employ things like fingerprinting.  The fact that Google makes self-serving arguments like this–arguments that sound suspiciously like threats–should make you think twice about playing in their sandbox.