Google has announced a new initiative that aims to “fundamentally enhance privacy on the web.”
The proposal — dubbed “Privacy Sandbox” — is a stab at preventing extensive tracking of users on the web through cookies and other methods like tracking pixels and browser and device fingerprinting.
By creating a new standard that puts users in control of their data, Google hopes to strike a balance between personalization and privacy.
“Technology that publishers and advertisers use to make advertising even more relevant to people is now being used far beyond its original design intent — to a point where some data practices don’t match up to user expectations for privacy,” Chrome’s engineering director Justin Schuh said.
Schuh stressed that the absence of a common anti-tracking standard among browser makers is having unintended consequences, such as a fall in publisher revenues.
But if all of this sounds familiar, that’s because the development comes exactly a week after Apple outlined a similar anti-tracking policy that strikes at the heart of how digital advertising functions today.
Google wants a middle ground
The search giant doesn’t want to unilaterally block all cookies that are used to keep tabs on your every move as you hop from one site to the other.
Google argues that advertising still underpins an open web, citing a study showing that publishers lose an average of 52 percent of their advertising revenue when readers block tracking cookies.
But it doesn’t want advertisers and marketers to embrace practices like fingerprinting either. Fingerprinting is what happens when information such as a device’s hardware configuration and browser settings is used to identify and track a user.
While you can opt out of third-party cookie tracking through features built into browsers like Chrome, Safari, and Firefox, you can’t prevent companies from fingerprinting you — unless you keep changing your device’s configuration on a regular basis, which isn’t feasible.
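To see why fingerprinting is so hard to escape, consider a minimal Python sketch. The attribute names and helper below are hypothetical, and real fingerprinting scripts combine far more signals, but the principle is the one described above: hashing a handful of stable device traits yields an identifier that persists across visits without any cookie.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Derive a stable identifier from device/browser attributes.

    A toy illustration only: real fingerprinting scripts combine dozens
    of signals (canvas rendering, installed fonts, audio stack, etc.).
    """
    # Sort the keys so the same configuration always hashes identically.
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# The same configuration yields the same identifier on every visit, so a
# site can recognize the user without storing anything on their device.
config = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "UTC+2",
    "fonts": "Arial,Helvetica,Times",
}
print(fingerprint(config))
```

Clearing cookies does nothing here; only changing the underlying configuration changes the identifier, which is why regularly reconfiguring your device is the only (impractical) defense.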
Instead, Google is proposing a new solution called Privacy Sandbox that protects your privacy while still giving advertisers a way to show you targeted ads without resorting to privacy-violating practices like fingerprinting.
Differential privacy to the rescue — again
To minimize the data leakage associated with device fingerprinting, Google is borrowing from differential privacy (DP) — a statistical technique that makes it possible to collect and share aggregate information about users while safeguarding individual privacy.
This is achieved by injecting random noise into the data, producing results that aren’t quite exact but are accurate enough to glean insights. The noise limits information leakage, but it doesn’t eliminate it entirely.
Because the total leakage grows every time the database is queried, more noise has to be added to keep it in check.
The tradeoff between accuracy and privacy — which manifests as a “privacy budget” — underpins the very idea of differential privacy. In the words of Johns Hopkins University professor Matthew Green:
The total allowed leakage is often referred to as a ‘privacy budget,’ and it determines how many queries will be allowed (and how accurate the results will be). The basic lesson of DP is that the devil is in the budget. Set it too high, and you leak your sensitive data. Set it too low, and the answers you get might not be particularly useful.
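Green’s point can be made concrete with a short Python sketch using the Laplace mechanism, a standard DP building block. The class and numbers below are purely illustrative, not Google’s actual design: each query spends part of a fixed epsilon budget, a smaller per-query epsilon means noisier answers, and once the budget runs out, no further queries are allowed.

```python
import random

class PrivateCounter:
    """Answer count queries with Laplace noise under a finite privacy budget.

    A minimal sketch of epsilon-differential privacy. Privacy Sandbox's
    real budget accounting is not public; names here are illustrative.
    """
    def __init__(self, true_count: int, total_epsilon: float):
        self.true_count = true_count
        self.remaining = total_epsilon  # the "privacy budget"

    def query(self, epsilon: float) -> float:
        if epsilon > self.remaining:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= epsilon  # each query spends part of the budget
        # The difference of two i.i.d. exponentials with rate epsilon is
        # Laplace noise with scale 1/epsilon: smaller epsilon, more noise.
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return self.true_count + noise

counter = PrivateCounter(true_count=1000, total_epsilon=1.0)
print(round(counter.query(epsilon=0.5)))  # roughly 1000, plus noise
```

This is exactly the tradeoff Green describes: a generous budget lets many accurate queries through and leaks more; a tight budget protects the user but buries the signal in noise.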
To prevent fingerprinting, Google intends to use the privacy budget to limit the information websites can obtain through API calls, revealing only “enough information to narrow a user down to a group sufficiently large enough to maintain anonymity.” Once the budget is exhausted, websites can no longer obtain any further information.
Eventually, this will be available as an open-source browser extension and will allow you to see three things: the kinds of data being collected about you (and by whom and why), the advertiser responsible for the ad you’re seeing, and what caused it to appear.
That’s not all. Google also appears to be following in Apple’s footsteps with a privacy-preserving ad tracking method called Conversion Measurement with Aggregation. It seeks to limit advertisers’ cross-site tracking while still letting them measure the effectiveness of their ad campaigns on the web without compromising your privacy.
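Public details on Conversion Measurement with Aggregation are still sparse, but the core idea can be sketched in a few lines of Python. The function name and threshold below are hypothetical: the browser would report only campaign-level totals, never the link between an individual user and a conversion, and small buckets would be suppressed so a lone user can’t be singled out.

```python
from collections import Counter

def aggregate_report(events, min_count=100):
    """Aggregate per-user conversion events into campaign-level totals.

    A hypothetical sketch of aggregated measurement: only totals are
    reported, never the individual user-to-conversion links, and buckets
    below a minimum size are dropped to preserve anonymity.
    """
    totals = Counter(campaign for _user, campaign in events)
    # Suppress small buckets that could identify individual users.
    return {c: n for c, n in totals.items() if n >= min_count}

events = [("alice", "shoes_campaign"),
          ("bob", "shoes_campaign"),
          ("carol", "books_campaign")]
print(aggregate_report(events, min_count=2))  # {'shoes_campaign': 2}
```

The advertiser learns that two conversions came from the shoes campaign, but not who converted, and the single books-campaign conversion is withheld because it would point to one identifiable user.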
Putting users in control of their data
What Google, ultimately, is outlining is a privacy-focused initiative that puts users front and center — a tool that gives them the ability to see what data is collected and control how it is used.
For a company whose business model is built on tracking people’s activities online and sharing that (anonymized) data with advertisers — which then use it for targeted advertising — the move is bound to attract a fair bit of skepticism.
At this stage, Privacy Sandbox remains just a concept. But Google is seeking extensive feedback from browser developers, privacy advocates, publishers, and advertisers to take it forward.
“While Chrome can take action quickly in some areas (for instance, restrictions on fingerprinting) developing web standards is a complex process, and we know from experience that ecosystem changes of this scope take time,” Schuh said. “They require significant thought, debate, and input from many stakeholders, and generally take multiple years.”
Why now?
Google already has a long list of initiatives — federated learning, private join and compute, private set intersection, and confidential computing — all geared around improving privacy and security.
But Google, at its heart, is still an advertising company. If anything, the rash of proposals is emblematic of the wider public and regulatory scrutiny big tech faces to be more transparent about its data practices.
What’s more, Apple’s WebKit anti-tracking policy — which treats online tracking as a security vulnerability — has raised the stakes, forcing Google to respond with similar privacy-focused solutions or risk losing customer trust.
While one cannot deny it’s partly a ploy on the part of the search giant to retain users on Google Chrome (and in its larger ecosystem), the fact that it’s joining the tracking debate can only be a good thing.
Just as Facebook is struggling to convince users that its pivot to privacy after a string of data scandals is real, Google will have to do everything it can to bridge that trust gap and build privacy into its designs in a way that instills transparency and trust.