How Apple and Google are tackling their covid privacy problem



The basic system uses your phone’s Bluetooth to anonymously track who you have been in close proximity to (you can read more detail in our previous report). If you opt in to the system, your phone will spot when you’ve been near other people who get diagnosed with covid-19, as long as they also use the system. You won’t know their identity and they won’t know yours, but your phone will flash a notification letting you know you’ve been at risk of exposure. 
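The matching described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not the actual Apple-Google protocol (which derives rotating identifiers from daily keys using HKDF and AES, per the companies' published cryptography specification); the key names and the HMAC-based derivation here are stand-ins for illustration only.

```python
import hashlib
import hmac
import os

ROTATIONS_PER_DAY = 144  # identifiers rotate roughly every 10 minutes

def daily_key() -> bytes:
    """A random per-day key that never leaves the device unless the
    owner is diagnosed (simplified stand-in for the protocol's
    temporary exposure key)."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived Bluetooth identifier from the day key.
    (The real spec uses HKDF/AES; HMAC-SHA256 here is illustrative.)"""
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

# Phone A broadcasts a rotating identifier all day.
key_a = daily_key()
broadcast = {rolling_id(key_a, i) for i in range(ROTATIONS_PER_DAY)}

# Phone B overheard one identifier during a chance close encounter.
overheard = {rolling_id(key_a, 42)}

# If A is later diagnosed, A uploads only the day key. B re-derives
# every identifier that key could have produced and checks for a local
# match -- no names, locations, or contact graphs leave either phone.
rederived = {rolling_id(key_a, i) for i in range(ROTATIONS_PER_DAY)}
exposed = bool(overheard & rederived)
print(exposed)  # True -> show an exposure notification
```

The privacy property comes from the direction of the check: diagnosed users publish keys, and everyone else does the matching locally on their own device.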

It will be mid-May before the first true rollouts begin, but the information so far suggests that the Apple-Google system is clever and scalable. It is, however, a system very much still being built, with many questions unanswered.

One of the biggest is the issue of privacy and trust. Will people download the apps built through Apple and Google’s collaborations with public health agencies? Will people trust that the apps are accurate? Will they believe that their data will be protected? Are they going to be concerned that this surveillance system—after all, contact tracing is ultimately a form of surveillance—will come back to haunt them?

Elsewhere around the world, governments have already been building and using surveillance technology to fight the pandemic, including contact tracing apps. But they have often come with trade-offs.

China, an authoritarian regime with a long history of maximalist surveillance, required citizens to use an app that dictated whether they would be quarantined or allowed to move freely; the app's data is shared with police. In South Korea, a democracy that was the scene of early outbreaks, a pandemic surveillance system allowed the government to access smartphone locations, credit card histories, immigration records, and CCTV footage from around the country. Taiwan has built "electronic fences" that track location to make sure people stay in place during quarantine.

These systems may work, but they are also profoundly invasive.

“These systems also can’t be effective if people don’t trust them,” the ACLU’s Jennifer Granick says. “People will only trust these systems if they protect privacy, remain voluntary, and store data on an individual’s device, not a centralized repository.”

Built-in benefits

The Apple-Google system has some advantages over these other approaches. Because the companies control the operating systems and the phones people own, they can actually build a more private and more usable coronavirus tracing technology.

For example, Singapore's TraceTogether app is technically similar to the Apple-Google version, using Bluetooth to log close contacts so that people diagnosed with coronavirus can be traced. But because it is a third-party app, it has enormous disadvantages: iPhone users of TraceTogether, for instance, must keep their phones unlocked at all times as they move about in order for the system to work. That is something Apple can easily build around and then, with Google, scale to make accessible to virtually every government on earth.

But that scale is part of the problem. If the benefit is that the two companies can use the data of 3 billion people, the drawback is that they can use the data of 3 billion people. Building contact tracing and surveillance without being creepy is not easy.

The answer to this is that the Apple-Google system will not be a monolith: its implementation will vary from country to country. 

The companies say they are helping government public health agencies in North America, Europe, and Asia build their own apps that use the same underlying technology. Those governments will have their own rules, but the apps will require explicit user consent to start tracking, and the user can always turn tracking off, either permanently or temporarily. And the companies are in part responsible for the outcomes, potential abuses as well as medical successes.


