
Op-Ed: Coronavirus tracing apps are coming. Here’s how they could reshape surveillance as we know it

An illustration of a warning on a cellphone. (Nicole Vas / Los Angeles Times)

Last week, the world got a preview of how Google and Apple’s contact tracing project might look and function. Some privacy and security experts have expressed cautious optimism that the effort could be a useful tool to aid public health contact tracers while protecting privacy.

The project modifies the iOS and Android systems to allow government health agencies to build apps that use a mobile phone’s Bluetooth communication capabilities. These apps would make it possible for a person who tests positive for the coronavirus to send out an “exposure” notification to the phones of other app users to alert them that their phones had been in the vicinity of the infected person’s phone during a given period. People getting this information could decide to self-isolate or get tested. The app would not reveal anyone’s identity.
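To make that flow concrete, here is a minimal sketch of the decentralized approach described above, under simplifying assumptions: phones broadcast short-lived identifiers derived from a secret daily key, record the identifiers they hear, and check for matches locally against keys that infected users choose to publish. The function names and the HMAC-based derivation are illustrative, not the companies’ actual specification.

```python
# Simplified sketch of a decentralized exposure-notification flow, in the
# spirit of the Apple/Google design. Names and key derivation are
# illustrative, not the actual protocol.
import hashlib
import hmac
import os

ROLLING_PERIOD = 10 * 60  # rotate the broadcast ID every ~10 minutes


def daily_key() -> bytes:
    """Random per-day key that never leaves the phone unless the user
    chooses to report a positive test."""
    return os.urandom(16)


def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Short-lived identifier derived from the day key; this is what the
    phone broadcasts over Bluetooth. Observers cannot link it back to a
    person or to the day key."""
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]


# Phone B stores every identifier it hears, with a coarse time interval,
# locally on the device. No identifier is sent to any server.
observed: list[tuple[int, bytes]] = []  # (interval, rolling_id) pairs


def on_ble_advertisement(interval: int, rid: bytes) -> None:
    observed.append((interval, rid))


def check_exposure(reported_day_keys: list[bytes]) -> bool:
    """If an infected user consents, their day keys are published. Other
    phones re-derive the rolling IDs locally and look for a match."""
    for day_key in reported_day_keys:
        for interval, rid in observed:
            if hmac.compare_digest(rolling_id(day_key, interval), rid):
                return True
    return False
```

Because the matching happens on each phone, a server in this design never learns who was near whom; it only relays the day keys of users who opt in to reporting.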

To protect privacy, the system uses only Bluetooth, collects no location data, hides users’ identities, and requires permission both to collect proximity data and to upload data from the phones of people who test positive for COVID-19; all data stays on a user’s phone unless the user decides to notify others. Additionally, the companies will require users to enter a unique code provided by health authorities to declare themselves as infected.
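The verification-code step could look something like the following sketch. The endpoints and payloads here are hypothetical stand-ins, not a published API; the point is that nothing leaves the phone until the user supplies a one-time code issued by a health authority and explicitly consents.

```python
# Hypothetical sketch of a consent-and-verification upload. The URLs,
# payload fields, and token exchange are assumptions for illustration.
import json
import urllib.request

VERIFY_URL = "https://health-authority.example/verify"    # hypothetical
UPLOAD_URL = "https://key-server.example/diagnosis-keys"  # hypothetical


def report_positive(day_keys: list[bytes], verification_code: str) -> bool:
    """Upload the user's day keys only after the health authority's
    one-time code checks out. No identity or location is included."""
    # Step 1: exchange the code from the health authority for a token.
    req = urllib.request.Request(
        VERIFY_URL,
        data=json.dumps({"code": verification_code}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        token = json.load(resp).get("token")
    if not token:
        return False  # invalid code: nothing is uploaded

    # Step 2: publish the day keys so other phones can check for matches.
    payload = {"token": token, "keys": [k.hex() for k in day_keys]}
    req = urllib.request.Request(
        UPLOAD_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status == 200
```

Gating the upload on a code from health authorities is what keeps trolls from flooding the system with false “I’m infected” reports.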


But even the most privacy-protective contact tracing apps have weak points. As many have pointed out, anonymous cellphone-based tracing can never substitute for the detailed work that trained human contact tracers do. Even putting critical questions of effectiveness aside, there are at least three concerns to keep in mind about relying on technology to mitigate the COVID-19 crisis.

First, there are only so many things tech companies can control. Google and Apple are promising to serve as staunch gatekeepers of the system they are creating. They plan to allow only government health authorities to create the apps that can use the tracing capabilities. To protect civil liberties, the companies say they will not allow government agencies to mandate use of the app (presumably, by denying them system access). But, of course, that doesn’t prevent others like employers and schools, who aren’t bound by the companies’ terms of use for app developers, from requiring app participation as a condition of employment or entrance.

It’s also unclear how well Apple and Google will be able to police the app operators to ensure that the apps comply with the rules. How can policymakers help guarantee system-wide fidelity when it’s so easy for things to fall through the cracks?


Second, governments will want these tools for their own purposes. Google and Apple are creating a playbook for governments on how our phones can be repurposed for all kinds of surveillance. Apple and Google have been adamant about their intentions to restrict this system only to help mitigate the COVID-19 pandemic, and I believe them. But even large and powerful companies are subject to political pressure.

France is already asking Apple and Google to make changes in their system that some say would weaken privacy protections. The values and incentives of the tech industry and government will not always be aligned.


Will Apple and Google and every other software developer be able to keep resisting governments’ attempts to change the design of these tools? Apple successfully beat back the FBI’s request for a modified iOS that would have allowed the bureau to bypass encryption protections, but can we always count on such resistance? Apple reportedly dropped its plan to allow users to encrypt their backups in the cloud after the FBI complained. This dam will not hold indefinitely. Whether safeguards can help keep government interventions aligned with human values like privacy should be part of this discussion.


Finally, this technology, once deployed, will not be “rolled back.” We are repeatedly told that contact tracing apps and COVID-19-related surveillance are temporary measures for use until the pandemic passes. That’s likely to be a fantasy.

Surveillance inertia is remarkably difficult to resist. Norms get set, and practices and tools become entrenched. And who can say when this will wind down? We’re still dealing with the supposedly temporary surveillance authorized almost 20 years ago in the wake of 9/11. Rollbacks are rare and unlikely because the tools we build today create a path dependency that will shape our future data and surveillance practices.

There are significant opportunity and switching costs to such a heavy investment in these contact tracing apps. What if a tech-first approach ends up less effective than hoped? Will industry and government have the resolve and humility to double back and try a different approach?

Silicon Valley tries to make all tasks easier. Tech companies see the costs associated with searching, sharing and sorting as things to be eliminated. But in the wake of countless privacy lapses on social platforms and an unending wave of data breaches, it’s clear that making tasks easier, even important ones, can cause great collateral harm.

Good privacy engineering is one piece of the puzzle for contact tracing apps. Perhaps even more difficult is weighing the long-term consequences of how these tools will be used after the pandemic ends.

Woodrow Hartzog is a professor of law and computer science at Northeastern University.

