
Take my word for it: Privacy and COVID alert apps can coexist


Since the COVID-19 pandemic began, technologists across the country have rushed to develop apps for contact tracing and exposure notifications. New York, New Jersey, Pennsylvania, and Delaware have all recently announced the launch of such apps, and those announcements generated excitement. But the advent of these tools has also raised questions. Chief among them: Do these apps protect privacy?

The short answer is that it really depends on the app — but these four were all clearly designed to be privacy-protective. Each of these states worked with an Irish company to build its app using technology developed by Google and Apple earlier this year. The exposure notification system designed by those two companies was built to prioritize privacy while allowing state health authorities to embed it into their apps.

The details can get dense, but the basics are straightforward. The exposure notification system assigns each device it runs on a random ID that changes every 10 to 20 minutes, ensuring the ID cannot be used to track the user’s identity or location. Android and iOS devices with the app use Bluetooth to exchange these random IDs with other devices that stay near them for at least 15 minutes. When someone tests positive for COVID-19, a public health representative gives them a code to enter into the app, which then anonymously uploads their random IDs to a list of IDs associated with people who have tested positive. All the devices running these alert apps regularly download this list and check whether they have seen any of the IDs on it recently. If a device recognizes one, it alerts its user that they may have been exposed and explains what to do next. In this way, users receive a COVID exposure notification if they’ve been near someone who tested positive — without the system ever tracking the location or identity of any of its users.
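For readers who want to see the idea in miniature, here is a rough sketch in Python of the matching logic described above. This is an illustration under simplifying assumptions, not the actual Google/Apple implementation (which derives its rotating IDs from cryptographic keys and measures proximity over Bluetooth signal strength). The names Device, observe, and check_exposure are my own, and the simple minute counter stands in for the real system’s duration and distance estimates.

import secrets

EXPOSURE_THRESHOLD_MINUTES = 15  # nearby contact this long counts as an exposure

def new_random_id() -> str:
    # A fresh random identifier with no link to the user's identity or location.
    return secrets.token_hex(16)

class Device:
    def __init__(self) -> None:
        self.my_ids = [new_random_id()]  # IDs this device has broadcast
        self.seen: dict[str, int] = {}   # nearby ID -> minutes observed

    def rotate_id(self) -> None:
        # Called every 10 to 20 minutes so no single ID can track the device.
        self.my_ids.append(new_random_id())

    def observe(self, nearby_id: str, minutes: int) -> None:
        # Record a random ID heard over Bluetooth and how long it stayed nearby.
        self.seen[nearby_id] = self.seen.get(nearby_id, 0) + minutes

    def ids_to_upload(self) -> list[str]:
        # After a public health code verifies a positive test, only these
        # random IDs are uploaded -- never a name, number, or location.
        return list(self.my_ids)

    def check_exposure(self, positive_ids: list[str]) -> bool:
        # Compare the downloaded positive-ID list against local sightings.
        return any(self.seen.get(pid, 0) >= EXPOSURE_THRESHOLD_MINUTES
                   for pid in positive_ids)

# Example: Alice's and Bob's phones sit near each other for 20 minutes.
alice, bob = Device(), Device()
bob.observe(alice.my_ids[-1], minutes=20)

# Alice tests positive and uploads her IDs; Bob's phone downloads the list.
positive_list = alice.ids_to_upload()
print(bob.check_exposure(positive_list))  # True: Bob is alerted, anonymously

The essential privacy property is visible even in this toy version: the only thing ever shared is a list of meaningless random strings, and all of the matching happens on the user’s own phone.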

Google and Apple added several other features to protect user privacy. Users must choose to turn on the exposure notification technology, and they can turn it off at any time. Users of the app are not identified to each other, to Apple, or to Google. Nor are they identified to their public health authority unless they choose to identify themselves. Google and Apple have further said that they require their government partners’ apps to comply with privacy and security requirements, and that they will shut the system down region by region as it is no longer needed.

To date, exposure notification apps have struggled to reach critical mass. In May it was reported that contact-tracing apps had been adopted by only 38 percent of people in Iceland and 20 percent in Singapore, yet researchers estimate that adoption rates must reach at least 60 percent for these apps to be effective. Given our uniquely American sense of individualism and skepticism toward government, health officials and their tech partners will need to work doubly hard to get adoption rates up. Including privacy protections like those in these Northeastern states’ apps is a step in the right direction.

Culture isn’t the only variable. It’s also likely that certain segments of the population will be hesitant to download these apps because of their experiences with — and concerns about — government and law enforcement. As I’ve noted elsewhere, people of color may well fear using apps that don’t seem secure, even though they have been disproportionately hit by COVID-19. Undocumented workers and immigrants also have understandable worries, and many rural Americans who feel disconnected from the virus may deem it unnecessary to invite the government into their phones. To these people I say: From what I’ve seen, you have nothing to fear from the COVID Alert apps in New York, New Jersey, Pennsylvania, and Delaware. Public officials in these states must nonetheless explain how this technology works and earn the trust of their citizens.

There is a tradeoff between privacy and effective public health policy.

These new apps prioritize privacy, but as a result they cannot provide detailed information that might assist public health authorities or answer the questions raised by people who receive exposure notifications.

I’ve installed the COVID Alert PA app on my phone, and I know that if I received a notification, I would want some information about where I was when I was exposed so that I could gauge the extent of the risk. Was I indoors or out? Was I or the person who tested positive wearing a mask? How far apart were we? Neither the apps nor the public health authorities will have this information — because building an app that could answer such questions would require sacrificing privacy.

There is no getting around that dilemma, and it’s something governments and citizens alike have to grapple with. But ensuring that people actually download an app requires that they feel secure, and the apps in question wisely made that a priority.

The COVID Alert apps could best be described as experiments, because ultimately we don’t know whether they’ll work. They are relatively inexpensive and low-risk, but they require large numbers of people to adopt them before they have any chance of being effective.

I genuinely hope people will see that their privacy is safe with these apps, and that they will go ahead and install them on their phones. If we can combine mask-wearing with widespread testing and the adoption of these apps, we will do much better at reducing the spread of the novel coronavirus.

Lorrie Cranor is the director and Bosch Distinguished Professor in Security and Privacy Technologies of CyLab, and the FORE Systems Professor of Computer Science and of Engineering and Public Policy, at Carnegie Mellon University.