As the COVID-19 pandemic continues, the world has turned to the tech industry to help mitigate the spread of the virus and, eventually, help transition out of lockdown. Earlier this month, Apple and Google announced that they are working together to build contact-tracing technology that will automatically notify users if they have been in proximity to someone who has tested positive for COVID-19. However, reports show a severe lack of evidence that these technologies can accurately capture exposure and infection data. It is also unclear whether apps of this kind can effectively assist the marginalized populations on whom the disease has had the largest impact. Combined with the invasion of privacy involved, the U.S. needs to interrogate more seriously whether the potential rewards of app-based contact tracing outweigh the obvious, and potentially long-term, risks.

First among the concerns is the potential for the information collected to be used to identify and target individuals. For example, in South Korea, some have used the information collected through digital contact tracing to dox and harass infected individuals online. Some experts fear that the collected data could also be used as a surveillance system to restrict people’s movement through monitored quarantine, “effectively subjecting them to home confinement without trial, appeal or any semblance of due process.” Such tactics have already been used in Israel.

Apple and Google have taken some steps to mitigate the concerns over privacy, claiming they are developing their contact tracing tools with user privacy in mind. According to Apple, the tool will be opt-in, meaning contact tracing is turned off by default on all phones. They have also enhanced their encryption technology to ensure that any information collected by the tool cannot be used to identify users, and promise to dismantle the entire system once the crisis is over.
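To make the anonymization claim concrete, here is a minimal, hypothetical sketch of how a decentralized exposure-notification scheme can work: a phone broadcasts short-lived identifiers derived from a random daily key, and matching happens entirely on the device. The function names, key sizes, and interval lengths below are illustrative assumptions, not Apple and Google's actual specification.

```python
import hmac
import hashlib
import os

# Hypothetical sketch of a decentralized exposure-notification scheme.
# Names and parameters are illustrative, not the actual Apple/Google protocol.

INTERVALS_PER_DAY = 144  # e.g., a new rolling ID roughly every 10 minutes

def new_daily_key() -> bytes:
    """Each phone generates a fresh random key every day, kept on the device."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    """Derive a short-lived identifier from the daily key; it looks random
    to anyone who does not hold the key."""
    return hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def ids_for_day(daily_key: bytes):
    return {rolling_id(daily_key, i) for i in range(INTERVALS_PER_DAY)}

# Phone A broadcasts its rolling IDs over Bluetooth; Phone B records what it hears.
key_a = new_daily_key()
heard_by_b = {rolling_id(key_a, i) for i in (12, 13, 14)}  # three nearby intervals

# If A later tests positive, only A's daily key is published. B re-derives the
# IDs locally and checks for overlap; no location or identity ever leaves B's phone.
published_keys = [key_a]
exposed = any(ids_for_day(k) & heard_by_b for k in published_keys)
print("Possible exposure detected:", exposed)
```

The design choice matters for the privacy debate that follows: matching on the device limits what the companies collect, but it does not by itself prevent the broader risks discussed below.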

Risk

Apple and Google are not using the phrase “contact tracing” for their tool, instead branding it as “exposure notification.” However, changing the name to sound less invasive doesn’t do anything to ensure privacy. And despite the steps Apple and Google are taking to make their tool more private, there are still serious short- and long-term privacy risks involved.

In a letter sent to Apple and Google, Senator Josh Hawley warns that the impact this technology could have on privacy “raises serious concern.” Despite the steps the companies have taken to anonymize the data, Senator Hawley points out that by comparing de-identified data with other data sets, individuals can be re-identified with ease. This could potentially create an “extraordinarily precise mechanism for surveillance.”
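A toy illustration of the linkage attack Senator Hawley describes: if “anonymized” records share even a few quasi-identifiers with an outside dataset, a pseudonym can often be resolved to a single person. Every record below is invented purely for illustration; the point is the join, not the data.

```python
# Toy linkage attack: re-identifying "anonymized" records by joining them
# with an auxiliary dataset on shared quasi-identifiers.
# All records below are invented for illustration only.

anonymized = [
    {"pseudonym": "u_7f3a", "zip": "98101", "birth_year": 1987},
    {"pseudonym": "u_c2d9", "zip": "98101", "birth_year": 1954},
]

# An outside dataset (voter rolls, marketing data, a breach dump) that
# happens to list the same quasi-identifiers next to real names.
auxiliary = [
    {"name": "Alice Example", "zip": "98101", "birth_year": 1987},
    {"name": "Bob Example",   "zip": "98101", "birth_year": 1954},
]

def reidentify(anon_rows, aux_rows):
    """Match each pseudonymous record to auxiliary rows that agree on
    every quasi-identifier; a unique match de-anonymizes the record."""
    matches = {}
    for anon in anon_rows:
        candidates = [
            aux["name"] for aux in aux_rows
            if aux["zip"] == anon["zip"] and aux["birth_year"] == anon["birth_year"]
        ]
        if len(candidates) == 1:
            matches[anon["pseudonym"]] = candidates[0]
    return matches

print(reidentify(anonymized, auxiliary))
# {'u_7f3a': 'Alice Example', 'u_c2d9': 'Bob Example'}
```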

Senator Hawley also questions Apple and Google’s commitment to delete the program after the crisis comes to an end. Many privacy experts have echoed these concerns, worrying what impact these expanded surveillance systems will have in the long term. There is plenty of precedent to suggest that relaxing privacy expectations now will erode individual rights far into the future. The “temporary” surveillance program enacted after 9/11, for example, is still in effect today and was even renewed last month by the Senate.

Reward?

Contact tracing is often heralded as a successful method to limit the spread of a virus. However, a review published by a UK-based research institute shows that there is simply not enough evidence to be confident in the effectiveness of using technology to conduct contact tracing. The report highlights the technical limitations involved in accurately detecting contact and distance. Because of these limitations, this technology might lead to a high number of false positives and negatives. What’s more, app-based contact tracing is inherently vulnerable to fraud and cyberattack. The report specifically worries about the potential for “people using multiple devices, false reports of infection, [and] denial of service attacks by adversarial actors.”
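The detection problem the review describes is easy to see in a back-of-the-envelope sketch. Bluetooth-based apps typically infer distance from received signal strength (RSSI) using a log-distance path-loss model, and a few decibels of noise from a wall, a pocket, or body orientation is enough to push a reading across a two-meter threshold. The constants below are assumptions chosen for illustration, not values from any specific app.

```python
# Illustrative sketch: estimating distance from Bluetooth signal strength (RSSI)
# with a log-distance path-loss model. Constants are assumptions for illustration.

MEASURED_POWER = -59.0    # assumed RSSI (dBm) at 1 meter
PATH_LOSS_EXPONENT = 2.0  # assumed free-space propagation

def estimate_distance(rssi_dbm: float) -> float:
    """Invert the path-loss model: rssi = measured_power - 10 * n * log10(d)."""
    return 10 ** ((MEASURED_POWER - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

rssi_at_2m = -65.0  # roughly what this model predicts at about 2 meters

# A few dB of attenuation (phone in a pocket, a wall in between, a body blocking
# the antenna) shifts the estimate well past a 2-meter "close contact" threshold.
for noise_db in (0, -3, -6, 3):
    rssi = rssi_at_2m + noise_db
    print(f"RSSI {rssi:6.1f} dBm -> estimated distance {estimate_distance(rssi):4.1f} m")
```

Running this, the same true separation of about two meters is reported as anywhere from roughly 1.4 to 4 meters, which is exactly the kind of error that produces both false positives and false negatives.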

Technical limitations aside, the effectiveness of digital contact tracing also requires both a high compliance rate and a high level of public trust and confidence in the technology. Nothing suggests Apple and Google can guarantee either of these requirements. The lack of evidence for the effectiveness of digital contact tracing calls into question the use of such technology at the cost of serious privacy risks to individuals.

If we want to engage technology appropriately, we should determine the scope of the problem with an eye toward assisting the most vulnerable populations first, while ensuring that the intended outcomes can be obtained in a privacy-preserving manner. Governments need to lay out strict plans for oversight and regulation, coupled with independent review. Before compromising individual rights and privacy, the U.S. needs to thoroughly assess the effectiveness of this technology while implementing strict and enforceable safeguards to limit the scope and length of the program. Absent that, any further intrusion into our lives, especially if the technology proves ineffective, will be irreversible. In this case, the cure may well be worse than the disease.