As the COVID-19 pandemic continues, the world has turned to the tech industry to help mitigate the spread of the virus and, eventually, help transition out of lockdown. Earlier this month, Apple and Google announced that they are working together to build contact-tracing technology that will automatically notify users if they have been in proximity to someone who has tested positive for COVID-19. However, reports show that there is a severe lack of evidence that these technologies can accurately report infection data. There is also the question of whether apps like these can effectively assist the marginalized populations where the disease seems to have the largest impact. Combined with the invasion of privacy involved, the U.S. needs to interrogate more seriously whether the potential rewards of app-based contact tracing outweigh the obvious, and potentially long-term, risks.
First among the concerns is the potential for the information collected to be used to identify and target individuals. For example, in South Korea, some have used the information collected through digital contact tracing to dox and harass infected individuals online. Some experts fear that the collected data could also be used as a surveillance system to restrict people’s movement through monitored quarantine, “effectively subjecting them to home confinement without trial, appeal or any semblance of due process.” Such tactics have already been used in Israel.
Apple and Google have taken some steps to mitigate the concerns over privacy, claiming they are developing their contact tracing tools with user privacy in mind. According to Apple, the tool will be opt-in, meaning contact tracing is turned off by default on all phones. They have also enhanced their encryption technology to ensure that any information collected by the tool cannot be used to identify users, and promise to dismantle the entire system once the crisis is over.
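The privacy claim rests on broadcasting short-lived, unlinkable identifiers rather than raw identities or locations. The sketch below is a simplified illustration of that general design, not the actual Apple/Google protocol: the key sizes, derivation function, and function names here are assumptions for clarity.

```python
import hashlib
import hmac
import os

def daily_key() -> bytes:
    """A random per-day key that never leaves the phone unless the
    user tests positive and consents to upload it."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the day key.
    Nearby phones observe only these unlinkable pseudonyms, never
    the key itself or any personal data."""
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def exposure_check(observed_ids: set, uploaded_day_keys: list, intervals) -> bool:
    """After a positive user uploads their day keys, every phone
    re-derives that user's identifiers locally and checks for overlap,
    so the match happens on-device, not on a central server."""
    derived = {rolling_id(k, i) for k in uploaded_day_keys for i in intervals}
    return bool(derived & observed_ids)
```

Because matching is done locally against re-derived identifiers, the server never learns who was near whom; it only relays the day keys of users who consented to report a diagnosis.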
Apple and Google are not using the phrase “contact tracing” for their tool, instead branding it as “exposure notification.” However, changing the name to sound less invasive does nothing to ensure privacy. And despite the steps Apple and Google are taking to make their tool more private, there are still serious short- and long-term privacy risks involved.
In a letter sent to Apple and Google, Senator Josh Hawley warns that the impact this technology could have on privacy “raises serious concern.” Despite the steps the companies have taken to anonymize the data, Senator Hawley points out that by comparing de-identified data with other data sets, individuals can be re-identified with ease. This could potentially create an “extraordinarily precise mechanism for surveillance.”
Senator Hawley also questions Apple and Google’s commitment to delete the program after the crisis comes to an end. Many privacy experts have echoed these concerns, worrying what impact these expanded surveillance systems will have in the long term. There is plenty of precedent to suggest that relaxing privacy expectations now will change individual rights far into the future. The “temporary” surveillance program enacted after 9/11, for example, is still in effect today and was even renewed last month by the Senate.
Contact tracing is often heralded as a successful method to limit the spread of a virus. However, a review published by a UK-based research institute shows that there is simply not enough evidence to be confident in the effectiveness of using technology to conduct contact tracing. The report highlights the technical limitations involved in accurately detecting contact and distance. Because of these limitations, this technology might lead to a high number of false positives and negatives. What’s more, app-based contact tracing is inherently vulnerable to fraud and cyberattack. The report specifically worries about the potential for “people using multiple devices, false reports of infection, [and] denial of service attacks by adversarial actors.”
Technical limitations aside, the effectiveness of digital contact tracing also requires both a high compliance rate and a high level of public trust and confidence in the technology. Nothing suggests Apple and Google can guarantee either. The lack of evidence for the effectiveness of digital contact tracing calls into question the use of such technology at the cost of serious privacy risks to individuals.
If we want to engage technology appropriately, we should determine the scope of the problem with an eye toward assisting the most vulnerable populations first, while ensuring that the intended outcomes can be obtained in a privacy-preserving manner. Governments need to lay out strict plans for oversight and regulation, coupled with independent review. Before compromising individual rights and privacy, the U.S. needs to thoroughly assess the effectiveness of this technology while implementing strict and enforceable safeguards to limit the scope and length of the program. Absent that, any further intrusion into our lives, especially if the technology is not effective, will be irreversible. In this case, the cure may well be worse than the disease.
There’s been a lot of talk about privacy lately, whether it’s about how social media is tracking and selling your every move online, or video-conferencing privacy breaches, or regulations such as GDPR or CCPA. And now, with COVID-19, there are numerous conversations about the balance between effective mitigation through contact tracing and privacy rights (e.g., is it OK for the government to know my health status and track me if I’m positive?).
For Companies — Privacy Builds Trust and Trust Builds Value.
Conversations about privacy are healthy and important. And as a business, those conversations should start early in your strategic planning. If you do it right, you can build brand value. If you do it wrong, or only address privacy when pressed by your clients or the press, you have an uphill battle. Just ask Zoom.
Privacy by Design creates the framework for building a brand based on respect
The best thing, therefore, is to get ahead of the curve and institute a concept called Privacy by Design in your systems and operations planning. Privacy by Design is a set of foundational principles originally developed by Ann Cavoukian, the former Privacy Commissioner of Ontario, and has subsequently been incorporated into the E.U.’s privacy regulation, the GDPR.
Privacy as the Default is key
A full review of the Privacy by Design principles is beyond the scope of this blog; they can be reviewed here. One principle I would like to highlight is Privacy as the Default. As the name implies, this principle states that all aspects of the system and operational workflows should assume privacy first. For every piece of personal or sensitive information, we first ask why we need it at all. Is it actually crucial to the client’s use of our product or our ability to serve the client?
If we decide we need the data, we should then seek to limit how much and for how long we need to keep the data. And we should be transparent with our clients as to why and how their data will be used and disposed of and to whom and under what conditions it may be shared.
Differentiation in a digital age is harder than ever. Fortunately, you can demonstrate that you respect your clients and improve your brand value by being proactive with regard to privacy.
Much has been written about the security and privacy issues with the Zoom videoconferencing application. What may be written more about over the next few months (and in numerous case studies) is how Zoom is responding to those issues.
With a user base which has more than doubled since the beginning of the year, Zoom has benefited greatly from the WFH global environment. It is incredible that it has been able to sustain its operability during this growth. But it’s perhaps more impressive that the company, and its CEO in particular, is focusing seriously and aggressively on privacy. This is particularly notable in an era that is unfortunately also fraught with profiteering, scamming and passing the buck.
Hopefully it is a wake-up call for any company to take its privacy issues seriously and to recognize that by doing so, you are not only securing public trust, you are creating brand value.
In 1982, Tylenol responded to its own crisis, when some of its products were tampered with, leading to poisonings, by pulling every bottle off the shelves and owning the issue. Its response has been a crisis-PR case study ever since.
I think Zoom is on its way to becoming a case study as well.
With stay-at-home orders in place across the globe, use of online video communication services has skyrocketed. In particular, the video platform Zoom is on a tear. The company’s shares are on the rise, and its mobile app is currently #1 in the Apple App Store. Families and friends use it to connect, and entire school systems rely on it to continue classes online. But with the increased use comes increased scrutiny.
According to the New York Times, the New York Attorney General is now looking into Zoom’s security practices. The letter, sent from the state’s Attorney General’s office, expresses concern “that Zoom’s existing security practices might not be sufficient to adapt to the recent and sudden surge in both the volume and sensitivity of data being passed through its network.” Zoom’s privacy issues have also been noted by Consumer Reports, Forbes, and Doc Searls.
Zoom for Telehealth and Legal Counsel
Worries about Zoom’s privacy standards are of particular concern for industries that require confidentiality, such as medicine, therapy, and legal counsel.
Telehealth services have quickly become commonplace as more and more people stay at home. The company does provide a HIPAA-compliant version of its services. However, the recent compliance waiver for telehealth allows health care providers to opt for the far cheaper but less secure version of the software. Now that many insurers are allowing health care providers to bill for telehealth visits, the floodgates have opened for patients to meet with doctors and therapists over Zoom.
Given the concerns over Zoom’s privacy practices, it is an open question whether doctor-patient confidentiality and attorney-client privilege can be properly guaranteed. For example, Zoom touts its use of end-to-end encryption, but recent reports show this is not entirely accurate. While Zoom does use end-to-end encryption in certain settings, video meetings use another form of encryption that does not restrict the company’s ability to access those communications.
And, of course, Zoom alone can’t stop the trolls from invading your Zoom meetings, especially those you’ve shared publicly. Even the FBI is warning about Zoom bombing. There are ways to limit that, though, by being prudent with your Zoom settings.
Zoom Responds, but is it Enough?
Despite these minor changes, it may not be enough to protect user privacy and guarantee confidentiality for industries that require it. Zoom is, indeed, booming, and it’s hard to see it receding dramatically in a post-COVID world. Let’s hope it takes all the reasonable steps it should to respect privacy along the ride.
The coronavirus has unquestionably changed the way we live. It has also forced us into strange and, until just a few weeks ago, unthinkable ethical dilemmas. Even whether to visit loved ones is now worth genuine ethical reflection. Modern nations, especially in the West, are built on an ethics of individual freedoms and the right to privacy. However, the current global health crisis is forcing us to rethink just how fundamental those ethics should be. While we already feel this with regard to freedom of movement, we are just beginning to contemplate how the coronavirus can and should affect our right to privacy.
Contact Tracing and Enforced Quarantine
In order to limit the spread of the coronavirus, experts emphasize the importance of tracking every contact infected patients have had with others. Countries such as China, Singapore, South Korea, and Taiwan have all taken aggressive measures to trace every potential contact infected people have had. These measures are widely considered a large reason why these countries have been successful in lowering the rate of transmission. However, the aggressive measures have come at the cost of individual privacy.
Taiwan and Singapore, for example, regularly post detailed information about everyone who tests positive, including where they live and work, what train stations they have used, and what bars and restaurants they frequent. South Korea now has an app that allows users to track the exact movements of those infected.
Countries are also using location data to enforce quarantine for those infected. Israel, for example, is now using data collection techniques previously used for counterterrorism efforts to identify anyone potentially exposed to the virus. The government uses this information to send text messages to those exposed ordering them to quarantine.
The European and U.S. Response
As the coronavirus spreads to Europe and the U.S., lawmakers are exploring the use of similar techniques. Italy now uses location data to monitor whether people are obeying quarantine orders. In the U.S., the White House is reportedly in conversations with tech companies to use anonymized location data to track the spread of the virus. HIPAA regulations are being waived to allow doctors and mental health providers to more freely use telecommunication to speak with patients. Companies in Italy, Austria, and Germany have also announced that they will provide location data to governments.
However, with privacy regulations such as the GDPR, it is unclear how aggressively European countries will be able to use personal information. The European Data Protection Board (EDPB) released a statement urging governments to continue to abide by privacy regulations in place. At the same time, however, the EDPB conceded that countries may suspend such regulations “when processing is necessary for reasons of substantial public interest in the area of public health.”
Relaxing the right to privacy has garnered mixed responses from government officials and security experts. Many have pointed out that while the measures taken are extreme, personal information such as location data is highly effective in limiting the spread of the coronavirus. “We are stretched very thin in most states,” said the director of the Center for Global Health at Oregon State University, “so this kind of technology can help every state to prioritize, given their limited resources, which communities, which areas, need more aggressive tracking and testing.”
Others are concerned how this could endanger those whose information is made public. In South Korea, some have used information released by the government to identify infected individuals and attack them online. This has led officials to question how the government uses this information, worrying it will discourage others from getting tested for fear of being publicly exposed.
While nearly all countries have explained that suspending the right to privacy is a temporary measure for the benefit of public health, many worry it will have a permanent effect on how governments and countries view privacy concerns. After 9/11, for example, the U.S. used highly invasive surveillance measures that have since become commonplace among law enforcement agencies. According to the New York Times, privacy experts worry something similar could happen after the current crisis.
What restrictions we, as a society, can tolerate, and what effect this will have after the current crisis, remains an open question. However, it may also involve a false choice. There are technologies that can both assist contact tracing and preserve anonymity. Privacy by Design does not have to be put on pause as we develop these tools. In fact, if we want to encourage wide adoption, it might be required.
One can argue about the steps taken so far in response to the coronavirus, but perhaps no report has had a greater impact on what the United States is now doing to curb the spread of the virus than the one published on March 16 by the UK’s Imperial College COVID-19 Response Team. In plain, stark language, the report warns of the dangers of doing nothing and emphasizes that if we want to minimize the mortality rate, “combining all four interventions (social distancing of the entire population, case isolation, household quarantine and school and university closure) is predicted to have the largest impact.”
Key to this are case isolation and household quarantine, both of which are containment measures. Containment requires, at a minimum, identification (you have to know who is symptomatic to make sure they are isolated, and you have to know whom the symptomatic were in contact with to make sure they are quarantined) and communication (you have to know whether you’ve been in contact with someone if you are to self-quarantine).
The technologies exist to help with both identification and communication, but at a potential cost to privacy. There’s the impact on privacy to the symptomatic individual, to those with whom they have been in contact, and even to locations (towns, neighborhoods, stores) through which the person traveled. These risks are not insubstantial. In the case of individuals, it could result in stigmatization, harassment, and even physical threats (if not harm); in the case of locations, it could result in severe economic losses and stigmatization as well. The key to leveraging technology for containment is to identify potential privacy risks and embed privacy practices into the technology to minimize them.
The MIT Media Lab is doing just this. Yesterday, they released an open-source application called Private Kit: Safe Paths, which uses your phone’s location data to trace where symptomatic individuals have been and shares that information with others so they can determine whether they may have been in contact with those individuals. And the app does it in a privacy-preserving way. It works like this: the app logs your phone’s location data but keeps it on your phone, so you retain possession of it. If you are diagnosed, you can choose to consent to sharing your location data with health officials, who can make it public. Ultimately, the app will share symptomatic location data with others without the middleman of a health authority, so anyone can see whether they have been in recent contact with someone who has been symptomatic. It’s a powerful tool that has the potential to make a material impact on containment efforts.
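The workflow described above can be sketched in a few lines. This is a hypothetical illustration of the consent-gated, on-device design, not Private Kit’s actual code; the class, method names, and matching thresholds are my own assumptions.

```python
import json
import time

class LocationLog:
    """Hypothetical sketch of a Safe Paths-style local log: points stay
    on the device until the user explicitly consents to share them."""

    def __init__(self):
        self._points = []  # held in device storage only

    def record(self, lat, lon, ts=None):
        self._points.append({"lat": lat, "lon": lon, "ts": ts or time.time()})

    def export_with_consent(self, consented):
        """Nothing leaves the phone unless the user opts in after diagnosis."""
        if not consented:
            return None
        return json.dumps(self._points)

def crossed_paths(my_points, shared_points, radius_deg=0.001, window_s=1800):
    """Check my private trail against a published symptomatic trail,
    locally on-device: my own data is never uploaded to do the match."""
    for a in my_points:
        for b in shared_points:
            if (abs(a["lat"] - b["lat"]) <= radius_deg
                    and abs(a["lon"] - b["lon"]) <= radius_deg
                    and abs(a["ts"] - b["ts"]) <= window_s):
                return True
    return False
```

The design choice worth noting is the asymmetry: only diagnosed users who consent ever publish a trail, while everyone else’s log is used purely as local input to the comparison.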
Of particular interest is the whitepaper MIT developed on this application, which outlines the various privacy risks pertaining to containment and how Private Kit addresses them. The report provides an instructive lesson for any organization conducting privacy risk assessments, evaluating privacy controls relative to GDPR or CCPA regulations, or seeking to better serve the needs of its constituents.
When confronted with the enormity of something like the coronavirus, it’s both critical and refreshing to know that we don’t have to throw out our rights to deal with it. After all, in battling something like this virus, we are not only defending ourselves, we are preserving the very freedoms that define who we are.