What's At Stake
The COVID-19 pandemic is a global health emergency of unprecedented scale. In responding to the ongoing crisis, governments and businesses have used a wide range of digital tools and techniques to limit the spread of the virus. Lawmakers and public health officials continue to debate the risks and benefits of digital contact tracing, while individuals are encouraged or required to practice social distancing by working, connecting, and learning online. As these systems are developed and increasingly relied upon, it is vital to preserve the privacy rights of all individuals. EPIC is closely monitoring the privacy issues posed by the COVID-19 pandemic, from educational institutions implementing online learning platforms, to employers adopting privacy-invasive monitoring of employees, to an increased reliance on telemedicine.
Federal and state governments must not use the pandemic to justify expanded systems of data collection, location tracking, or other law enforcement techniques that undermine democratic values. The tools used to monitor the spread of the coronavirus or to collect personal health data must have strict privacy safeguards. Privacy and public health are complementary goals, and Privacy Enhancing Technologies can be deployed to serve the public interest while also protecting individual rights.
EPIC is working to ensure that private and public sector responses to COVID-19 safeguard the privacy and civil liberties of all people. Through advocacy, oversight, and litigation, EPIC is ensuring that the coronavirus pandemic does not lead to erosion of individual rights.
Government & Private Sector Data Collection
Federal, state, and local governments are expanding data collection in response to the coronavirus pandemic, often in conjunction with private sector companies. It is essential that government agencies and private companies implement standards to safeguard privacy as they collect personal data, deploy data analytics, and implement digital contact tracing systems for public health purposes. The onus is on the governments, companies, and other entities deploying digital contact tracing tools and collecting personal data to ensure that these systems are necessary, effective, lawful, and protective of privacy.
- Government agencies and companies must be transparent about the collection and use of personal and aggregate data—if collection is necessary at all—and observe legal and ethical limits on how technology may be used to address the pandemic.
- Government agencies must not use the pandemic to justify the unnecessary collection of large volumes of data about people's movements and locations.
- Government agencies must not permanently expand government surveillance or undermine democratic values in response to the pandemic.
- Government agencies and companies should use privacy enhancing technologies to limit the collection of personally identifiable information.
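One privacy-enhancing approach captured in the points above is data minimization in digital contact tracing: decentralized protocols broadcast short-lived random identifiers rather than collecting names or locations, so no central server learns who met whom. The following is a minimal illustrative sketch of the rotating-identifier idea, not a description of any specific deployed protocol; the key size, rotation interval, and function names are assumptions for illustration only.

```python
import hmac
import hashlib
import secrets

def daily_key() -> bytes:
    """A fresh random key generated on-device each day. It never leaves
    the phone unless the user tests positive and consents to upload."""
    return secrets.token_bytes(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a short-lived broadcast identifier from the daily key.
    Identifiers rotate each interval (e.g. every 10 minutes), so a
    passive observer cannot link successive broadcasts to one person."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Alice's phone broadcasts rolling IDs over Bluetooth; Bob's phone
# records the IDs it hears nearby (here, intervals 0-2).
alice_key = daily_key()
heard_by_bob = {rolling_id(alice_key, i) for i in range(3)}

# If Alice later tests positive, only her daily keys are published.
# Bob's phone re-derives her rolling IDs locally and checks for a
# match -- the matching happens entirely on-device.
exposed = any(rolling_id(alice_key, i) in heard_by_bob for i in range(144))
```

The design choice to compare identifiers on-device, rather than uploading contact logs, is what limits the collection of personally identifiable information to the minimum needed for exposure notification.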
Workplace Privacy
The coronavirus pandemic has changed the way employers monitor the workplace and may put employees’ privacy at risk.
In workplaces such as factories, warehouses, essential establishments, and home offices, employers are implementing surveillance technologies to track worker productivity and efficiency. Employers are increasingly adopting machine learning technology to track the movements of their employees, whether onsite in warehouses or remotely through mobile applications. Companies are also increasingly using facial recognition technology paired with infrared cameras to identify anyone with an elevated temperature.
For individuals working from home, employers are implementing technology that can monitor remote activities such as computer usage data, keystroke logs, and web activity. Some of these workplace monitoring technologies rank a worker’s productivity based on their computer activity. The demand for workplace monitoring software and surveillance cameras has increased since the pandemic began, and employers are tracking more personal information than ever. As the pandemic eases and the workforce returns to shared workplaces, employers may continue to use these surveillance tools.
- Employers should discontinue the use of unnecessary surveillance equipment.
- Where surveillance technology is implemented, employers must be transparent with employees about the reason for the surveillance and articulate, among other things, what data is collected, how the data will be used, who the data will be disclosed to, and how long the data will be retained.
- Employees should also be involved in this decision-making process, and employers should respect their employees’ reasonable expectations of privacy.
Online Learning & Student Privacy
As students across the country shift to remote learning during the coronavirus pandemic, new technology is being deployed in their homes—often without meaningful accountability. The shift towards remote education and the increased reliance on online learning tools like video conferencing platforms and test-proctoring software raise privacy and security risks.
Online education platforms—if not properly regulated—may collect personally identifiable information and sell student data. Zoom, a popular video conferencing platform for students, was revealed to have security vulnerabilities that allowed strangers to join video calls and to harass students. Zoom has implemented security settings to fix this issue, but other online educational tools may have similar vulnerabilities. EPIC petitioned the FTC in February 2020 to conduct a rulemaking about the use of AI in commerce, which would apply to AI-based educational tools in both physical and virtual classrooms.
- Digital education tools must comply with the Family Educational Rights and Privacy Act (FERPA), the Children’s Online Privacy Protection Act (COPPA), and relevant state laws.
Video Conferencing
As public health officials encourage social distancing to slow the spread of the coronavirus, more people are turning to video conferencing platforms for work, family, and personal communications. Zoom has quickly become one of the most popular video conference platforms to maintain social contact, but significant privacy and security issues have come to light in the past few years.
In a detailed complaint to the Federal Trade Commission, EPIC warned that Zoom had “exposed users to remote surveillance, unwanted videocalls, and denial-of-service attacks.” The Federal Trade Commission, however, has failed to take action against Zoom. Platforms like Zoom—now more than ever—need to have strong privacy and security safeguards.
- The FTC should investigate Zoom's privacy practices, as called for in EPIC's complaint.
Health Data & Telemedicine
Although the use of personal health data can enable governments to better respond to an emergency like the coronavirus pandemic, governments must take steps to safeguard individual privacy in the process.
Many governments and healthcare professionals are relying on web portals and telehealth systems during the pandemic, but not all of these entities and platforms are covered under the Health Insurance Portability and Accountability Act (HIPAA).
The Department of Health and Human Services recently weakened privacy protections for telemedicine, stating that it will not take enforcement action against healthcare providers that violate HIPAA when consulting with patients remotely, so long as providers act in “good faith.”
Transferring data to third parties that may not be HIPAA-compliant creates serious risks to medical privacy. EPIC has long advocated for strong confidentiality protections for medical records. Even during a global health crisis, patient privacy and public health policy require robust protections.
- The collection and use of health data must be carefully limited, as health information can reveal intimate details of a person’s life.
- HIPAA should be updated to cover companies that collect health data.
Election Security
The COVID-19 pandemic may threaten the legitimacy of elections if voters must decide between feeling safe and participating in the democratic process. EPIC is working to ensure that the 2020 election is secure and fair for all citizens, even during a national emergency.
The safest way to ensure the proper administration of elections is to allow universal vote by mail—not online voting, which is inherently insecure. Several states have postponed primaries and expanded vote-by-mail options because of the coronavirus. And even as the world is experiencing a global pandemic, foreign actors continue their efforts to compromise our democratic institutions.
- Allow universal vote-by-mail for the 2020 election.
Law Enforcement Measures
The difficulty in enforcing social distancing during the coronavirus pandemic has led law enforcement to deploy privacy-invasive technologies.
Several police departments across the country are utilizing drones to conduct health monitoring of individuals in public and to enforce social distancing rules. Some drone companies claim that their drones are equipped with sensors and computer systems that can track large groups; people coughing and sneezing in crowds; fevers and elevated body temperatures; heart and respiratory rates; and adherence to social distancing guidelines. Other law enforcement agencies have used drones that broadcast announcements in public spaces to enforce social distancing.
The use of drone technology is invasive and poses a threat to constitutional rights. It is unclear what data law enforcement agencies may be collecting with these drones and what they plan to do with the data once collected. Drones can conduct persistent and enhanced surveillance at a distance and can collect a great deal of sensitive and personal data. The deployment of drones during the pandemic is a threat to privacy and risks normalizing the use of these enhanced surveillance systems even after the pandemic.
- Drones should not be used by law enforcement to enforce social distancing.
- Law enforcement’s response to the pandemic should not include using drones to identify, monitor, or collect information on individuals.
The pandemic has wreaked havoc in prison and jail systems. Due to limited space and inadequate medical care, advocates have urged jurisdictions to release as many people from correctional facilities as possible to mitigate the spread of the infection.
Attorney General William Barr announced a Prisoner Release Plan that included algorithmic risk scores as a factor in prioritizing which prisoners are released. Analyses have shown that the system consistently estimates risk for white inmates at a lower level than for Black inmates. The plan lays out six “non-exhaustive” and “discretionary” factors but does not indicate how heavily algorithmic risk scores are weighted in prioritizing the health of some inmates over others. EPIC advocates for algorithmic transparency, fairness, and accountability when risk assessments are used.
Entities using risk assessment tools to prioritize offenders for detention or release should:
- Articulate the extent to which the results of the tools will be used in decision-making.
- Publish the developer of the tool, its stated purpose, and the data it collects.
- Delete scores and input data following their use, and only use the data for the specific stated functions.
- Only use static factors that do not require interviews for the tool.
- Publish anonymized results of periodic, independent, localized validation studies that evaluate the efficacy of the tool in light of its stated purpose and analyze any disparate impacts based on race, age, ethnicity, gender, sex, or other protected classes.
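A validation study of the kind described above would, at minimum, compare score distributions across protected classes. The following is a minimal sketch of such a check; the group labels and scores are hypothetical data invented for illustration, not results from any actual risk assessment tool.

```python
from statistics import mean

# Hypothetical anonymized validation records: (group, risk_score).
# In a real study these would come from the tool's audited outputs.
records = [
    ("group_a", 3), ("group_a", 4), ("group_a", 2),
    ("group_b", 6), ("group_b", 5), ("group_b", 7),
]

def mean_score_by_group(rows):
    """Average risk score per protected-class group. A large gap
    between groups is a signal of possible disparate impact that
    warrants deeper statistical analysis of the tool."""
    groups = {}
    for group, score in rows:
        groups.setdefault(group, []).append(score)
    return {g: mean(scores) for g, scores in groups.items()}

print(mean_score_by_group(records))
```

A simple mean comparison is only a first screen; a full validation study would also test whether score differences persist after controlling for legitimate static factors.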