This article is part of a VB special issue. Read the full series: AI and Surveillance.

After saving lives, the most urgent — and hotly debated — problem facing government policymakers in the age of COVID-19 may be how to strike a balance between privacy and public health. The fast-moving and unprecedented story around surveillance tech highlights a long-delayed push for comprehensive consumer data privacy laws, even as privacy advocates grudgingly agree that governments may need to suspend some civil liberties during the pandemic. It’s about a global scramble to stop the spread of COVID-19 and get everyone back to work — without killing privacy or a lot of people in the process.

At a moment when people are giving up rights in exchange for public safety, it’s tough to say what ideal privacy protections even look like. So we spoke with privacy advocates about the kind of legislation we need in a global public health emergency and the types of surveillance both Congress and privacy advocates want to avoid.

The World Health Organization (WHO) says artificial intelligence and big data are an important part of the pandemic solution. Data from smartphones, mobile apps, fitness trackers, and other sources can be fed into predictive AI models to help public health officials evaluate the risks and potential impact of efforts to contain the virus or build up defenses for the next wave.

But the public is rightly suspicious of tech solutionism right now. Around the world, businesses and democratic societies are trying to weigh the tradeoffs between what some fear will be the rise of police states and others hope will allow people to return to some semblance of normal life. Either way, the impact of contact tracing solutions and aggregated data — even if anonymous — will cast a long shadow.


“I think it’s important to consider that the inertia of these kinds of things, that the decisions we make now, will impact privacy for years to come,” Northeastern University professor of computer science and law Woodrow Hartzog said last week in a conversation about privacy and COVID-19.

Dueling privacy legislation

Recent attempts at legislation could prove instructive for U.S. privacy advocates. This spring, the state of Washington, home to Amazon and Microsoft, fell short of passing its own privacy law for the second year running, due to an impasse over whether to allow individual people to take direct legal action. Now U.S. lawmakers seek national legislation to address lingering privacy problems in the age of COVID-19. Two dueling bills are underway, but they’re entirely partisan.

Late 2019 brought an opportunity to pass nationwide consumer data privacy legislation, akin to the GDPR in the European Union or the California Consumer Privacy Act (CCPA) in California. This would determine how companies like Amazon, Apple, Facebook, and Google share, collect, and treat users’ data. In November 2019, Democratic leaders in the Senate put together a set of privacy principles. Senate Commerce Committee ranking member Maria Cantwell (D-WA) then introduced a federal bill called the Consumer Online Privacy Rights Act (COPRA) that would establish protections similar to those afforded by GDPR.

On May 7, 2020, four Republican U.S. senators, led by Roger Wicker (R-MS), responded with their own bill, dubbed the COVID-19 Consumer Data Protection Act (CDPA), that would place rules on how businesses use data during the health emergency. It was officially introduced the next day.

CDPA would require businesses to obtain consent for proximity and location tracking and to employ data minimization measures, like encryption or anonymization, to protect data. And it would force companies to delete collected data once the crisis is over.

The bill, which would supersede or prevent the passage of state data usage laws, also requires transparency reports. Should CDPA pass the Republican-led Senate, it’s unclear what amendments the bill would undergo in the Democratic-majority House of Representatives. One critic of the bill called it deregulation in disguise, and it currently lacks the Federal Trade Commission (FTC) support needed to enforce it.

So what’s the difference between the bills? According to an analysis by Brookings Institution visiting fellow Cameron Kerry, both take inspiration from GDPR and CCPA and include:

  • Privacy rights for individuals
  • Consumer access to modify or delete data
  • Data minimization to prevent collection of personally identifiable information
  • More FTC enforcement powers
  • Algorithmic bias studies
  • Rules for how businesses collect and use personal data

As with facial recognition, privacy concerns elicit a fair amount of agreement between Democrats and Republicans. But the bills differ in the way they define injury from harmful data practices and how they view potential recourse. For instance, COPRA gives individuals the right to file suit for data privacy injury claims, while CDPA includes no such protection.

The bills also approach discrimination in algorithmic decision making in different ways. COPRA requires annual data assessments and explicitly prohibits algorithmic discrimination in housing, employment, lending, and education, while CDPA suggests that instances of discrimination be referred to state agencies. While CDPA contains some elements of the GDPR and CCPA that were put forward in late 2019, Hartzog considers the bill less than original.

“It’s the same playbook we’ve been borrowing from for years, as though we somehow solved privacy and data protection before 2020,” he said. “This is a really great opportunity to use some legal imagination for a problem that in my opinion was not solved before the pandemic.”

Had senators acted when the opportunity presented itself months earlier, we might now have a clearer understanding of how businesses should treat data during both normal circumstances and emergencies. Legislation like GDPR was not designed to regulate the use of anonymized data, but it prompted EU member states to commit to using only anonymized and aggregate mobile phone location data and to deleting the data once this public health crisis is over.

Remote ‘paper hearing’ testimony

Last month, the Senate Commerce Committee brought together advertising, technology, and privacy experts to discuss how big data and AI can join the fight against coronavirus and shape surveillance options. Testimony before this group of 25 senators could shape data privacy laws, expanded government powers, and surveillance in the age of COVID-19. Additionally, should initial efforts fail to slow the spread of the disease, the kinds of data and approaches discussed in the hearing could inform increased surveillance efforts. These matters may also influence reforms underway at the Centers for Disease Control and Prevention (CDC), such as how to use the $500 million for public health data surveillance and infrastructure modernization included in the CARES Act, which passed in March.

Like much of the country, members of Congress have mostly worked remotely in recent weeks, and, as House Speaker Nancy Pelosi put it, Congress can’t just jump on Zoom. So the big data discussion took place through an unprecedented process called a “paper hearing.”

Instead of back-and-forth exchanges broadcast live on TV and online, the committee chair, ranking member, and expert witnesses shared opening testimony on the committee website. Then the committee asked experts questions and gave them days to respond. As a result, the paper hearing produced hundreds of pages of testimony.

In opening testimony, committee chair Wicker said data collection during the pandemic “underscores the need for uniform, national privacy legislation.” Meanwhile, ranking member Cantwell called for rules around anonymized data sharing during the pandemic, including elements like measurable outcomes, an end date, and strong consumer rights protections.

When anonymized, big data can be useful to public health policy leaders, who have to decide where containment is needed, predict when and where the next outbreak will occur, or discover COVID-19 clusters. Health officials can use such anonymized data to build noninvasive, powerful tools.

One of the experts who testified at the hearing is University of Washington School of Law professor and UW Tech Policy Lab co-director Ryan Calo. Along with other witnesses, he testified that aggregated data can and should inform health policy where appropriate.

In a recent, illuminating piece of research, partners including the United Nations and World Health Organization cataloged the ways AI can aid in the fight against COVID-19. Citing the study, Calo testified that big data and AI initiatives for fighting COVID-19 should take place in three ways: epidemiology and clustering on a societal level; COVID-19 diagnosis on a clinical level; and things like protein structure prediction, drug repurposing, and work on drugs and vaccines on a molecular level.

Avoid mission creep

In his testimony, Calo stressed that a sunset clause is necessary to avoid secondary use of data and protect against mission creep. Perhaps the best known example of this not happening is the Patriot Act, passed in the wake of 9/11.

“Americans and their representatives should be vigilant that whatever techniques we use today to combat coronavirus do not wind up being used tomorrow to address other behaviors or achieve other goals,” Calo said. “To paraphrase the late Justice Robert Jackson, a problem with emergency powers is that they tend to kindle emergencies.”

Federal agencies should have a clear reason for collecting information, Calo said, and he believes judicial oversight and FTC Act enforcement should be part of the plan. This is in line with privacy legislation from Cantwell and Wicker, who both advocate for a more powerful FTC to enforce data privacy laws. In response to a question from Sen. Richard Blumenthal (D-CT), Calo said, “In my view, nothing short of federal privacy legislation that contains concrete safeguards against violating the privacy expectations of consumers and empowers the Federal Trade Commission, state attorney generals, and (ideally) individual litigants to police against abuse will be adequate.” CDPA, again, does not include power for individual litigants to bring lawsuits for data misuse.

The Center for Democracy and Technology’s Privacy and Data project director Michelle Richardson testified that, except for researchers who use data to prepare for future outbreaks, companies and governments should destroy collected data once the health crisis has passed. Richardson joined Calo and other experts in urging Congress to stipulate that the government explicitly state any intended use before obtaining data from businesses.

Evaluate contact tracing apps

Beyond laments over the lack of existing data privacy law, issues around contact tracing apps took center stage. For context, the hearing started on April 9, a day before Apple and Google announced their unprecedented partnership to create interoperability between their popular mobile operating systems to power COVID-19 contact tracing apps. Such apps may include decentralized proximity tracking and alerts if users have crossed paths with an infected individual, alongside a range of other resources, such as where to get tested or seek medical attention and how to self-report COVID-19 symptoms or diagnoses.

Meanwhile, countries around the world are launching their own contact tracing or proximity tracking apps. In places like Singapore and South Korea, tracking app results are delivered to government officials, who promptly round up people who violate strict quarantines.

According to testimony from the Future of Privacy Forum senior counsel Stacey Gray, privacy-preserving apps are typically voluntary and decentralized, have transparent source code, and process information locally on devices. A decentralized app keeps all your data, and the processing of that data, on your phone. With a centralized approach, information is stored in a central repository.
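The decentralized model Gray describes can be illustrated with a toy sketch. This is hypothetical code, not DP3T or the Apple-Google protocol; the `Device` class and its methods are invented for illustration. Each phone broadcasts rotating random tokens, logs the tokens it overhears, and checks locally against a published list of tokens volunteered by confirmed-positive users:

```python
import secrets

class Device:
    """Toy model of a decentralized proximity-tracing phone.

    All identifiers and observations stay on the device; a health
    authority only ever publishes the tokens of confirmed cases.
    """

    def __init__(self):
        self.my_tokens = []       # rotating random IDs this device broadcast
        self.seen_tokens = set()  # IDs overheard from nearby devices

    def broadcast(self):
        # Emit a fresh random token (real protocols rotate these every
        # few minutes and derive them from a periodically changing key).
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def observe(self, token):
        # Tokens heard over Bluetooth are stored locally, never uploaded.
        self.seen_tokens.add(token)

    def check_exposure(self, published_infected_tokens):
        # Matching happens on-device against the public list of tokens
        # voluntarily uploaded by users who test positive.
        return bool(self.seen_tokens & set(published_infected_tokens))


# Two phones pass each other; later one user tests positive.
alice, bob = Device(), Device()
bob.observe(alice.broadcast())        # contact event, logged locally
published = alice.my_tokens           # Alice uploads after diagnosis
print(bob.check_exposure(published))  # True: Bob learns of exposure on-device
```

Because matching happens on the phone, the authority never learns who was near whom; in a centralized design, by contrast, the repository operator can reconstruct the contact graph.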

Privacy advocates from the EFF to the ACLU endorse decentralization over centralization. Multinational groups of tech, privacy, and health experts are developing apps at Stanford and MIT, and the University of Washington is working with Microsoft volunteers on a privacy-focused app called CovidSafe.

An example Gray endorsed is DP3T, from a pan-European proximity tracing project. DP3T, like the joint Apple-Google approach, uses decentralized Bluetooth signals for contact tracing — an approach approved by European Union Commissioners and several privacy advocates. Another version, MIT’s Private Automated Contact Tracing (PACT) protocol, which combines GPS and Bluetooth, was endorsed by the Centers for Disease Control and Prevention (CDC) in preliminary guidelines for contact tracing apps released in late April.

Decentralization attempts to eliminate the possibility of a single attack exposing all app users, a major concern in an era of frequent data breaches. That’s why U.K. government officials who chose a centralized app that collects location data are now facing a public backlash. Apple and Google made a point of emphasizing that apps using their Bluetooth solution may not use location tracking. The Financial Times reported that the U.K.’s National Health Service (NHS) is currently testing a decentralized app that uses the Apple-Google service.

Adoption will be a major hurdle for these apps. European Union officials predict adoption rates will need to reach 60% in order for contact tracing apps to be effective. An Apple spokesperson talking with reporters in the days after the company’s contact tracing announcement said Apple doesn’t know what an effective threshold will be because it’s never done anything similar before. As a way to maximize adoption, Apple and Google recently announced that just one app per country can use the interoperability API.

The first stable version of Apple and Google’s interoperability API is due out later this month, but a lack of trust may hurt its chances of long-term adoption. A Washington Post-University of Maryland poll found that a majority of U.S. citizens are unwilling to download a contact tracing app made by Apple and Google. However, the same poll demonstrates that people put greater trust in public health authorities, and Apple and Google will make their interoperability API available only to public health officials. In the coming months, Android and iOS will automatically broadcast and record Bluetooth Low Energy contact events so people who test positive can still download an app and warn others, but Apple and Google say no information will leave a device without user permission.

Calo commends decentralized Bluetooth contact tracing apps for being privacy conscious, but he worries about their limitations. Foreign jurisdictions that have been successful in containing the coronavirus, he said, have used widespread testing and contact tracing, imposed mandatory quarantines, used digital badges to prove immunity at checkpoints, and employed other approaches that take a heavy toll on civil liberties.

“I believe that individual surveillance for purposes of combating the spread of COVID-19 will inevitably involve some trade-offs to privacy and civil liberties if they are to be effective. Such trade-offs may be worthwhile, assuming we safeguard privacy and civil liberties by promoting accountability and limiting mission creep or secondary use of that data. But trade-offs will exist,” he said. “Nevertheless, should the United States decide to follow the lead of these foreign jurisdictions, we can and should put safeguards into place, such as judicial oversight, and express limitations on secondary use and mission creep.”

He went on to argue that it’s fair to question whether apps created quickly by small teams, like MIT’s PACT-based Private Kit: Safe Paths app, will keep sensitive information secure. And he expressed doubt about the feasibility of balancing everyone’s interests. “I believe voluntary, self-reported, and self-help approaches to digital contact tracing such as MIT’s are likely to prove ineffective and could perhaps do more harm than good,” he said.

Another serious issue is that there’s no way for an app to detect asymptomatic carriers. In recent testing of roughly 3,000 people in San Francisco, more than half of those who tested positive for COVID-19 were asymptomatic, and 90% of those positive cases were people who had to leave home for work. In March, the WHO reported that 80% of cases are mild or asymptomatic.

Be humble and beware sabotage

Dr. Anthony Fauci, the most trusted public official in the U.S. on coronavirus matters, talked about the importance of humility at a Senate Health Committee hearing last week. Calo has also cited the need for a healthy dose of humility when applying AI to problems of global health.

Calo’s testimony touched repeatedly on the example of Google Flu Trends, which relied on anonymized data for flu forecasts. The model had some initial success, but years later it was found to be inaccurate. Google’s AI for diabetic retinopathy, which claimed accuracy above 90%, failed when used in the wild last month, offering a similar lesson.

Calo also urged caution when it comes to self-reported data and potential sabotage by adversaries. “A foreign operative who wished to sow chaos, an unscrupulous political operative who wished to dampen political participation, or a desperate business owner who sought to shut down the competition — all could use self-reported instances of COVID-19 in an anonymous fashion to achieve their goals,” he said.

Who powers COVID-19 pings?

Contact tracing apps are testing different notification approaches. In Germany, as part of a federally funded study currently underway, an app lets testing labs directly notify the close contacts of people who test positive. In other places, people who test positive are given unique notification codes to enter themselves. Future of Privacy Forum’s Gray urged Congress to ensure health care professionals approve the triggering of alerts that cue individuals to self-quarantine.

Apple and Google shared a digital identifier code option in sample code and user interface examples for developers making apps for public health agencies. However, an Apple spokesperson told reporters the Exposure Notification API won’t impose a testing confirmation requirement, so that amid testing kit shortages doctors can clinically diagnose a patient with COVID-19 and still notify contacts.

Ideally, ample testing would mean notifications are sent only after a diagnostic test verifies a person has COVID-19, but testing capacity continues to be an issue in much of the world, including in the U.S. Despite many states partially reopening or planning to reopen soon, health professionals say U.S. testing capacity must reach a million tests a day if the country is to reopen safely.

Addressing trust issues

Most data collected through consumer-facing tech providers is not currently subject to any national privacy laws, Gray pointed out, and therefore it’s vital that such companies make clear, strong privacy commitments. Gray acknowledges that data can rarely be considered truly anonymous, but National Institute for Standards and Technology (NIST) guidance on de-identification of personal information and privacy-enhancing technologies (PETs) can support privacy-conscious big data initiatives.

“Maintaining consumer trust in the use of sensitive data in a public health emergency is critical, especially when it relies on the voluntary adoption of consumer-facing apps and screening tools,” Gray said. “[C]ompanies and government agencies can help build trust through transparency, by being clear about the collection, use, and sharing of personal data, and sharing technical specifications for de-identification methods. … So not only do privacy and effectiveness not conflict, but they also depend on and reinforce each other.”

Chris Gilliard offers an example of the kind of mistrust governments and tech companies must overcome if voluntary contact tracing solutions are to succeed. Known online as Hypervisible, he’s an English professor at a community college in Michigan and grew up in Detroit. When he isn’t teaching classes, he’s tracking developments at the intersection of privacy, technology, and race. Last fall, he testified before Congress about big data, privacy, and financial services. Gilliard told VentureBeat he closely follows tech surveillance because he thinks not enough people are paying attention to how surveillance impacts the lives of black and brown people in the United States. And right now, he’s concerned with growing calls for more surveillance.

“I think this is an opportunity in the worst way possible for the government to ramp that up and use it not only against undocumented folks and things like that, but to use it more and more for common occurrences,” he said. “I mean, we already see that with Palantir, and Clearview, and on and on, but I think it could be worse.”

Palantir is developing a data platform for the U.S. Department of Health and Human Services, while Clearview AI, which was recently found to have ties with white supremacists, proposed its system — trained with billions of images obtained without permission — to state and federal governments tracking the virus. Gilliard said he doesn’t want to see facial recognition used in contact tracing efforts. Detroit is one of the only major cities in the country known to be using real-time facial recognition.

When it comes to recognizing the faces of people in masks, traditional facial recognition systems, including the kind Apple employs for its Face ID and Google for its Face Unlock, fall short — but that’s changing fast. In China, SenseTime says its technology can recognize people wearing masks. In the U.S., Rank One is currently selling facial recognition to law enforcement agencies that it claims can identify people in masks.

Electronic Frontier Foundation (EFF) senior staff attorney Adam Schwartz is also concerned about the prospect of facial recognition in COVID-19-related surveillance. The EFF joined efforts to ban facial recognition in cities around the country, starting one year ago in San Francisco, just blocks from EFF headquarters.

“We believe in a human right to evade face surveillance by wearing a mask,” Schwartz told VentureBeat. “We see in the context of COVID the same vendors [and] the same law enforcement think tanks that have always been wanting to do face recognition [saying], ‘Oh, we can COVID-wash this. We can justify further expansion of this infrastructure of surveillance because we want to take the temperature or scan the face of everyone walking by and see if they’re supposed to be at home.’ We are adamantly opposed to that.”

Back in Detroit, Gilliard said he’s concerned about a number of developments, such as repurposing house arrest tech to track quarantined COVID-19 cases and placing more prisoners under house arrest to reduce prison populations. He also thinks novel, untested tech like thermal cameras and AI that detects COVID-19 from coughs should be shelved, and he worries that solutions like digital badges for COVID-19 status verification or “voluntary” contact tracing apps could be tied to employment or interstate travel.

According to U.S. Bureau of Labor Statistics data released last week, record unemployment during the pandemic is hitting Hispanic and African Americans the hardest. Black and brown people are also more likely to die from COVID-19 than white people. Led by African American deaths, Gilliard’s hometown of Detroit has one of the highest COVID-19 mortality rates of any U.S. city, while Chicago halted some reopening efforts earlier this month to address the emergency in its black and brown communities. But Gilliard rejects the idea that surveillance will solve this problem.

“When people are clamoring for more surveillance to curb the spread of the virus and allocate treatment, as a black man who lives in Detroit, I’m really not convinced that’s going to work out to my benefit or the benefit of my community,” he said. “Whatever the other side of this looks like, I have a lot of history that says those techniques are going to be used to further harm any marginalized group, whether that’s Muslims or undocumented folks or black folks. I feel like that argument comes out of a position of privilege and a sense that institutions are in our favor and not actively hostile toward us, which is a position I do not share.”

Speaking about trust in tech giants and European contact tracing tech in a panel conversation with Hartzog last week, Italian constitutional law professor Oreste Pollicino supported Gilliard’s historical perspective. “The past counts,” Pollicino said, adding that companies must foster trust regularly, but that private digital powers haven’t always acted in a way deserving of trust. When it comes to matters of trust and surveillance tech, he said, people should “no longer believe in fairy tales.”

Tradeoffs between privacy and protection

In his Senate testimony in April, Calo said each surveillance solution to stop COVID-19 carries a measure of promise or peril — in other words, tradeoffs. Drones may be able to identify people with heightened temperatures in high-density public places, and Spot Mini robots can enforce social distancing in Singapore, but these measures can also make people feel like they are living in a police state.

Another example: Both Calo and Gray endorse businesses sharing data with governments only after it has been anonymized with differential privacy, a technique for adding privacy-preserving noise to data. But noise added to data can reduce accuracy — the more noise, the less accuracy — and hamper the ability to audit models for biased results.
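As a hedged illustration of that tradeoff (a minimal sketch, not the specific mechanism either witness endorsed), the widely used Laplace mechanism adds noise calibrated to a privacy parameter epsilon. Lowering epsilon strengthens privacy but makes each released number noisier:

```python
import random

def laplace_count(true_count, epsilon):
    """Release a count with Laplace noise calibrated to epsilon.

    A counting query changes by at most 1 when one person's record is
    added or removed (sensitivity = 1), so noise drawn from
    Laplace(scale = 1/epsilon) satisfies epsilon-differential privacy.
    Smaller epsilon means stronger privacy but less accurate data.
    """
    scale = 1.0 / epsilon
    # The difference of two exponentials with mean `scale` is
    # Laplace-distributed with that scale (or use numpy.random.laplace).
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_count + noise

true_cases_in_zip = 120                            # hypothetical case count
print(laplace_count(true_cases_in_zip, 1.0))       # noisy value near 120
print(laplace_count(true_cases_in_zip, 0.05))      # far noisier release
```

The same property that protects individuals — no single record meaningfully changes the output — is what makes heavily noised releases harder to audit for bias, since small subgroup differences can be swamped by the noise.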

Decentralized apps might be the ticket for privacy advocates, but decentralized leadership doesn’t seem like an effective approach to testing policy or contact tracing. And there are still a lot of unanswered questions. In the weeks and months ahead, we’re going to find out if economies can safely reopen (CDC estimates do not bode well), whether social distancing enforcement becomes the new stop-and-frisk in New York, and how effective contact tracing apps can be in preventing a second wave of infections. We’re going to find out which businesses survive, whether Congress can resolve privacy legislation and data sharing issues, whether stricter approaches are required, and just how much people are willing to trust Apple and Google.

To date, only two U.S. states are using contact tracing apps, but we’re going to find out which health officials in each country will choose to license the joint Apple-Google API. It’s an untested solution, though we can already draw lessons from nations bending the curve with widespread testing, traditional contact tracing, swift action, and containment.

Consider South Korea, perhaps the best success story among the world’s democratic nations. A country of roughly 50 million people, South Korea confirmed its first coronavirus case on the same day as the United States, and by the end of February it had the most confirmed cases outside China. For multiple days last week, however, the nation reported no new confirmed COVID-19 cases. Outbreaks forced bars and nightclubs to close following the recent reopening of economic activity, but the nation has seen fewer than 300 deaths.

In South Korea, as well as in Taiwan, governments were ready with emergency powers and data sharing legislation that government leaders had hammered out in the wake of the SARS and MERS outbreaks. That meant public health officials were able to take swift, decisive action that has flattened the curve and saved lives.

Practices like requiring a house of worship to turn over a membership list or imposing heavy fines (or even jail time) for breaking quarantine may seem heavy-handed to Americans. But like one collective trust fall, this crisis is laying bare the state of safety nets, institutions, and businesses around the world. It is revealing that some political leaders are willing to sacrifice workers’ lives by pushing them to return to nonessential jobs prematurely. And it casts a harsh light on the state of inequality in the world today and on deep-seated issues of power and trust.

As businesses and governments take tentative steps to restore economic and social activity, pressure to increase surveillance will almost certainly mount. This is likely to be the case until the death count drops and we achieve herd immunity or a viable vaccine. Recent quarantines for Dr. Fauci, Senate Health Committee chair Lamar Alexander (R-TN), and people close to the president and vice president make it clear that reopening the country isn’t as easy as it sounds.

Economist Paul Romer does a nice job of summing up the challenge. He won the Nobel Prize in 2018 for his work in tech and economic policy and suggests the U.S. build its testing capacity so that it’s capable of testing every U.S. citizen every two weeks as a way to restart the economy. In an interview last week with PBS, he said there’s much more at stake than health or economic activity.

“We can rebuild. We can recover income. But if we damage our institutions of rule of law, of democracy, of basic freedoms, that will take a lot longer to rebuild,” he said.

With 100,000 or more deaths so far, the United States has one of the highest COVID-19 mortality rates in the world. It will take weeks to see if lifting lockdowns causes a spike in new cases, and the weight of this medical and economic tragedy on privacy, surveillance, and our lives will grow heavier until we find an exit strategy.

As we weigh the tradeoffs between privacy and public health, those developing solutions to fight COVID-19 would do well to limit the kinds of surveillance that could undermine civil liberties for decades to come.
