New York Army National Guard at a COVID-19 mobile testing center, New Rochelle, NY, March 14, 2020. CREDIT: New York National Guard (CC), https://commons.wikimedia.org/wiki/File:New_York_National_Guard_(49667734346).jpg

COVID-19 and the Future of Health Data, with Mona Sloane

May 5, 2020

Contact tracing and the collection of health data may be necessary for life to return to "normal" amid the COVID-19 pandemic, but is there any way to ensure that these practices do not become "tools of oppression"? Mona Sloane, a fellow at New York University's Institute for Public Knowledge, discusses her concerns about the "normalization" of these technologies and the effect these strategies could have on vulnerable communities.

ALEX WOODSON: Welcome to Global Ethics Weekly. I'm Alex Woodson from Carnegie Council in New York City.

This week's podcast is with Dr. Mona Sloane, fellow at NYU's Institute for Public Knowledge. She’s also a fellow with NYU's Alliance for Public Interest Technology, and a Future Imagination Collaboratory Fellow at NYU's Tisch School of the Arts.

Mona and I spoke about contact tracing and the collection of health data in the context of the COVID-19 pandemic. We discussed current privacy issues related to this and concerns about how this data could be used in the future, especially as it relates to vulnerable communities. We also spoke about the differences between the United States and Europe and the effect that the General Data Protection Regulation, or GDPR, may have on data collection in the European Union.

For more from Mona, check out her April 1 Daily Beast article "Today's COVID-19 Data Will Be Tomorrow's Tools of Oppression," co-written with Albert Fox Cahn.

And for more on health data and contact tracing, you can check out Carnegie Council’s recent webinar "Health Data, Privacy, & Surveillance: How Will the Lockdowns End?" That was with ETH Zurich's Effy Vayena and Johns Hopkins' Jeffrey Kahn.

For now, calling in from Brooklyn, here's my talk with Dr. Mona Sloane.

Mona, thank you so much for talking this morning. I'm looking forward to this.

MONA SLOANE: Thank you so much for having me.

ALEX WOODSON: I wanted to speak about an article you wrote a month ago for The Daily Beast, "Today's COVID-19 Data Will Be Tomorrow's Tools of Oppression." You wrote that along with Albert Fox Cahn.

The article talked a lot about privacy issues related to contact tracing and health data. I'm sure things have changed in the month since it came out. Where are we now with contact tracing and health data as it relates to the pandemic?

MONA SLOANE: That's a really tricky question. On the one hand, a lot of things have changed, as you said. On the other hand, they haven't. Obviously the pandemic has accelerated, and the pressures of having to manage that are increasing by the day. We have had Apple and Google team up to provide the technological infrastructure for contact tracing for governments, and we have more data on what works and what doesn't.

I would say, though, that the issues we raised and the questions we asked in the article are still there and very important—the questions, for example, around what kind of data is collected and how, what it means, and what kinds of tools and models will come out of it that can actually become new tools of oppression. Those can end up exacerbating inequalities that we already have and that the pandemic is also exacerbating as we speak.

ALEX WOODSON: I would like to speak specifically about some of these issues. What are the specific privacy concerns you have for people today related to health data and contact tracing right now, for the next month or so, the next two months, as we get into contact tracing and maybe end lockdowns in some of the states?

MONA SLOANE: Obviously the most important concern is the question of what data is collected on your health status, but what counts as health data is also, I think, a very important question that we don't talk about enough. For example, when you think about the advances we have in precision medicine, where the goal is to have individualized health management plans, a lot of different kinds of data become health data—how often you exercise, what you eat, where you live, and these kinds of things.

One of the concerns that I do have is that we're going to see a normalization of the collection of this data as part of collecting health data, and then we're going to see more and more models that infer these kinds of data. That could become an issue in and of itself.

ALEX WOODSON: Just to expand on that point, one of the interesting lines in your article says, "Suddenly what movies you watch, where you travel, how you commute to work, and where you eat go from being consumer data into a metric of your COVID-19 exposure." Some of that's a little obvious—how you commute to work; if you take the subway, you might be more likely to get COVID-19—but how do the movies you watch influence your COVID-19 exposure, for example?

MONA SLOANE: That is an example that we use. Obviously movies cannot cause COVID-19, and you cannot tell from the movies someone watches whether they have COVID-19. What we mean by that is that we're already seeing very fine-grained profiling happening. We have seen it pop up as part of what Shoshana Zuboff calls "surveillance capitalism," after the burst of the dot-com bubble and the growth of targeted advertising.

What we mean is that we have increasingly sophisticated prediction models that use large amounts of data to infer the probability of a certain event occurring, and that creates links between things that we as humans wouldn't necessarily see when we look at the complexity of the social world. This sort of fine-grained profiling can extend into all kinds of aspects of social life, and we used the example of movies.

Where you live is a very well-known example: ZIP code is related to income, which is related to health status, and so on. That can be expanded into questions around cultural, social, and economic distinctions, and entertainment choices fall into that.

ALEX WOODSON: I assume a lot of this was already in place before the pandemic. How much of the infrastructure to create this health data is being built out now, and how quickly is it being built out?

MONA SLOANE: I think the infrastructure is, as you say, already in place and largely embedded. What we're seeing at the moment is a normalization of these surveillance infrastructures among government agencies and governments, and at a point in time when, over the last two to three years, we were actually seeing a so-called "techlash," where governments finally introduced new regulation and legislation related to privacy and data protection, for example with the General Data Protection Regulation (GDPR) coming into force in May 2018.

We could anticipate that we are going to see a reverse trend as part of the pandemic. So the critical tech community is actually debating whether COVID-19 is the end of the techlash and whether the work that has been put into raising these issues is all going to be gone because of that.

But I think what we're missing in this discourse is a more holistic view. What we're seeing—and we're seeing it in all kinds of situations and issues—is a fixation on a technological fix to a very complex social, economic, and political problem, which is the spread of the pandemic.

We talk a lot about contact tracing. It's in the news almost every day, but what we're not talking about are the measures that need to go hand in hand with contact tracing so that it actually makes a qualitative difference.

Contact tracing in and of itself is not going to prevent the spread. We also need a system whereby people can actually stay at home, whereby people who are particularly vulnerable can stay at home without losing their livelihood and losing their jobs. We need contact tracing to work in conjunction with widespread testing. In other words, if there is no widespread testing available, then contact tracing won't work. So I'm very concerned that the fixation on the technology is actually going to distract us from these other very important infrastructural and political problems that we need to address if we want to manage the spread of the virus effectively.

ALEX WOODSON: Definitely. I want to get into that a little bit later, but first, you mentioned the GDPR. I would like to talk about that just in the context of how contact tracing will be or is different in the United States as compared to Europe. Obviously the GDPR is for the European Union. The United States doesn't have that same type of privacy law. What does that mean? I know that the GDPR specifically mentions measures for a pandemic. I think you mentioned that. What does it mean when it comes to how these processes go forward in the United States as compared to Europe?

MONA SLOANE: I think that remains to be seen, to be honest. I don't think we have an answer to that just yet. There are a number of things, though, that we can observe and that we need to state very clearly. One of them is that the GDPR is a very comprehensive data protection framework, a sort of omnibus framework that addresses a whole lot of questions in one regulation and thereby extends across different sectors. It's not a sector-specific approach.

What is interesting about the GDPR is that it explicitly—and you mentioned that I have said this before—addresses questions around a pandemic. There are a number of interesting points or articles in the GDPR that become relevant for questions around data protection in the pandemic.

One of them is Article 6, which says that processing necessary for the performance of a task carried out in the public interest is lawful and takes precedence over individual data protection. Then we can ask, what is the public interest? We could actually argue that containing the virus very clearly is in the public interest.

Another example is Recitals 52 and 54, which talk about the control of communicable diseases and the need for data collection and sharing in that context. It's interesting how this regulatory framework already made space for these kinds of considerations. Let's not forget that this was in the making for 10 or 15 years, so the GDPR had been stewing for quite a while before it actually came into the world.

In addition to that, there are, I would argue, cultural differences between Europe and the United States, whereby European regulators and European publics are more sensitive to questions around privacy and data protection. That means the discourse is held in a different way and that governments are under pressure to consider only technologies that set out to protect individual privacy.

Mind you, though, there is one weak point of the GDPR, and Professor Sandra Wachter at the Oxford Internet Institute does fantastic work on it: the GDPR focuses very heavily on data protection in the context of data collection and consent. Once you consent to your data being collected, it is stored, and then inferences can be drawn and models can be built, and the GDPR does not cover that.

It's actually these kinds of models that can then inflict structural harm because, for example, they predict—and this is a fictional example—that racial categories correlate with the likelihood of certain health conditions, for example diabetes, and that can then be used to make decisions about insurance policies and so on.

ALEX WOODSON: The title of your article is "Today's COVID-19 Data Will Be Tomorrow's Tools of Oppression." What do you see specifically when you look into the future, maybe a couple of years down the road, when this contact tracing and these different thoughts of privacy are a little bit more entrenched in society?

MONA SLOANE: I think I can answer that by saying what my worst fear is and what my biggest hope is. My worst fear is what we put into the title of the article: that these kinds of invasive data collection practices and surveillance strategies are being normalized. We do know that historically they have affected marginalized communities far more than affluent communities. My fear is that we're going to see a further normalization of the oppression we're already seeing—especially in the United States—and that this becomes ingrained in government policies.

That said, my biggest hope is that we take this moment of the pandemic as an opportunity to fundamentally rethink the policies we currently have in place, that we fundamentally and decidedly address issues of social inequality and division at the policy level here in the United States, and that we take seriously that the picture couldn't be any clearer: something needs to happen, there is no other choice, and technology alone is not going to solve it. So, here we go.

ALEX WOODSON: Are you leaning toward one of those scenarios at the moment? Which one do you see as more likely?

MONA SLOANE: Even Apple, for the first time in a decade, didn't make a prediction in their investor call. This is a crisis of prediction, I would say.

I couldn't tell you. I think as always we're going to see a mix and a lot of tension come up. Let's not forget that this is an election year also, and this is a very, very tense time.

But I will say that I am very hopeful. I will also say that these surveillance strategies are already in place. They are happening, primarily somewhat behind a corporate curtain. What we're seeing just now is that they are moving very publicly into the public eye and into government services. They were there anyway, but now it's on a large scale. I just hope we're not going to drop the ball and that we're going to take this issue as a cue to fundamentally rethink how certain things are done.

I couldn't tell you. I wouldn't want to make a prediction.

ALEX WOODSON: Of course. Maybe a different way of asking it is, do you see a good level of conversation about this? Do you see the right people paying attention to these questions?

MONA SLOANE: Yes, but then I'm also biased because I'm an academic, and I'm fortunate enough to be in a professional community where these kinds of things are heavily debated and have been for a long time.

But what I'm really seeing is that the public, by living digital lives now, is rapidly gaining a kind of digital literacy and is part of that discourse as well right now, which I find very encouraging. So I do hope we will end up in a situation where we have a democratization of the discourse going on because of the circumstances that we are in.

ALEX WOODSON: One more question about the future before we get to some of the ethical discussion.

When you say that these will be "tomorrow's tools of oppression," you mentioned how vulnerable communities will be affected. What specifically do you worry about happening in the future to communities based on giving away their health data?

MONA SLOANE: I do worry about things like these models being implemented—and they already are—in government services, and about these automated tools affecting who gets health care, who gets benefits, whose child is taken away. We do know that these communities interface with these systems more frequently and therefore are more affected. These can end up becoming questions of life or death. That is very serious. I would just point you to the great work of Virginia Eubanks, who has written about this quite a bit.

I want to make very clear that this oppression—and again, I also want to say that I am a privileged white person talking about this on behalf of those communities, and I do think it's important that we hear the communities giving their expertise and their lived experience—has always been there. We are just seeing it reflected in the automated tools that we're using. Perhaps the word "just" was not the right word here, but it's a continuation of social structures that have been in place for a very long time.

The difference now is that we're seeing an acceleration because these tools are designed for scale. Automated tools and models are built for scale and scaling, and this is where the problems escalate; all of a sudden a lot of people are severely affected.

ALEX WOODSON: To move on to some of the ethics and principles that you were speaking about before: obviously the goal of contact tracing and of collecting health data is one we all share, to end the pandemic, to eradicate COVID-19, so at some level this has to be done. How can we do this ethically? How can the collection of health data be made more ethical?

MONA SLOANE: That's a tricky question, because you would have to tell me what you mean by ethical, and that's the crux of the matter. There is no clear-cut definition of ethics unless you delve very deeply into, for example, works of moral philosophy, and ethical behavior is not something that is regulated or enforceable by law. It's a very tricky term that has, over the past few years, been used by industry and increasingly by policymakers to circumvent harder questions around regulation and the more important aspect of ethics, which is the ethics of practice as opposed to ethical machines.

ALEX WOODSON: If someone were asking you for your advice on how to build these systems so that vulnerable communities can't and won't be targeted in the future, what would you say?

MONA SLOANE: I would say that this is an honorable goal, but it's perhaps not something that can be achieved. As I said, these tools are deeply embedded in our social world and in how we organize it, and as such they are part of the social divisions that we operationalize every day. So we cannot think about making these technologies fair without making society a little bit more equitable and fair. We can't think about or do one without the other.

What we can do, though—and I would again say it's never that clear-cut; everything is a little bit messier and more complex—is focus on certain things specifically, such as issues around privacy. Privacy is something that we can, to a large extent, operationalize technologically, as are how we draw inferences, how predictive tools are actually used, where and in what context, and how we make those transparent.

There is a lot of work that is currently being done in that space as part of what is called the FAccT—Fairness, Accountability, and Transparency—research agenda. There are a lot of very interesting interdisciplinary research projects happening at the moment.

I think what we're seeing there is an increasingly important discourse, but then again this sort of discourse and these interventions are kind of superficial if they do not also look at innovation in the practice of designing these tools. If these communities are not at the table, we cannot expect these tools to work for them. That's where we need to start. And that's very challenging at a point where governments are under pressure to manage the pandemic.

But again, as I said, the tools in and of themselves are not going to contain the spread at all. It's all about policy intervention and particularly equitable policy intervention.

ALEX WOODSON: What would you say to people who might be okay with a little bit of oppression right now or allowing the government to know things about their health that maybe they wouldn't have allowed them to know a few months ago? Do you have trouble with someone having that kind of attitude? Do you have trouble with millions of people having that kind of attitude about these issues right now?

MONA SLOANE: I'm not quite sure if I follow what you mean by oppression in this context, or whether you actually mean surveillance and data collection, because those are not the same. Could you clarify?

ALEX WOODSON: The idea that the government might require you to download an app that can track where you are, that says you need to be quarantined for 14 days; if you leave, you can get fined; if you leave, authorities will come to your home. That's what I mean about some of these tools. Some people might welcome something like that right now. We're both in New York City right now. Maybe people should be staying home more than they are. But that's kind of what I'm going for.

MONA SLOANE: Of course. But then again, as a social scientist and especially as a sociologist I would then complicate that narrative a little bit and say, "Well, that's all well and good that there are these intentions by government to manage people's movements and therefore the spread of the virus," but we have to also acknowledge that certain people and certain demographics can afford to stay at home, and others simply cannot; that certain demographics are likely to have health insurance and contact a doctor when they feel ill and others aren't; and also that certain demographics are at much higher risk of contracting the virus because they're not, for example, given protective equipment in their workplaces, but they have to go in because they're essential workers. So I do think we cannot have the conversation about contact tracing and mandatory quarantine without acknowledging that and without putting in place strategies that mitigate that.

For example, Taiwan implemented very early contact tracing and actually contact tracing combined with travel data. What happened there was that people were put into mandatory quarantine that was enforced, but the government also continued to pay their salaries for two weeks. I do understand from a researcher who I heard speak at a Stanford conference about this earlier this month that they were even provided with food, so there was sort of a holistic approach in place to help people.

That is not going to eradicate generally the problem that we have with raging inequality in this country and inequitable access to health care specifically and unevenly distributed exposure to risk, especially health risk, but that is a little bit of a different approach in and of itself. Yes, we can all be for contact tracing and saying, "Yes, we should all stay at home," but some people cannot stay at home, and those are people who are often minority groups and vulnerable anyway. This is where contact tracing is not going to solve that and is also perhaps not necessarily even going to help contain the spread, and we haven't even gotten into the technological issues of contact tracing approaches that are proposed at the moment.

ALEX WOODSON: Let's finish off with something that I talked about with Arthur Holland Michel about. You were supposed to be on a panel with him a few weeks ago at Carnegie Council, and hopefully we can get that back together when times are a little different.

We were talking specifically more about surveillance and drone technology and that type of thing. We were talking about how that will continue into the future and how we can make sure that when something is put in place for the pandemic it will end when the pandemic is over, talking about sunset clauses and things like that. One thing he mentioned that he was more worried about was how the prevalence of these technologies—drone surveillance, contact-tracing apps, and things like that—will change the public discourse around the technologies.

I think you have said a few times that we're normalizing these technologies, so I was wondering if you could expand on that a little bit. What are some of your concerns about how the public thinks about these technologies and how that will continue long after the pandemic is over?

MONA SLOANE: I would actually agree, and I think the point we were just talking about before when you asked the question of whether we shouldn't accept a little privacy invasion for the sake of the greater good, as it were.

Yes, I do agree that a big risk is the normalization in the public discourse. Again, as I said, at a point in time where we actually had reached a point where we had a mounting critical discourse on surveillance and invasive data collection and practices and also exploitative and oppressive data tools and models, I think the risk is that, yes, this is normalized but also that we are seeing a normalization of these tools in government services. But then again, this is already happening. We have already seen that. This is actually not necessarily something new.

What I would hope—and I have said this before, and it's what I am actually seeing—is that are gaining a little bit more data science literacy as a public through these kinds of questions that are coming up. We're all reading the tech news. We all are very rapidly gaining a much better understanding of how these technologies work and what the consequences could be. So I do hope that we might end up in a place where we can have a much more informed and much more democratic conversation about this so we can push back against the risks of the normalization of these technologies.

ALEX WOODSON: Absolute final question: I'm just curious. What specifically will you be working on and looking at over the next few weeks or the next month or so? Are you focused mostly on issues related to the pandemic, or are you able to continue some of your pre-pandemic work as well?

MONA SLOANE: That's a very good question. My work continues. My critical work on ethics and artificial intelligence in the context of design and policy definitely continues, for example, with these kinds of questions around contact tracing, but I'm also fortunate in that I have been able to launch a couple of new projects in relation to the pandemic. One is a project that I'm running with Professor Charlton McIlwain, who is the vice provost for faculty engagement at NYU, and Michael Weinberg, who is the executive director of the Engelberg Center at the NYU Law School, around assessing what kind of public interest technologies actually worked in the pandemic and what kinds of technologies worked for what kinds of communities. This is a larger project that will run for a longer period of time.

Our notion of technology within that includes hardware—so, ventilators and hardware innovation—but also data science interventions and visualizations and new software as well. That's a big research project I'm launching as part of the NYU Alliance for Public Interest Technology.

Another new project I have launched together with Professor Eric Klinenberg at NYU, who is a professor of sociology and also director of the Institute for Public Knowledge, where I'm based, and Eli Pariser, who is the co-director of Civic Signals, is a new series called "The Shift," where we're going to bring together thought leaders around different issues to discuss the social implications of our lives moving online and providing a space for the kinds of questions that we need to be asking as part of that and the kinds of questions that we should hold onto as we hopefully eventually resurface from quarantine. We're going to have the first one next week on public space and what it means that we have a loss of physical public space and we're seeing the emergence of public space online.

As a sociologist, I can't complain because there's always stuff that I can look at and work on.

ALEX WOODSON: Great. To our listeners, we are providing links to all of those projects that you mentioned. You have given us a lot to think about. It sounds like you're incredibly busy, and we'll be following you. Thanks a lot, Mona.

MONA SLOANE: Thank you so much, Alex.

También le puede interesar

FEB 9, 2022 - Podcast

¿Dónde está la plaza pública en la era de la información digital? con Stelios Vassilakis

En este episodio del podcast "Iniciativa Inteligencia Artificial e Igualdad", la Senior Fellow Anja Kaspersen y el Presidente de Carnegie Council Joel Rosenthal se sientan ...

APR 5, 2022 - Podcast

IA y procesos colectivos de toma de decisiones, con Katherine Milligan

En este podcast "Inteligencia Artificial e Igualdad", la Senior Fellow Anja Kaspersen y Katherine Milligan, directora del Collective Change Lab, exploran lo que podemos aprender de ...

9 DE DICIEMBRE DE 2021 - Podcast

Ética, gobernanza y tecnologías emergentes: Una conversación con la Iniciativa Carnegie para la Gobernanza del Clima (C2G) y la Iniciativa para la Inteligencia Artificial y la Igualdad (AIEI)

Las tecnologías emergentes con impacto mundial están creando nuevos espacios no gobernados a un ritmo vertiginoso. Los responsables de las iniciativas C2G y AIEI de Carnegie Council debaten...