Emerging Technology and the War in Ukraine, with Arthur Holland Michel

June 30, 2022 - 38 min listen

In this Global Ethics Review podcast, Senior Fellow Arthur Holland Michel discusses facial recognition systems, loitering munitions, and drones in the context of Russia's invasion of Ukraine, and analyzes their use on the battlefield and in the broader narrative of the conflict. As Russia's tactics become increasingly brutal, even as it relies on more traditional weapons, what effects are these technologies actually having on the war?

ALEX WOODSON: Welcome to Global Ethics Review. I'm Alex Woodson from Carnegie Council, the world's catalyst for ethical action.

For this episode on emerging technology and the war in Ukraine, I’m speaking with Carnegie Council Senior Fellow Arthur Holland Michel. He's the founder of the Center for the Study of the Drone at Bard College and was its co-director from 2012 to 2020. More recently, he served as a researcher on the security and technology team at the United Nations Institute for Disarmament Research. He’s the author of Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All.

In this podcast, Michel and I focused mostly on facial recognition systems and loitering munitions and drones in the context of Russia's 2022 invasion of Ukraine. These technologies are not necessarily new. But the way they are being used today, in the first major interstate war in Europe in decades, has been the subject of much discussion in the intelligence community. These systems have also played a major role in the narrative of this war, as it has been reported on in the mainstream press.

Michel and I discuss some ethical questions around these technologies and put them into the context of some larger issues around the war.

For more on Russia’s invasion of Ukraine, you can listen to the most recent episode of our Doorstep podcast with guest Melinda Haring of the Atlantic Council. And Michel has also done several podcasts on ethics and emerging technology. For all of that and more, you can go to carnegiecouncil.org.

For now, here’s my talk with Arthur Holland Michel.

Arthur Holland Michel, thank you so much for speaking with us today. I am glad we got to do this.

ARTHUR HOLLAND MICHEL: Yes, I am glad to be here.

ALEX WOODSON: I just want to start with a general question: From your perspective how is the war in Ukraine different when looking at emerging technology as compared to some recent wars like the wars in Libya and Syria? We know this is the first interstate war in Europe in over 70 years, but how is this different from a technology perspective in your view?

ARTHUR HOLLAND MICHEL: Things move so fast in the technology space that in a way every war is different. That being said, also every war is kind of the same in that they all abide by a certain set of what seem to be almost inviolable natural laws of warfare that do not seem to change regardless of what technology is being used, but we can get into that a little bit later when we talk about specific technologies.

I would say that one of the most significant things about this conflict is exactly what you alluded to, the fact that this is the first war in a long time that is being fought by two significant relatively advanced militaries engaging directly with each other instead of fighting by proxy. A lot of the technologies that are being deployed in this war have been deployed in previous conflicts, but in those cases they were generally deployed by a major power against a force that was much smaller in an asymmetric counterinsurgency-style conflict or by somewhat equivalent forces in a proxy-style war, Libya being a good example of that.

I guess another thing that is different is that there is a lot more public attention on this war than on other recent wars, and there has been in particular a lot of attention paid to the specific technologies that are being used. That is significant because these technologies—as we will see over and over in the course of this conversation I would imagine—have a compelling kind of media appeal to them. They draw a lot of attention, they spark interest, and that can actually have an effect on how the war as a whole is perceived.

ALEX WOODSON: That is definitely something I would like to touch on. I think one of the technologies that has been highlighted in this war—and I saw it highlighted especially a couple of months ago when Russian forces were moving out of the Kyiv area back to Russia after that offensive somewhat stalled and you started to hear about war crimes that were conducted in Bucha and places like that—is facial recognition technology, which became the subject of a lot of articles and a lot of focus.

Is that something new in this war? Have we seen this in other wars? Is deciding which soldiers are to blame using facial recognition technology something that is unique to this war, and how has that evolved over the last few months as atrocities have become maybe more clear as the war has changed?

ARTHUR HOLLAND MICHEL: Facial recognition technology in itself is not completely new. It has existed for a few years now, and during that time there have been a number of conflicts involving pretty sophisticated military powers.

It might be a stretch to say that this is the first conflict where facial recognition is being used or indeed is being used in this way. What again is different is that there has been a lot of media attention around the use of facial recognition, in part because some of the companies involved have actually wanted to promote the fact that their technology is being used in this war, as compared to other wars, where companies that are potentially providing these services probably have much less of an incentive to show off about what they are doing.

Certainly some of these use cases that Ukraine has claimed to be deploying facial recognition for are not necessarily what people first expected facial recognition would be used for in conflict. Initially a lot of the talk was about how it could be used to identify potential targets for attack, but in fact what Ukraine claims to be doing is using it, as many have seen, to identify soldiers who have been killed, injured, or captured, in some cases to then use the information about their identities to contact their families directly, and, as you said, to serve as a potential launch point for investigations into alleged crimes, be they war crimes or even cases, for example, of looting.

Ukraine has said that it has conducted, I think as of about a month ago, almost 14,000 or 15,000 of these searches. What we have less data on is the actual results of these investigations. Very, very early in the war they said that something like 580 families had been notified of the deaths of their soldier relatives, but that is outdated information, so it becomes a little bit of a black hole beyond those top-line numbers.

ALEX WOODSON: There is an interesting quote I saw in a Wired article from March, so it may be a little out of date. The CEO of Tactical Systems, a technology company, said: "The more these individuals"—I assume he's talking about Russian soldiers—"are publicly identified and know that the intelligence community is following their movements, the less chance they will commit war crimes."

This is a bit of a shift in terms of the topic. I am not sure if this is something you can speak to. I am not sure if anyone can speak to it, but it is just an interesting thought that these soldiers are fighting now knowing that their faces are out there and they can be identified, and then you hear talk about how morale is low among Russian troops as well. I just wonder if these facial recognition systems, if this technology environment in Ukraine, is having any effect on that level?

ARTHUR HOLLAND MICHEL: That is certainly part of the narrative that we hear time and time again. Indeed the CEO of Clearview, which is the most prominent company to offer facial recognition services to Ukraine, said something very similar in an interview with I think The Washington Post.

To be clear, this is not just a narrative that we have heard for facial recognition and other emerging surveillance and intelligence technologies in warfare; we have also heard it in the law enforcement space. For a long time people have claimed that if the population knew that there was facial recognition technology or other surveillance technologies in their city they would be less inclined to commit crimes. That is a claim that is not necessarily borne out by evidence. It would be a hard claim to prove in any kind of evidentiary way, even if there were some basis to it, but it also relies on a few critical and bold assumptions, one being that these alleged or would-be war criminals would be aware that these technologies are in use. It also rests on the assumption that they believe that the technologies would be effective. Many people, especially people who fight in wars and see the way emerging technologies often tend to break in warfare, would probably have some pretty good reasons to believe that facial recognition and other similar technologies would actually not be effective.

It also relies on the assumption that there would be consequences if they were caught, which is a big one in the case of a military organization that has certainly not shown internally any instruments of accountability to apply to soldiers who have committed these types of crimes, and the same can be said of the international community. That gets a little outside of the emerging technology conversation narrowly defined, but it is important to think about those factors when assessing claims about what any emerging technology can and cannot do in conflict, especially if it doesn't have a track record.

ALEX WOODSON: I think for a lot of these questions we have to wait until the end of the war maybe or when the fighting has stopped a little bit to see what their effects are.

I want to move on to some of the combat technologies, some of the technologies that we are actually seeing play out in these battles in Ukraine. One term that I have seen that I was not familiar with at all up until a few weeks ago was the term "loitering munitions." We have previously talked about drones, lethal autonomous weapons, and different types of technologies related to that. I was hoping you could explain exactly what loitering munitions are. I am not sure that is a term that a lot of people are familiar with outside of the intelligence community and outside the technology community. What are loitering munitions, and how are they being used in Ukraine?

ARTHUR HOLLAND MICHEL: A loitering munition is essentially a cross between a drone and a missile. It is a missile in the sense that it has an explosive warhead on it and its function is to fly into a target and destroy that target. It is a drone in the sense that unlike a missile, once it has been launched it can "loiter." That is the key term. It can fly over the battle space, in some cases for quite a long time, and during this time soldiers on the ground who have launched it can see what it is seeing through its camera, so they can assess the broader situation and can perhaps use that time to actually find the target that they are looking for. They may be receiving fire on the ground and know that the fire is coming from a general area but they don't know exactly where it is, so instead of firing a traditional missile into that general vicinity they can fly the loitering munition overhead, find it, take the time to identify it, and then fly it into that object, that tank, that structure, or that individual.

They have existed for quite a long time. I am talking maybe a decade or a decade and a half, maybe even longer if you talk about prototypes. They have also been used in conflicts. They were used to an extent in the war between Armenia and Azerbaijan, for example, as an anti-tank weapon, and they are coming up in discussions about the war in Ukraine because the United States has said that it is supplying a limited number of loitering munitions to Ukraine, a loitering munition called the Switchblade. There is also evidence that Russia has been using its own loitering munitions, again to a fairly limited degree, in operations on the ground.

ALEX WOODSON: That brings me to my next question, which is about lethal autonomous weapons. I am not as well-versed on this subject as you and maybe as a lot of people who may be listening to this podcast or a lot of people who come through Carnegie Council, but I know that there might be some questions about the definition of lethal autonomous weapons or "killer robots."

In a Foreign Policy article from May, it says, "Lethal autonomous weapons are here." They mentioned a Russian system, maybe the same one that you are talking about, the Kalashnikov Zala Aero KUB-BLA loitering munition, and then there is the Turkish-made Bayraktar drone. A lot of it probably just comes down to an individual's definitions or an organization's definitions, but do you consider these lethal autonomous weapons that are being used in Ukraine? You mentioned loitering munitions being used in the war between Azerbaijan and Armenia, but is the way loitering munitions and maybe lethal autonomous weapons being used now different than what we have seen in the past in other conflicts?

ARTHUR HOLLAND MICHEL: You are absolutely right that it all comes down to the question of how you define lethal autonomous weapons. All loitering munitions have some automated capability in the sense that they have some guidance systems in them. In some cases, for example, the soldiers on the ground who are operating the system see the target in the video feed, they click on that target, and then the loitering munition autonomously guides itself into that target.

That does not actually reach the threshold of a lethal autonomous weapon in the way that such weapons have been defined, for example, by the International Committee of the Red Cross, where to be a lethal autonomous weapon a system has to select and engage targets without human intervention. So it has to decide from a range of different options and choose a target or a set of targets and "decide" whether or not to attack them and how to do so.

So yes, a loitering munition in that sense would not be autonomous. There have been some claims that loitering munitions do have some autonomous capabilities that might reach the threshold of that definition. There has not been any hard evidence of that, certainly not from actual real-world uses.

The uses that we have seen in Ukraine probably would not necessitate anything that reaches the level of being a lethal autonomous weapon in the strict sense of the term. They are being used against the kinds of targets that they have been used against in previous wars, tanks being a predominant one, and radar sites. In the case of radar sites the loitering munition can actually be guided toward the target on the basis of the target's radar emissions. That, in a way, makes it even easier to guide the system to the target.

But we have not seen anything in the way of what would appear to be a loitering munition being given carte blanche to assess a range of different targets and to make again a "decision" as to whether to attack them.

I should also note just for the sake of our listeners—and this is something you see a lot in the discourse—that other drones like the Bayraktar TB2 that you mentioned are lumped into this broader category of autonomous weapons, but that is actually inaccurate. Drones like the Bayraktar are remotely controlled. They do carry weapons, but these weapons are launched by human operators, and their autonomous capabilities are fairly minimal. That is not to say that they haven't been an influential weapon in the war, but it is a distinction that is important if you are trying to be precise about where we are technologically in warfare.

ALEX WOODSON: We just recorded a podcast a couple of hours ago with Melinda Haring from the Atlantic Council. She was speaking on The Doorstep podcast with our senior fellows Nick Gvosdev and Tatiana Serafin. She gave a pretty grim assessment of the war, describing—we are in late June now—how Russia is just destroying large parts of Eastern Ukraine and trying to make it unlivable for Ukrainians.

Kind of a two-part question here: As the war has shifted to this war of attrition right now in the Donbas, which is what it looks like, how has the use of drones and some of these technologies changed—I am talking about how Ukraine is using them—and what effect do these drones have if any or even facial recognition technologies, some of these emerging technologies, if the strategy of Putin is just to destroy apartment blocks and infrastructure in Ukraine?

ARTHUR HOLLAND MICHEL: In a way, regardless of what Russia's strategy is, the question still stands: What is the effect of these technologies on this war or indeed on all wars? It is also important to consider how these technologies are changing war more broadly.

Not being exactly an expert on this, I am inclined to agree that, yes, this war is going to drag on and that the assessment is a grim one. But in terms of the technology itself it is important to keep in mind what I mentioned at the very beginning of this discussion, which is that the outsized attention that these technologies receive is not a reliable metric of the effect that they have on warfare. Drones, potentially autonomous loitering munitions, facial recognition, and other forms of artificial intelligence are exciting. They are interesting. They demand and receive attention in a way that other technologies like, say, anti-tank weapons or plain old artillery simply don't. That plays on the public's availability bias, leading people to think: Well, okay, these TB2 Bayraktar drones are in the news all the time. That must mean that Ukraine is using them constantly and that they are shifting the course of the war.

That is actually not necessarily true. Indeed Zelenskyy has said that the TB2 drone in particular should not merit as much attention as it is getting. Not in so many words, but he got a question about the TB2 drone and effectively said: "Forget about the TB2 drone. It's not what matters."

I would potentially go so far as to suggest that one could reach a similar conclusion about the other technologies we have talked about. Yes, they are being used; yes, they are having some effect; but on balance, in terms of technologies that could potentially shift the course of the war, they are not necessarily up there.

I imagine—again, assuming that Ukraine continues to receive these technologies—that they will continue to be used regardless of how the war evolves. As the tactical contours of the conflict change certainly different capabilities of different technologies become more or less valuable or important, but I cannot imagine a scenario where Russia is still in Ukraine and any one of these technologies just stops being used completely.

Indeed there was a fairly interesting development recently where it seems like Ukraine used a drone to attack an oil refinery inside Russia. It is not the first time that Ukraine has conducted an attack inside Russia, but it does seem to be one of the first times that it has done so with a drone. That gives a little window into how, if the conflict were to evolve into more of a standoff kind of conflict, drones might still play a role, but obviously in a different way than they are playing now. It is a little hard to predict the future.

Another thing I should add that is interesting is that I just taught a class on emerging technology and conflict last week at the Barcelona Institute of International Studies, and one of the things the students very quickly began to realize is that the adversary always has a response to the use of any emerging technology, that there is no emerging technology in recent memory where the adversary upon seeing it has thrown up their hands and said: "Oh, okay. This is just a new reality for us, and we are not going to do anything differently."

You can imagine in the cases of all of the technologies we have talked about that the other side has taken note and has sought to adjust its practices. Say in facial recognition maybe Russian soldiers are using more face coverings. In the case of loitering munitions maybe they are using jamming technology that can be used to potentially take these systems down out of the sky. They might take more care in hiding their positions because of the use of drones, and that obviously affects the balance of how effective these weapons are in warfare as well.

ALEX WOODSON: Definitely. One last aspect of drones that I wanted to mention—I'm not sure if this applies to all drones, but I know drones can take video footage as well—is that you can have footage of, let's say, Ukraine blowing up an oil refinery, and that can be used for propaganda purposes as well. I think we saw that a lot at the start of the war. Drones were being touted as almost a miracle for the Ukrainian military. I think that has shifted a bit, but it is another interesting aspect of this technology.

ARTHUR HOLLAND MICHEL: It goes back to that narrative around drones and other emerging technologies—especially drones, because they actually produce footage that can then be shared on social media, but other technologies too. They fit very well into this narrative of challenging the adversary and changing the tide of the war.

There was even a song made about the Bayraktar drone, and an animal at the Kyiv Zoo was named after it. That is not to say that these technologies aren't important. There is plenty of room to say that the narrative is important, that the narrative can have a really important effect on shaping perceptions of the war, on encouraging people, and on calling people to arms. It is all very useful for that. It can have tactical effects, but it is always important to put that asterisk next to emerging technologies when trying to assess how they are affecting the war on the ground.

ALEX WOODSON: Last question, just another general question to end with: As this war drags on, what are you looking at specifically? What are some of the areas of research within emerging technology, maybe other than the ones we have spoken about today, that you are looking at?

This has been a tough podcast to research just because there is so much information out there. You type "drones" into Twitter and into Google, and you get all these different sources. It is a bit overwhelming. What are some sources that you would recommend to listeners of this podcast who want to learn about these issues?

ARTHUR HOLLAND MICHEL: Yes, certainly. In terms of technologies that I will be thinking about, I am obviously going to continue to monitor all the technologies that we have talked about.

One technology that has already been used extensively and that will continue to play I think a pivotal role is open-source intelligence, that is, the investigation of incidents or of people based on information that is by and large freely available online through, for example, photos that are posted on social media. A lot of the attention on open-source intelligence has been directed toward this very large online community of amateur intelligence analysts who do all of this work out in the open. They post, share, and workshop their results and findings on platforms like Twitter.

Open-source intelligence is also increasingly becoming an important tool for intelligence agencies. You can be certain that whatever open-source intelligence work is being done on Twitter, for example, there are parallel efforts in Western intelligence agencies, and the results of that are in some cases likely to be passed on to Ukraine.

There has also been a lot of talk about the use of artificial intelligence (AI), for example, for analyzing satellite imagery, of which there is a lot around the war in Ukraine. The use of artificial intelligence can potentially speed that process up and allow agencies to analyze more information.

There are also other deployment possibilities for AI in the realm of intelligence analysis, for example, in fusing information and looking for patterns in big data. You can be certain that militaries, and certainly Western intelligence agencies, are using some of those technologies and potentially passing the information on to Ukraine. It may be that Ukraine is already deploying some of this technology itself or accepting the help of Western countries. The interesting thing about that is that unlike the other technologies we have talked about today, we are not likely to get a lot of information about it, because it's very, very secretive, and all the users want to keep it that way.

In terms of tips on reliable sources, I think the reliable media outlets have actually done a pretty good job so far. There has been a lot of great coverage, for example, from the Associated Press (AP), Reuters, and The Economist, which has an excellent defence editor with a sharp understanding of emerging technologies in conflict.

It is also valuable to go on Twitter just to see what people are saying. That is not necessarily going to be evidentiary information, but it is going to give you a sense of how the narrative is shaping up. I think that is valuable too because across the board—and again, this is a lesson that I shared with my students but that my students last week also reinforced back to me—you really have to do a lot of the analytical work yourself. Even if you are reading a story in the Times or the AP, you have to understand things that reporters are not necessarily going to talk about: how the adversary always has a countermeasure, and how it is often in the military's interest to overhype these technologies, to feed into that narrative, and to downplay their potential drawbacks.

Also, just deploy some common-sense reasoning around things like physics and geometry. If a drone, for example, relies on a remote-control connection to its operator, then that drone is not going to have more than at most a couple of hundred kilometers of range. If you see claims that Ukraine could use drones to penetrate deep within Russian soil and you know that those drones are radio controlled, doing your own little bit of reasoning and your own little bit of sleuthing can show you that that is not necessarily a credible claim, even though the reporters themselves may want it to be true.
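[Editor's note: As a back-of-the-envelope sketch of the kind of sanity check Michel describes, the snippet below uses only the standard line-of-sight radio-horizon approximation. The altitude and antenna-height figures are illustrative assumptions, not the specifications of any particular drone, and real control-link ranges are usually shorter once transmitter power, terrain, and jamming are factored in.]

```python
import math

def radio_horizon_km(drone_alt_m: float, antenna_height_m: float, k: float = 4.12) -> float:
    """Approximate line-of-sight radio horizon between a drone and a ground antenna.

    Standard approximation: d ≈ k * (sqrt(h1) + sqrt(h2)), with heights in meters
    and k ≈ 4.12 to account for typical atmospheric refraction. This is an upper
    bound on a direct radio link, not a guaranteed operating range.
    """
    return k * (math.sqrt(drone_alt_m) + math.sqrt(antenna_height_m))

# Illustrative (assumed) numbers: a drone cruising at 5,000 m, ground antenna at 10 m.
print(f"Upper bound on direct radio link: {radio_horizon_km(5000, 10):.0f} km")  # roughly 300 km
```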

I think those tips are even more important perhaps than thinking about specific outlets that you might want to follow.

Finally I would say do not forget about the traditional tools of warfare that are still playing a major role, things like artillery, small arms, rocket-propelled grenades or anti-tank weapons. That stuff is important too, and we should not lose sight of it just because there are drones in the sky.

ALEX WOODSON: Thank you very much. That is some great advice. I just want to reiterate that we have a recent Doorstep podcast that is about the more traditional aspects of this war and how that is going. It is always good to keep that in mind.

Arthur Holland Michel, thank you so much for speaking with us today.

ARTHUR HOLLAND MICHEL: It has been great. Thanks, Alex.

You may also like

JUN 28, 2022 - Podcast

The Doorstep: Responding to Putin's War of Attrition, with the Atlantic Council's Melinda Haring

As the Russian invasion of Ukraine enters its fifth month, the Atlantic Council's Melinda Haring returns to speak with "Doorstep" co-hosts Nick Gvosdev and Tatiana Serafin ...


APR 12, 2022 - Podcast

Surveillance Technology's Infinite Loop of Harms, with Chris Gilliard

In this discussion with Senior Fellow Arthur Holland Michel, Chris Gilliard explains why the arc of surveillance technology and novel AI bends toward failures that ...


APR 20, 2020 - Podcast

Ethics, Surveillance, and the Coronavirus Pandemic, with Arthur Holland Michel

As U.S. states and European nations contemplate how to end COVID-19 lockdowns, Senior Fellow Arthur Holland Michel analyzes all aspects ...

JUN 26, 2019 - Podcast

Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All, with Arthur Holland Michel

Arthur Holland Michel, founder of the Center for the Study of the Drone, traces the development of the Pentagon's Gorgon Stare, one of the ...


APR 21, 2021 - Podcast

The Social Limits of AI Ethics

In recent years, the debate over "AI ethics" has succeeded in mainstreaming key principles to limit the risks that would otherwise arise from the ...


DEC 17, 2020 - Podcast

The Technical Limits of AI Ethics

In recent years, the global debate over "AI ethics" has succeeded in integrating key principles to limit the risks that would otherwise arise from the ...