AI on the Battlefield: Addressing Ethical and Legal Concerns

25 October 2021 • 
Event
On 20 October 2021, the VCDNP held the fourth webinar in the DET Series, focused on the ethical and legal concerns related to artificial intelligence, featuring remarks by Ambassador Thomas Hajnoczi, Dr. Frank Sauer, and Ms. Laura Bruun.

On 20 October 2021, the Vienna Center for Disarmament and Non-Proliferation (VCDNP) held the fourth webinar in the Deterrence and Emerging Technologies (DET) Webinar Series, devoted to the ethical and legal considerations surrounding the military application of artificial intelligence (AI).

The panel of speakers included Ambassador Thomas Hajnoczi (Former Director for Disarmament, Arms Control and Non-Proliferation, Federal Ministry for European and International Affairs of the Republic of Austria), Dr. Frank Sauer (Senior Research Fellow, Universität der Bundeswehr München) and Ms. Laura Bruun (Research Assistant, Emerging Technologies, Stockholm International Peace Research Institute). Ms. Mara Zarka, VCDNP Research Associate and Project/Events Manager, moderated the webinar.

Dr. Frank Sauer, Ambassador Thomas Hajnoczi, Ms. Laura Bruun and Ms. Mara Zarka

Ambassador Hajnoczi set the scene for the session by providing an overview of the rules and principles of international humanitarian law (IHL) applicable to any weapon system, including those in which AI is used. He stressed that any use of new weapon systems must comply with the three fundamental principles of IHL – distinction, proportionality and precaution. Importantly, compliance with these principles necessitates human judgement and must be assessed in context, as circumstances on the battlefield can change rapidly.

Ambassador Thomas Hajnoczi

Ambassador Hajnoczi drew attention to the ethical dimension of the AI issue. He underscored the "black box" nature of AI, in which humans cannot predict a system's behavior, and the fact that responsibility and accountability – intrinsically human attributes – cannot be delegated to machines.

He outlined the current state of play in the development of fora and legal instruments to address the military application of AI. The progress made by the Group of Governmental Experts (GGE) operating under the Convention on Certain Conventional Weapons (CCW) in Geneva is important but limited by the procedural arrangements in place: the rule of consensus allows only for lowest-common-denominator outcomes. Ambassador Hajnoczi stressed the need to develop legal norms regulating autonomous systems without delay. He concluded his remarks by stating that not everything that becomes technically possible should be allowed.

Dr. Frank Sauer addressed four guiding questions on the legal and ethical aspects of the military application of AI. Answering the first question – whether it is possible to exclude humans from the prosecution of war – he refuted the myths of "robot wars" and "clean wars," dismissing them as unrealistic expectations of a bloodless war. However, the quest for cost effectiveness and military effectiveness has a considerable impact on military operations, allowing militaries, inter alia, to take soldiers out of harm's way.

Dr. Frank Sauer

Speaking about scenarios and contingencies in which the exclusion of humans from the prosecution of war may become desirable, Dr. Sauer noted that, unlike remotely operated weapons, which already take the operator out of harm's way, attention must turn to scenarios that lend autonomy to the system. Autonomy enables weapon systems to continue operating in circumstances where remotely operated systems could not, such as when communication is degraded or denied. Autonomy also yields much swifter reaction times, as it removes the unavoidable delay between a remote operator's command and the system's response. He indicated that completing the targeting cycle at machine speed is the most important driver of autonomous weapon systems that remove humans from the targeting loop entirely.

The third question addressed by Dr. Sauer was whether humans can control drones and robots. He argued that meaningful human control must not be confused with direct control. The assertion of human control over a weapon system should be treated as the operationalization of control "by design" and "in use". Elaborating on this, he noted that a system needs to be set up so that its performance can always be traced back to a human agent, and that the human operator should be able to foresee the weapon's effects on the battlefield, especially when targets are selected and engaged. Dr. Sauer reiterated the importance of operational context, stressing that some situations require much greater human involvement than others.

Answering the final question, about the reliability of the communication links that connect humans and command centers with drones and robots on the battlefield, Dr. Sauer illustrated that there is no one-size-fits-all solution. Some systems performing the critical functions of target selection and engagement may be regarded as under meaningful human control "by design" and "in use" – for example, the combat direction system of a navy frigate operating autonomously for brief periods of time in an uncluttered environment. In such cases, a human does not need to be constantly connected to and controlling the system for it to be deemed under meaningful human control. This does not apply to other systems, however, such as a gun turret on a robotic tank in an urban environment, where every single step must be controlled by a human for the system to be considered under human control.

Dr. Sauer argued that the ethical, legal and political risks of not retaining meaningful human control over weapon systems will outweigh their military benefits. He concluded his presentation by stating that as technology grows in both capability and opportunity, the obligation to use that technology in accordance with the fundamental principles of IHL grows equally.

In her remarks, Ms. Bruun focused on how to ensure compliance with IHL when introducing AI on the battlefield. She briefly outlined the principles and rules of IHL – specific and general rules on means and methods of warfare and rules guiding the conduct of hostilities. She argued that compliance with IHL requires the fulfillment of three conditions: first, the ability to reliably foresee the operation, behavior and effects of a weapon system; second, the ability to administer the operation of a weapon system in a manner that respects the rules of armed conflict; and finally, the ability to trace the operation and effects of the system back to the relevant human agent.

Ms. Laura Bruun

Ms. Bruun pointed out that IHL compliance applies to humans, not machines. However, AI complicates IHL compliance, since its use comes with a higher degree of unpredictability and a lack of explainability. In addition, it is unclear what type and degree of human-machine interaction is required to ensure IHL compliance; the answer depends on whether IHL is considered an effects-based or a process-based regime. To assist in determining IHL compliance and in deliberating what tasks can be delegated to machines, Ms. Bruun outlined several guiding questions to be considered.

She argued that, in order to build lawful rules around AI on the battlefield, further clarification of what IHL requires from humans and permits from machines is needed.

The full recording of the webinar can be found below:



