While artificial intelligence (AI) and its application in unmanned aerial vehicles, or drones, can enable new and frightening forms of warfighting, VCDNP Senior Fellow Angela Kane pointed out in a recent interview with the Goethe Institute that there are two sides to this technological coin.
Information and communications technology has been on the UN's agenda since 1998, and the practical applications of AI have only grown since then. The most prominent example is in warfighting, where drones have been weaponized as lethal autonomous weapons systems (LAWS), also known as "killer robots." One problem Ms. Kane notes with the use of LAWS in war is their potential to let those controlling them dissociate from their actions and decide on the use of force with greater ease. Even when such decisions are made with the best of intentions, they remain subject to the operator's biases. "Though the drone is not emotional," she observed, "the person controlling it is."
In this regard, Ms. Kane stressed the urgent need for regulation in the AI space, particularly as it applies to LAWS. Past attempts to establish regulations in this field, however, have yielded little progress in negotiations, and current efforts in the UN context have exposed the divisions in the international community on this issue. Ms. Kane expressed concern that efforts to regulate this technology have been overtaken by the development of the technology itself.
Less famously, however, AI and drones are also used in UN peacekeeping operations, demonstrating that the technology has peaceful applications as well. These applications, Ms. Kane argues, should be expanded. She recalled an example from 2014, when a massacre took place merely nine kilometers away from UN peacekeepers, who were unaware of the incident because the region had no roads. It was this event that prompted the UN to begin using drones for monitoring in difficult-to-access areas.
Asked whether regulation in the AI space should be pursued in terms of peace and security or in terms of human rights, Ms. Kane responded that it should be pursued along both avenues. "We need the political will to make progress in this area," she said. "AI and peace are not aspects the industry focuses on. They can lead the development, but they are also developing whatever they require and want for military purposes."