Some of today’s military technologies raise political, ethical and legal challenges both for the armed forces and for society at large. These challenges require that civil society and political actors join the military in debating what constitutes a just use of technologies such as armed drones and other robotic systems. In a webinar organized on June 9th, 2021 by the Peace Research Institute Oslo (PRIO) and the European Forum on Armed Drones (EFAD), experts discussed the development, use and proliferation of these technologies, with a particular focus on the Norwegian position in this debate.

The participants included Ilaria Carrozza, Senior Researcher at the Peace Research Institute Oslo (PRIO); Gerald Folkvord, Political Adviser at Amnesty International Norway; Greg Reichberg, Research Professor at PRIO; Henrik Syse, Senior Researcher at PRIO; and Richard Moyes, Managing Director at Article 36.

In her talk, Ilaria Carrozza provided an overview of ongoing technological developments in Russia and China, warning that these could be used with adversarial intent and should therefore be carefully monitored by Norway in the years to come. She explained how Russia has thus far focused in particular on developing artificial intelligence as a tool of information warfare – including AI-based drones and other unmanned systems – with a more limited role played by the private sector. China, on the other hand, has adopted a “fusion” strategy of integrating the military and civilian sectors, which may give the country an advantage in the development of drones and other new military technologies.

Gerald Folkvord voiced Amnesty’s concerns over the wider human rights impacts of the development of automated weapons and the digitalization of warfare. He outlined how these developments fit into a broader trend of questionable interpretations of international law, the outsourcing of warfare to private contractors and mercenaries, and the militarization of state measures and police tasks, as well as serious obstacles to accountability for the human rights abuses arising from these practices. He warned that the further automation of weapons will only increase the likelihood of armed conflict and make accountability efforts even more challenging. Finally, he briefly touched upon the revised European Union (EU) regulations on the export of dual-use equipment, explaining that while the revision constitutes an important step, it does not adequately cover everything, including surveillance technology.

Greg Reichberg spoke about the ongoing discussions on Autonomous Weapons Systems (AWS), which until now have mostly taken place in expert groups under the United Nations (UN) Convention on Certain Conventional Weapons (CCW). While some states have come out in favour of a ban on autonomous weapon systems, Norway has not yet taken a position on the issue. There has been considerable debate over the definition of some key terms, including what exactly is meant by “autonomy”, “autonomous targeting” and “meaningful human control”. The ambiguity surrounding these terms has, among other things, led some states to refer to autonomous weapon systems as emerging military technologies that should be banned to prevent future use, whereas others maintain that such systems have already existed for several decades. Nevertheless, in Greg Reichberg’s view, these discussions could actually serve as a stimulus towards a shared understanding of the definitions.

Henrik Syse’s presentation focused on some of the key terms and ambitions outlined in the strategic recommendations on artificial intelligence issued by the Norwegian government in 2020. He stressed the importance of explainability – in particular to ensure the trustworthiness of AI systems – as well as the need to put in place responsibility and accountability mechanisms for any unlawful acts arising from the use of these systems. He also mentioned the draft of the new ethical guidelines for the Norwegian Government Pension Fund currently being discussed in parliament, which recommends that companies investing in lethal autonomous weapon systems be excluded from the fund. The Norwegian government has thus far responded that it considers a total exclusion too broad, since it does not clearly define what would fall under this category. According to Henrik Syse, this lack of clarity about boundaries and definitions also constitutes an important challenge for the “ban killer robots” movement.

Finally, Richard Moyes reflected further on the autonomous weapons discussion taking place within the CCW. Echoing earlier remarks about the lack of fixed meanings and the differing terminology used in this discussion, he noted that the high level of civil society engagement, media interest and diplomatic talks, as well as the large body of policy literature on the issue, appears to have contributed to some of this confusion. He nevertheless considered the development of a legal instrument very likely within the next five years. He added that this process would benefit greatly from Norway taking a strong position on autonomous weapons before the next review conference; thus far, the country has adopted a quiet and non-committal stance.

Missed the webinar? Watch a recording of the session here:

