On 20 April 2018, the VCDNP, in collaboration with the International Institute for Peace (IIP), organized a seminar entitled "Rapidly Emerging Technologies: What are the Ethical and Legal Challenges?" that brought together experts in the fields of artificial intelligence (AI), machine learning and predictive applications. In the aftermath of the Facebook/Cambridge Analytica scandal, which revealed the political misuse of Big Data, many began to question how much further advanced technology could go and where the moral and ethical boundaries of its development lie. Can artificial intelligence improve the world in which we live? Do emerging technologies pose a threat to society, or do they offer viable solutions to current challenges?
Moderated by VCDNP Senior Fellow Angela Kane, the speakers, Sean Leggasick, Co-Lead of the Ethics and Society Group at DeepMind, and Jane Zavalishina, President and co-founder of Mechanica AI, discussed the possible impacts of rapidly emerging technologies on society, ethics and law. VCDNP Executive Director Laura Rockwood and IIP President Dr. Hannes Swoboda opened the seminar with remarks on the current security challenges associated with the development of new technologies.
During the question and answer period, an engaged audience inquired about the possibility of losing human intellectual capacity as a result of excessive reliance on artificial intelligence. Both speakers expressed optimistic views regarding AI implementation. Ms. Zavalishina stressed that AI is a knowledge-based model and that the human factor in AI will therefore always be necessary. Mr. Leggasick added that emerging technologies can serve as a tool for maintaining human capacity rather than a threat to it. Questions were also raised about the use of AI in autonomous weapons and the possibility of unintended consequences arising from the use of emerging technologies. Mr. Leggasick emphasized that, in his opinion, machines should not be given any capacity to identify targets, at least not human ones.