This week at the United Nations in Geneva, faith leaders expressed concerns about lethal autonomous weapons systems.
The discussions took place during the meeting of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, held from August 26 to 30, 2024.
The World Council of Churches (WCC), a member of the Stop Killer Robots coalition since 2019, stressed the need for human control over military technologies, according to a report by Ecumenical News.
In the dialogue, the group emphasized the value of human life and the ethical problems raised by allowing machines to make life-or-death decisions.
Rev. Kolade Fadahunsi, a member of the Methodist Church in Nigeria who serves on the WCC’s Commission of the Churches on International Affairs, highlighted the moral dilemmas posed by autonomous weapons.
Speaking at a WCC-hosted event, he referenced the Biblical commandment “You shall not kill,” and raised questions about accountability if a machine were to kill without human oversight.
Representatives of Islam, the Japanese Buddhist movement Soka Gakkai, and the Bahá’í faith also took part, underscoring a shared respect for human life and the necessity of protecting it.
Jennifer Philpot-Nissen, WCC program executive for Human Rights and Disarmament, noted the coalition’s stance on the ethical implications of reducing humans to numerical values in warfare.
Raza Shah Khan, chief executive of the Sustainable Peace and Development Organization in Pakistan, remarked on the inability of autonomous systems to comprehend human values and dignity in conflict.
Simin Fahandej from the Bahá’í International Community’s UN Office called for global cooperation to ensure technological advancements align with fundamental human values.
These discussions reinforce UN Secretary-General António Guterres’ earlier call for nations to negotiate a treaty by 2026 to ban systems operating without human oversight and to ensure compliance with international humanitarian law.