GENEVA — Experts from scores of countries are meeting to discuss ways to define and deal with “killer robots” — futuristic weapons systems that could conduct war without human intervention.
The weeklong gathering that opened Monday is the second of its kind this year at U.N. offices in Geneva to focus on such lethal autonomous weapons systems and explore ways of possibly regulating them, among other issues.
Fully autonomous, computer-controlled weapons do not yet exist, U.N. officials say. The debate is still in its infancy, and the experts have at times grappled with basic definitions. The United States has argued that it is premature to establish a definition of such systems, much less regulate them.
Some top advocacy groups say governments and militaries should be prevented from developing such systems, which have sparked fears and led some critics to envisage harrowing scenarios about their use.
As the meeting opened Monday, Amnesty International urged countries to work toward a ban.
Rasha Abdul Rahim, an Amnesty International researcher on artificial intelligence, said killer robots are “no longer the stuff of science fiction,” warning that technological advances are outpacing international law.
Part of the trouble for campaigners, however, is that the U.N.-supported body works by consensus, meaning that any single country, including the big military powers, could scuttle efforts to reach an international ban.
Amandeep Gill, a former Indian ambassador to the U.N.-backed Conference on Disarmament who is chairing the meeting, said progress is being made. He summarized three general camps of countries: one seeks a formal, legal ban on such weapons; another wants a political but non-binding agreement; and a third wants no changes at all.
“We are coming closer to an agreement on what should be the guiding principles — guiding the behavior of states, and guiding the development and deployment of such systems around the world,” Gill told reporters after the meeting opened on Monday. “And this is not an insignificant outcome.”