Fears of ‘killer robots’ that use AI to choose human targets grow as Russia-Ukraine war intensifies
AUTONOMOUS robots have transformed modern warfare as nations race to develop technologies that fire on targets without human input, leaving experts alarmed.
AI-controlled weapons could be more destructive than nukes and mistake civilians for combatants, experts warn.
An AI system powering a drone could scan a battlefield and select targets for destruction, like a gun that aims and fires itself.
Autonomous weapons could reduce the amount of risk human soldiers are exposed to - providing an obvious strategic benefit for a country at war.
But James Dawes, an expert on the weaponization of AI, wrote a harsh review of autonomous weapons:
"When selecting a target, will autonomous weapons be able to distinguish between hostile soldiers and 12-year-olds playing with toy guns? Between civilians fleeing a conflict site and insurgents making a tactical retreat?"
Worse yet, a robot cannot be held accountable for mistakes in battle - the charge of responsibility has nowhere to go, Dawes argues.
Fortunately, killer robots do not appear to have been widely utilized yet, and there are no confirmed deaths caused by an autonomous weapon.
But a UN report last year said an autonomous killer drone was deployed on a battlefield in March 2020 during the Second Libyan Civil War.
The battle was between the UN-recognized Government of National Accord and the forces of army general Khalifa Haftar.
The UN report stated: "The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition."
It is unknown if the drones killed anyone.
An Obama-era policy position on autonomous weapons set such strict parameters that no killer robots have been submitted for review, according to reports.
But other nations have been less restrictive.
A US intelligence report found that Russia has more than 150 AI-powered military systems in different stages of development.
"The Russian military seeks to be a leader in weaponizing AI technology," an intelligence officer said.
Russia boycotted a February 2022 conference on autonomous weapons regulation and will abstain from discussions continuing this month.
To experts, the most likely trigger of an autonomous weapons arms race would be the use of killer robots against the Ukrainians.
“I can guarantee, if Russia deploys these weapons, some people in the US government will ask ‘do we now, or will we later, need comparable capabilities for effective deterrence?’,” diplomacy expert Gregory Allen told New Scientist.
Attempts to regulate autonomous weapons have hit walls put up by the nations developing them, even as human rights organizations plead for a ban.
Meanwhile, a survey conducted by the Campaign to Stop Killer Robots found public opposition to the use of lethal autonomous weapons.