The danger of autonomous drones is real. According to a United Nations Security Council report, a military drone may have autonomously attacked humans in Libya last year, for the first time. No details of the operation, or whether there were casualties, have been disclosed, but the report confirms that these lethal autonomous weapons are already being used on the battlefield despite international efforts to ban them.
A Turkish-made STM Kargu-2 drone was reportedly used to pursue and remotely attack the retreating soldiers, according to the report. The use of these drones in the Libyan conflict may open a new chapter for autonomous weapons: tools programmed so that the AI decides on its own to eliminate its targets, including humans.
This is the Kargu-2, an autonomous drone prepared to kill
In June 2020, the Turkish army purchased about 500 Kargu drones from Defense Technologies Engineering and Trade Inc., better known as STM, a Turkish arms company. The first generation of these drones was presented in 2017, and in 2019 the Kargu-2 variant was unveiled, capable of carrying out swarm attacks and operating autonomously.
Kargu can be translated from Turkish as "hawk". The company explains that the drones are designed for "asymmetric warfare and the fight against terrorism". Weighing about 7 kg, the drone can stay in the air for at least 30 minutes and fly at about 145 km/h, figures that improved in the second generation.
The United Nations Security Council report describes the event as follows:
"Logistical convoys and retreating Haftar-affiliated forces were subsequently hunted down and remotely engaged by unmanned combat aerial vehicles or lethal autonomous weapons systems such as the STM Kargu-2 and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true 'fire, forget and find' capability. Unmanned combat aerial vehicles and the small drone intelligence, surveillance and reconnaissance capabilities available to Haftar-affiliated forces were neutralized by electronic jamming from the Koral electronic warfare system."
Evidence found by the United Nations indicates that STM Kargu-2 units were used and are operational in Libya. This is the first time their use has been detected, and their deployment constitutes a breach of paragraph 9 of resolution 1970 (2011).
Source: United Nations Security Council
The drones of the Kargu series can be operated manually, but they are also prepared to operate autonomously thanks to sensors ranging from electro-optical and infrared video cameras to a laser imaging system (LIDAR).
Through the KERKES program, STM prepared its drones to coordinate autonomously in swarms and navigate via GPS. In swarms of up to 20 units, Kargu-2 drones are programmed with an AI capable of selecting and attacking targets.
So far no casualties from one of these autonomous weapons have been reported (this would be the first), but STM's demonstration videos show these drones attacking a group of mannequins.
How an autonomous drone decides who to attack
Antipersonnel mines can be configured to adjust their sensitivity so they detonate only under the weight of an adult. With autonomous drones, there is still much debate about which specific parameters determine when they strike.
These drones' machine-learning models are trained on large datasets that allow them to differentiate objects, including vehicles such as tanks, buses or vans. Presumably, armies can train a drone's AI to focus on the particular traits of the enemy, feeding it databases with images and information about those they wish to kill.
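To make the idea concrete, here is a purely illustrative toy sketch of how a classifier trained on labeled examples can assign a new observation to a known category. It uses a simple nearest-centroid rule on made-up two-number feature vectors; the labels and data are hypothetical, and nothing here reflects STM's actual software, which has not been disclosed.

```python
# Toy nearest-centroid classifier: an illustration of how a model
# trained on labeled feature vectors can categorize new observations.
# All data and labels below are invented for the example.

def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(features, centroids):
    """Return the label whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Hypothetical "training set": a few feature vectors per object class.
training = {
    "tank": [[0.9, 0.1], [0.8, 0.2]],
    "van":  [[0.2, 0.9], [0.3, 0.8]],
}
centroids = {label: centroid(vs) for label, vs in training.items()}

print(classify([0.85, 0.15], centroids))  # closest to the "tank" centroid
```

Real systems use deep neural networks over camera imagery rather than two-number vectors, but the principle is the same: the model can only distinguish what its training data distinguishes, which is exactly where biased or incomplete datasets become dangerous.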
However, just as we have seen that algorithms carry many biases and are not without flaws, these autonomous drones can also go wrong, with fatal consequences.
Experts and organizations ask to ban these autonomous weapons before it is too late
The European Union has warned against these killer robots. The United States and Russia are also aware of the implications of these LAWs (lethal autonomous weapons). However, both powers, supported mainly by South Korea, Israel and Australia, have blocked negotiations to ban them.
The United Nations has sought to ban them since late 2013, but there has been little progress. During this time, figures such as Brad Smith, president of Microsoft, Elon Musk, CEO of Tesla, and the founders of DeepMind, along with a total of 116 experts from 26 countries, have requested that the development and use of autonomous killer robots be banned, because "once this Pandora's box is opened, it will be hard to close."
In Engadget | China Warns: Danger of Accidental War Over “Smart Weapons” Is Real and Growing