A dangerous first! UN report describes a drone autonomously hunting humans

Kargu-2 drone made by STM, a Turkish state-run military technology company. (Photo from STM's official website)

A United Nations (UN) report states that a Turkish-made military drone hunted and attacked humans on its own during a battle, in what may be the first known case of a drone actively hunting and killing humans.

The UN Security Council’s Panel of Experts on Libya published a report in March this year describing what may be the first case of a military drone attacking humans on its own.

The report noted that the drone in question was a Kargu-2 manufactured by Turkish military technology company STM, and that it was used in March 2020 in a conflict between Libyan government forces and the Libyan National Army (LNA) led by warlord Khalifa Haftar.

The report describes how the LNA retreated after being “pursued” by the drone, noting that the lethal weapon system was designed to attack targets directly without a data link to the operator, suggesting that the drone would attack humans autonomously.

The report noted that the Kargu-2 uses computer vision to select and attack targets without a connection to a remote operator, and that it carries an explosive payload, allowing it to carry out suicide attacks by detonating on impact with a target.

The report does not mention whether any military personnel were killed in the attack, but it does describe the Kargu-2 as a lethal weapons system, suggesting that there may have been casualties.

Zachary Kallenborn, a researcher with the National Consortium for the Study of Terrorism and Responses to Terrorism, said this could be the first case of a drone autonomously attacking a human: “this represents a new chapter in autonomous weapons that use artificial intelligence (AI) to fight and kill humans.”

One concern raised by this case is that the deployment of truly autonomous drones represents a military revolution comparable to the introduction of guns and aircraft, because unlike nuclear weapons, almost any military can easily acquire such weapons.

Another concern is that artificial intelligence will not always interpret visual data correctly. As Kallenborn put it, “How fragile are object recognition systems? How often are targets misidentified?”