Out of the loop: Humanity quickens towards autonomous drones

Saker Scout drone. 2023. Photo credits: Ministry of Defense of Ukraine

Human intervention may have been quietly removed from the loop in the Ukraine war with the approved introduction of independently targeting, artificial intelligence-assisted kamikaze drones.

Late last year, the Ministry of Defence of Ukraine approved the use of artificial intelligence-assisted Saker Scout unmanned aerial drones for frontline combat missions in the country’s fight against Russian forces.

Each drone, with a flight range of 10 kilometres, is reportedly able to independently recognise and record the coordinates of enemy equipment (including camouflaged vehicles) using advanced optical and infrared sensors; the coordinates are then immediately transmitted to a command post for the appropriate decision.
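
For readers unfamiliar with how such a reconnaissance workflow fits together, the sketch below shows the general shape of a detect, geolocate and report loop in Python. It is purely illustrative: the placeholder detector, the offset estimates, and names such as detect_objects and report_to_command_post are assumptions made for the example, not details of the Saker software, and the engagement decision remains with a human at the command post, as the article describes.

```python
# Illustrative sketch only: detect candidate equipment, geolocate the sighting,
# and forward it to a command post where a human makes the decision.
# All function and class names here are hypothetical placeholders.
import math
from dataclasses import dataclass


@dataclass
class Detection:
    label: str              # e.g. "armoured vehicle" (assumed class name)
    offset_north_m: float   # estimated ground offset from the drone, metres
    offset_east_m: float


def detect_objects(frame) -> list[Detection]:
    """Placeholder for an onboard optical/infrared recognition model."""
    return [Detection("armoured vehicle", offset_north_m=120.0, offset_east_m=-45.0)]


def geolocate(drone_lat: float, drone_lon: float, det: Detection) -> tuple[float, float]:
    """Convert a metre offset from the drone's position into rough WGS-84 coordinates."""
    lat = drone_lat + det.offset_north_m / 111_320.0
    lon = drone_lon + det.offset_east_m / (111_320.0 * math.cos(math.radians(drone_lat)))
    return lat, lon


def report_to_command_post(label: str, lat: float, lon: float) -> None:
    """Stand-in for the datalink; the decision to act stays with a human operator."""
    print(f"SIGHTING: {label} at {lat:.5f}, {lon:.5f} -- awaiting operator decision")


if __name__ == "__main__":
    drone_lat, drone_lon = 48.5000, 35.0000   # illustrative position only
    frame = None                              # stand-in for a camera frame
    for det in detect_objects(frame):
        lat, lon = geolocate(drone_lat, drone_lon, det)
        report_to_command_post(det.label, lat, lon)
```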


It can also rely on an inertial guidance system to resist electronic warfare interference, and it integrates with situational awareness systems currently in service with Ukrainian forces.
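
As a rough illustration of the inertial fallback idea, the sketch below propagates a position estimate by dead reckoning from the last trusted fix using heading and speed. This is a simplification of true inertial navigation (which integrates accelerometer and gyroscope data), and every name and number in it is an assumption made for the example, not a detail of the Saker system.

```python
# Illustrative dead-reckoning fallback: when satellite navigation or the control
# link is jammed, position is propagated from the last good fix using heading
# and speed rather than external signals.
import math
from dataclasses import dataclass


@dataclass
class NavState:
    north_m: float   # metres north of the last good fix
    east_m: float    # metres east of the last good fix


def dead_reckon(state: NavState, heading_deg: float, speed_mps: float, dt_s: float) -> NavState:
    """Propagate the position estimate one time step from heading and speed."""
    heading = math.radians(heading_deg)
    return NavState(
        north_m=state.north_m + speed_mps * math.cos(heading) * dt_s,
        east_m=state.east_m + speed_mps * math.sin(heading) * dt_s,
    )


if __name__ == "__main__":
    state = NavState(0.0, 0.0)   # last trusted fix
    gnss_jammed = True           # e.g. flagged by signal-quality checks (assumed)
    for _ in range(60):          # one minute of flight at 1 Hz
        if gnss_jammed:
            state = dead_reckon(state, heading_deg=90.0, speed_mps=20.0, dt_s=1.0)
    print(f"estimated offset from last fix: {state.north_m:.0f} m N, {state.east_m:.0f} m E")
```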

“The Saker Scout drone with artificial intelligence has been approved by the Ministry of Defence for use in the Armed Forces,” the Ministry of Defence said (translated by Google Translate).

“The Saker software, built on artificial intelligence algorithms, will help our troops beat the enemy more effectively.

“The complex includes the flagship reconnaissance drone, as well as several kamikaze drones of the first-person-view type, which are adjusted, including with the help of the flagship drone.”

The possible introduction of “memorised target” kamikaze drones, which can identify and select their own targets, could allow troops to strike opposing forces at vastly increased ranges (without return-trip or control-link limitations) and behind enemy lines (where electronic warfare interference is present), and could present new opportunities in the development of advantageous “fire-and-forget” drone weaponry.

The use of “fire-and-forget” drone munitions was publicly reported as early as 2021, in a United Nations Security Council report published in March of that year.

The report detailed an armed conflict in 2020 between Libya’s interim Government of National Accord and armed groups affiliated with Libyan military officer Khalifa Haftar (Haftar-affiliated forces, or HAF).

During the fighting, the UN reported that unmanned combat aerial vehicles and lethal autonomous weapons systems such as the STM Kargu-2 (a rotary-wing loitering munition) were deployed to hunt down and remotely engage logistics convoys and retreating HAF units.

“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition,” the report said.

“In effect, a true ‘fire, forget and find’ capability.”

Fighting the inevitable tide of drone advancement are a number of special interest groups calling for the rapid introduction of new international laws to restrict and stem the adoption of autonomy in weapons systems.

Groups such as Stop Killer Robots, established in 2013 and operating globally with more than 180 member organisations, advocate for legal safeguards to ensure meaningful human control is retained over the use of force.

The group argues that such technology would lead to digital dehumanisation, involving the automatic profiling, pattern matching, and processing of human beings as data. It argues that autonomous systems would exacerbate existing inequality, reduce human control over the consequences of their use, eliminate meaningful judgement and accountability, lower the threshold for war, and lead to a destabilising arms race.

However, this approach relies primarily on broad acceptance of voluntary policing and technological restraint by individual countries and corporations, and it runs up against the relatively low development cost and the competitive advantage such systems inevitably offer.

In December last year, the group reported possible instances of automated targeting systems being used in the ongoing fighting in the Gaza Strip.

“Stop Killer Robots is deeply concerned with reports of the use of automated targeting systems in the Gaza Strip,” a group spokesperson said.

“While the ‘Habsora’ (artificial intelligence-based targeting) system is not an autonomous weapon, its reported use by Israel in the Gaza Strip raises grave concerns over the increasing use of autonomy in conflict, digital dehumanisation, artificial intelligence, automation bias, and human control in the use of force.

“According to the Israeli Defense Forces, ‘Habsora’ purportedly can use AI and ‘rapid and automatic extraction of updated intelligence’ to generate recommended targets. Targeting systems that employ automated decision-making and AI technologies present serious concerns and are part of a worrying trend toward the deployment of autonomous weapons systems.

“The use of AI-based systems to ‘produce targets at a fast pace’ represents a grave example of digital dehumanisation raising serious concerns around the violation of human dignity and compliance with international humanitarian law. Additionally, the potential reduction of people to data points based on specific characteristics like ethnicity, gender, weight, gait, etc raises serious questions about how target profiles are created and in turn how targets are selected.

“There are also serious risks that humans may become over-reliant on automated systems wherein targeting recommendations are produced and acted upon by human operators without understanding of how the recommendations were generated.

“With the UN Secretary-General, the International Committee of the Red Cross, and more than 100 countries having called for a legal instrument on autonomous weapons systems, there is an urgent need for clear prohibitions and regulations on autonomy in weapons systems in international law.”
