Weaponized AI is the future of warfare

Weaponized AI is expected to be the dominant weapon in the coming wars.


The shift has already begun, as the world's two largest armies, those of America and Russia, are one step away from the final phase. In an announcement on 25 January 2023, the US Army stated that it is focusing on autonomous AI weapons.

Similarly, NATO released an implementation plan aimed at maintaining the alliance's "cutting-edge technology" in weapons dubbed "killer robots".

Both announcements reflect a critical lesson learned by soldiers around the world from recent military operations in Ukraine and Nagorno-Karabakh: Weaponized AI is the future of warfare.

The military sees strategic value in loitering munitions. These weapons, a cross between a bomb and a drone (ground-based or flying), can hover for long periods while waiting for a target. For now, such semi-autonomous munitions keep a human in control of key decisions, but soon they will be able to decide for themselves.

As casualties mount in Ukraine on both sides, so does the push for fully autonomous weapons—robots that can select, hunt and attack their targets on their own, without the need for human supervision.

This month, one Russian manufacturer announced its plans to develop a new combat version of an unmanned ground vehicle to augment existing forces in Ukraine. Fully autonomous drones are already in use for the defense of Ukrainian energy facilities from other drones.

Wahid Nawabi, CEO of the American company AeroVironment Inc, which works with the country's defense ministry and manufactures the semi-autonomous Switchblade drone, has said that the technology to make these weapons fully autonomous is already feasible.

Proponents of fully autonomous weapons systems argue that the technology will keep soldiers out of harm's way by keeping them off the battlefield. It would also enable rapid military decision-making, radically improving defensive capabilities.

Critics, such as the Campaign to Stop Killer Robots, have supported a ban on the research and development of autonomous weapons systems for more than a decade. They point to a future in which autonomous weapons systems are designed specifically to target people, not just vehicles, infrastructure and other weapons.


They argue that wartime decisions about life and death should remain in human hands. Handing them over to an algorithm is the ultimate form of digital dehumanization.

These organizations argue that the militaries investing in autonomous weapons systems, including the US, Russia, China, and the European Union, are launching the world into a costly and destabilizing new arms race. One dangerous consequence would be for the new technology to fall into the hands of terrorists, hackers and others outside government control.

Another issue is who will be held responsible for military operations, especially when they violate international treaties on war crimes and crimes against humanity. Although such accusations and trials are usually brought only by the victor against the vanquished, and are hardly impartial, until now responsibility could at least be pinned on a commander or weapon operator. With fully autonomous artificial intelligence, however, responsibility cannot be shifted onto a machine, an algorithm or a weapons system.

At present, human beings are responsible for protecting civilians and limiting damage in battle by ensuring that the use of force is proportionate to military objectives. When AI weapons are deployed on the battlefield, who should be held responsible when needless civilian deaths occur? There is no clear answer to this crucial question.

iGuRu.gr, The Best Technology Site in Greece


Written by Dimitris

Dimitris hates Mondays.....

One Comment

  1. The funny thing is that already the MACHINES ASK US TO PROVE TO THEM THAT WE ARE PEOPLE....
    Humanity has probably reached its end;
    we are done for as a species….
