Israel's Remote-Controlled Warfare in Gaza Raises Ethical Concerns

With unmanned bulldozers, Israel remotely controls the destruction of the Gaza Strip. The country's increasingly automated warfare raises ethical questions. There is a concern that it will become easier to wage war when soldiers no longer come back in body bags, says philosophy professor Helen Frowe.

Published: May 31, 2025

Photo: Tsafrir Abayov/AP/TT


The bulldozer maneuvers slowly, churning up the red earth as it advances. It looks like any other earthmover, but it isn't: the driver's cab is empty. The giant vehicle on the Israeli test site is controlled from a military base in Alabama in the United States.

Israel's "robdozer" is an automated version of Caterpillar D9, a four-meter-high and 75-ton heavy bulldozer that the Israeli army has long used to clear the way for ground forces and level the ground. Now, the unmanned model is being used to an increasing extent, writes news agency AP in a report.

Israel's warfare is increasingly remote-controlled, with everything from drones, unmanned combat vehicles, and bulldozers to AI-driven surveillance.

"The weapons minimize the risks for the soldiers," says Helen Frowe, professor and head of the Stockholm Centre for the Ethics of War and Peace, which researches the ethics of war.

"And in some cases, in the right hands, it's a great thing," she says, citing Ukraine as an example.

"No moral difference"

But remote-controlled war raises ethical and legal questions. Many have warned of the risk that the battlefield will turn into a video game and that morality and ethics will go up in smoke. A robot has no conscience, and from a distance, no screams are heard.

Helen Frowe wants to tone down that image.

"It basically just means that you kill at a longer distance. There is no moral difference," she says.

Research does not support the notion that soldiers operating weapons remotely fail to understand that they are firing a weapon or demolishing a house, says Frowe. Some studies have even shown that drone pilots are at higher risk of developing PTSD.

"The one who controls a drone often follows their target for a long time, for days or weeks, sees them living their lives, sees their children – and probably has more of a sense of who they are killing than someone who just drops a bomb," she says.

"Through the screen, they also see the aftermath: they see the body, they see what they have done."

The technology is not the problem

Observers and critics also point out that the new technology creates a deep asymmetry between the well-equipped armies of rich countries and the combat capabilities of poorer countries or groups.

Helen Frowe believes that it makes the injustice neither greater nor smaller.

"The problem here is not the technology – the problem is that Israel is waging an unjust war. That Israel is using any weapons at all."

Another debated issue is whether remote-controlled weapons and war tools are more or less prone to making mistakes. Some point to the ethically questionable aspect of letting a machine be responsible for killing people and demolishing houses. According to Helen Frowe, the consensus seems to be that machines are often more reliable than humans.

"They are not subject to emotional pressure, fear, panic, confusion, and the other things that soldiers experience," she says, and continues:

"A soldier can make mistakes because they are afraid, or tired after not having slept for three days. In that way, automated weapons can be much safer."

Can last longer

That appears to be how Israel reasoned about the use of the military's AI-driven surveillance system "Lavender", which during the war is said to have identified tens of thousands of potential targets in Gaza. In an investigation by +972 Magazine and Local Call in April 2024, an intelligence officer stated that the military had more trust in a "statistical mechanism" than in a grieving soldier after Hamas' October 7 attack.

"Everyone, including me, lost loved ones on October 7. The machine did it coolly, without feeling. It made it easier," the officer said.

People tend to see remote-controlled warfare as "disrespectful" and argue that it makes accountability harder, says Helen Frowe. Morally, that view is mistaken, she believes.

"The one who is responsible is the one who equipped this army. And the one who wrote the algorithm, those who paid for it, and those who gave it to the army."

Hundreds of drone strikes

There is, however, a larger concern with remote-controlled war. When the risks to individual soldiers decrease – and fewer come home in body bags – the pressure on warring states also decreases. Wars can thus last longer.

"When you reduce the loss of life, you also reduce the democratic pressure," says Helen Frowe.

This became clear, among other instances, when the United States under Barack Obama shifted towards drone strikes. During his time in the White House, Obama ordered over 500 drone strikes against targets in Yemen, Pakistan, and Somalia, among other countries – strikes that largely went unnoticed by Americans.

"People often get very worried when it comes to sending actual ground troops. When you use drones and make it remote-controlled, people simply don't care as much," she says.

It makes it much "easier" to wage war – and more likely that the war will drag on.

Facts: Remote-controlled warfare

The concept of remote-controlled warfare encompasses different types of attacks carried out at a distance, without the need for soldiers on the ground.

Drone strikes are perhaps the most common. Attack drones are controlled by soldiers who follow the operation on a screen, often from a great distance, and fire at their targets at the push of a button.

Other types of military vehicles can also be controlled remotely, such as unmanned tanks and bulldozers. Cyber attacks are also included in the concept: attacks on computer systems, networks, and digital infrastructure that can be launched remotely, from great distances.

There are also automated weapons systems, which are becoming increasingly common. These systems can detect and select targets with limited or no human oversight.

AI systems for surveillance also fall under the concept of remote-controlled warfare. During the ongoing Gaza war, an Israeli AI-driven system called "Lavender" is said to have identified tens of thousands of potential targets based on perceived links to Hamas and other Palestinian extremist organizations.


By TT. Translated and adapted by Sweden Herald.