AI is now being used across the world to create autonomous weapons and surveillance systems that are entrenching illegal occupations and redefining warfare. A robot should never be allowed to decide whether a human lives or dies.
Israel’s Use of AI Targeting in Gaza and the West Bank
As of July 2025, Israel’s genocidal campaign against Palestinians in Gaza has killed approximately 60,000 people, according to UN OCHA.
The Israel Defense Forces (IDF) has used AI-assisted targeting systems, including ‘Habsora’, ‘Where’s Daddy?’ and ‘Lavender’, to rapidly and automatically perform much of the process of determining what to bomb during its genocidal campaign against Palestinians in Gaza. Meanwhile, AI tracking systems such as ‘Red Wolf’ have been used to automate apartheid across the occupied Palestinian territory (oPt), including in the West Bank.
Lavender
Anonymous Israeli intelligence officers claim that they would rubber-stamp targets that Lavender designated for attack, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One claimed that the average review time was twenty seconds, just long enough to confirm that a Lavender-marked target was male, before authorising a bombing.
In the early stages of the genocide, Lavender marked 37,000 Palestinians as military suspects and flagged their homes for possible airstrikes. Many of these targets were approved quickly, despite an acknowledged 10% ‘error’ rate in the system. As this figure comes from Israeli intelligence officers’ own estimates, the true error rate is plausibly much higher, given Israel’s consistent inflation of the number of ‘militants’ in Gaza.
Habsora (The Gospel)
Whereas Lavender focuses on marking individuals for inclusion on a kill list, Habsora (meaning ‘The Gospel’) is designed to mark buildings and structures that the army claims militants operate from. It was developed by Unit 8200 of the Israeli Intelligence Corps.
Habsora also produces an advance estimate of the number of civilians who may be killed in attacks on the private residences it recommends bombing. These details appear clearly in the target file under the category of ‘collateral damage’, for intelligence officers to review and approve.
Where’s Daddy?
Another AI targeting system used by Israel in the genocide is ‘Where’s Daddy?’ This software tracked targeted individuals so that bombings could be carried out once they had entered their family homes. The system’s name is thought to allude to the way this practice significantly increased the number of family members killed alongside the targeted individual.
Israeli intelligence officers confirmed that this practice ensured that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
- A., an IDF intelligence officer.
Red Wolf
Red Wolf tracks Palestinians using an experimental facial recognition system, which then automatically designates restrictions on their freedom of movement, a crucial element of apartheid. It is deployed at military checkpoints in Hebron and East Jerusalem, where it scans Palestinians’ faces without their consent and adds them to surveillance databases.
Robotic Guns
Since 2022, Israel has reportedly installed artificial intelligence-powered guns above al-Aroub refugee camp, located between Bethlehem and Hebron in the oPt. These twin turrets, mounted on top of guard towers, are capable of firing tear gas, stun grenades and sponge-tipped bullets at refugees in the camp. The automated weapons have also been installed at military checkpoints in Hebron.
IDF Large Language Model (LLM)
The Israeli military is developing a large language model (LLM), similar to popular AI models such as ChatGPT, built using intercepted Palestinian communications. Israel’s Unit 8200 trained the model to understand spoken Arabic using large volumes of telephone conversations and text messages obtained through its extensive surveillance of the occupied Palestinian territory. The system is designed to answer questions about individuals under surveillance.
The project was initially led by Chaked Roger Joseph Sayedoff, a former military intelligence technologist, who acknowledged his role at an AI conference in Tel Aviv in 2024. It is not yet known whether the tool is ready for deployment, as it was still being trained in late 2024.
It is unclear whether this LLM is the same system as ‘Genie’, which has been reported in the Israeli press since April 2025. Genie is an LLM designed for the Israeli military to use in a format similar to ChatGPT.
