Israel reportedly used ‘Lavender’ AI system to ID thousands of dubious targets in Gaza war
Israel used system to drastically expand targeting of low-level militants, IDF officers claim
In the weeks immediately after Hamas’s 7 October surprise attack on Israel, the Israel Defense Forces allegedly targeted civilian homes intentionally and used an AI-based programme called Lavender to generate assassination targets, carrying out scores of bombings on the basis of decisions made with scant human review.
At one point, the system drew on mass surveillance in Gaza to generate a list of 37,000 bombing targets, including numerous low-level alleged Hamas operatives who would not typically be singled out for bombing operations, according to the report.
The allegations, uncovered by +972 Magazine and Local Call, are based on interviews with six Israeli intelligence officers who served during the conflict with Hamas in Gaza and were involved in the use of AI to investigate targets.
One officer said his role in the system was as a mere “rubber stamp” on Lavender’s targeting decisions, spending only a few seconds personally reviewing the system’s recommendations.
The officers also described decisions to strike scores of Hamas targets in their homes, alongside civilians, because being at home made it easier to confirm their whereabouts with intelligence tools. Planners considering strikes allegedly were willing to accept that up to 15 or 20 civilians could be killed in the pursuit of a single low-level Hamas operative.
“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one of the anonymous officers told the publications. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”
“An independent examination by an [intelligence] analyst is required, which verifies that the identified targets are legitimate targets for attack, in accordance with the conditions set forth in IDF directives and international law,” the IDF told the outlets in response to the investigation.
Observers criticised the tactics as inhumane.
Tariq Kenney-Shawa, a fellow at Al-Shabaka: The Palestinian Policy Network, called the reports “sickening.”
Alex Hanna of the Distributed AI Research Institute, meanwhile, wrote on X, “This is sick and the future of AI warfare for US Empire.”
One intelligence officer who used Lavender, which was developed by Israel’s elite Unit 8200, told The Guardian that after 7 October soldiers often trusted the system more than the judgement of their own grieving colleagues.
“This is unparalleled, in my memory,” the officer said, adding: “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
In a statement to The Guardian, the IDF denied using AI to generate confirmed military targets, and said Lavender was used to “cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organisations.”
“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”
An estimated 33,000 Palestinians have been killed in Israel’s campaign in Gaza, according to the territory’s health ministry, the majority of them civilians.
Israel has faced continued scrutiny for the high civilian death toll of its operations, which have targeted residential areas, hospitals, and refugee camps. The IDF says Hamas frequently embeds its military activities in civilian areas, using residents as human shields.
The IDF’s targeting tactics have come under a new round of international criticism after Israel killed seven World Central Kitchen aid workers in an airstrike in Gaza on Monday.