Israeli Military Using AI to Select Targets in Gaza With 'Rubber Stamp' From Human Operator: Report

A man displays blood-stained British, Polish, and Australian passports after an Israeli airstrike, in Deir al-Balah, Gaza Strip, Monday, April 1, 2024. - Photo: Abdel Kareem Hana (AP)

Israel has been using an artificial intelligence system called Lavender to create a “kill list” of at least 37,000 people in Gaza, according to a new report from Israel’s +972 Magazine that was confirmed by the Guardian. Lavender is the second Israeli AI targeting system to come to light, after the existence of “The Gospel” was first reported last year. But while The Gospel targets buildings, Lavender targets people.

The new report cites six unnamed Israeli intelligence officers who spoke with +972 about how the country’s military “almost completely relied” on Lavender during the early weeks of the war, despite knowing that the system misidentified potential targets as terrorists. The humans in the loop (the term for ensuring that a person, rather than a machine, makes the final targeting decision) essentially acted as a “rubber stamp,” according to +972, with Israeli officers devoting about 20 seconds to each decision.

The Lavender AI system reportedly works by analyzing information collected on almost all of the 2.3 million Palestinians in the Gaza Strip “through a system of mass surveillance,” assessing the likelihood that any given person belongs to Hamas. Every Palestinian is assigned an opaque rating from 1 to 100 that supposedly reflects how likely they are to be a member of the militant group.

From +972:

Lavender learns to identify characteristics of known Hamas and [Palestinian Islamic Jihad] operatives, whose information was fed to the machine as training data, and then to locate these same characteristics — also called “features” — among the general population, the sources explained. An individual found to have several different incriminating features will reach a high rating, and thus automatically becomes a potential target for assassination.
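
In machine learning terms, what the sources describe is a conventional supervised classification pipeline: train a model on labeled examples, extract “features,” then score everyone else. None of Lavender’s actual features, model, or data are public, so the Python sketch below is purely hypothetical; every feature, label, cutoff, and number in it is invented, and it illustrates only the general technique the report describes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_features = 8  # stand-ins for surveillance-derived "features" (all invented)

# Toy training data: profiles of labeled "operatives" vs. everyone else.
# The shifted mean gives the classifier a signal to learn; it is pure fiction.
X_train = np.vstack([
    rng.normal(loc=1.0, size=(250, n_features)),  # labeled operatives
    rng.normal(loc=0.0, size=(250, n_features)),  # labeled non-operatives
])
y_train = np.array([1] * 250 + [0] * 250)

model = LogisticRegression().fit(X_train, y_train)

# Score an entire synthetic population, mapping each predicted probability
# to a 1-100 rating, as the report says Lavender does for Gaza's residents.
X_population = rng.normal(loc=0.0, size=(10_000, n_features))
prob = model.predict_proba(X_population)[:, 1]  # P(operative) per person
rating = np.clip(np.round(prob * 100), 1, 100).astype(int)

# A fixed cutoff turns ratings into an automatic target list. The cutoff
# value here is invented; per the report, the subsequent human review of
# each flagged name took about 20 seconds.
THRESHOLD = 90
flagged = np.flatnonzero(rating >= THRESHOLD)
print(f"{flagged.size} of {rating.size} people flagged at rating >= {THRESHOLD}")
```

Note that in this toy setup the entire scored population is drawn from the “non-operative” distribution, so every name the cutoff flags is, by construction, a false positive. That is an artifact of the fabricated data, but it mirrors the report’s point that a statistical rating, not verified intelligence, is what puts a name on the list.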

The Israeli military gave “sweeping approval” for officers to use Lavender’s targeting in Gaza, according to +972, with no requirement to thoroughly check “why the machine made those choices or to examine the raw intelligence data on which they were based.” The humans reviewing Lavender’s targeting decisions mostly just verified that the target was male, even though “internal checks” found that at least 10% of targets had no connection to Hamas at all. It’s not clear how those internal checks were conducted or whether the percentage was actually much higher.

Most of the targets were bombed in their homes, according to +972. Another automated system used in conjunction with Lavender, dubbed “Where’s Daddy?,” has been used to strike targets inside their family homes.

“We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” an anonymous Israeli intelligence officer told +972. “On the contrary, the IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

The new report also claims that the targets identified by Lavender were only junior militants, which meant the Israeli military preferred to use unguided munitions, or “dumb bombs,” so as not to waste expensive guided bombs on relatively inconsequential targets. The result, according to +972, has been entire families wiped out.

Israel also reportedly loosened its threshold for the number of civilians it deemed acceptable to kill as “collateral damage,” a claim that’s consistent with previous leaks published in Haaretz about new rules of engagement after October 7.

From +972:

In an unprecedented move, according to two of the sources, the army also decided during the first weeks of the war that, for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians; in the past, the military did not authorize any “collateral damage” during assassinations of low-ranking militants. The sources added that, in the event that the target was a senior Hamas official with the rank of battalion or brigade commander, the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander.

Israel launched the war in Gaza after the terrorist attacks of October 7, 2023, which killed roughly 1,200 Israelis and saw Hamas kidnap about 240 people. More than 32,600 Palestinians have been killed in Gaza since the start of the war, according to the United Nations, the majority of them identified as women and children. And it’s estimated that thousands more, buried under the rubble of the decimated territory, haven’t been counted. Israel reports that 255 of its soldiers have been killed since the start of the war.

Lavender had been used only as an “auxiliary tool” before October 7, according to +972, but after the terrorist attacks, entire kill lists containing tens of thousands of people were adopted wholesale.

“At 5 a.m., [the air force] would come and bomb all the houses that we had marked,” one of the anonymous Israeli sources told +972. “We took out thousands of people. We didn’t go through them one by one—we put everything into automated systems, and as soon as one of [the marked individuals] was at home, he immediately became a target. We bombed him and his house.”

Israel released a lengthy statement to the Guardian about the Lavender AI system on Wednesday, insisting that Hamas uses Palestinians in Gaza as “human shields” and that Israel respects international law.

“Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process,” the Israeli statement reads.

“According to IDF directives, analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” the statement continues.

And while the new report from +972 acknowledges that Lavender’s use has largely been scaled back since the start of the war, part of the reason is that the Israeli military is simply running out of civilian homes to target, since “most homes in the Gaza Strip were already destroyed or damaged.”
