
‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

// 972mag.com

According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.

Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad, including low-ranking ones, as potential bombing targets.

One source stated that human personnel often served only as a "rubber stamp" for the machine's decisions, adding that, normally, they would personally devote only about "20 seconds" to each target before authorizing a bombing – just to make sure the Lavender-marked target is male.

First, we explain the Lavender machine itself, which marked tens of thousands of Palestinians using AI. Second, we reveal the "Where's Daddy?" system, which tracked these targets and signaled to the army when they entered their family homes.

The result, the sources testify, was that the role of human personnel in incriminating Palestinians as military operatives was pushed aside, and AI did most of the work instead. According to four of the sources who spoke to +972 and Local Call, Lavender – which was developed to create human targets in the current war – has marked some 37,000 Palestinians as suspected "Hamas militants," most of them junior, for assassination.

"We didn't know who the junior operatives were, because Israel didn't track them routinely ," explained senior officer B. to +972 and Local Call, illuminating the reason behind the development of this particular target machine for the current war.

The sources said that the approval to automatically adopt Lavender's kill lists, which had previously been used only as an auxiliary tool, was granted about two weeks into the war, after intelligence personnel "manually" checked the accuracy of a random sample of several hundred targets selected by the AI system.

The assumption in the army was that if the target was a woman, the machine had likely made a mistake, because there are no women among the ranks of the military wings of Hamas and PIJ. "A human being had to [verify the target] for just a few seconds," B. said, explaining that this became the protocol after realizing the Lavender system was "getting it right" most of the time.

"Tens of thousands. This happened a few weeks later, when the [Israeli] brigades entered Gaza, and there were already fewer uninvolved people in the northern areas." According to this source, even some minors were marked by Lavender as targets for bombing.

Once Lavender has marked a target for assassination, army personnel have verified that they are male, and tracking software has located the target in their home, the next stage is picking the munition with which to bomb them.

One source said that when attacking junior operatives, including those marked by AI systems like Lavender, the number of civilians they were permitted to kill alongside each target was fixed at up to 20 during the initial weeks of the war.

The sources who spoke to +972 and Local Call explained that there was sometimes a substantial gap between the moment that tracking systems like Where's Daddy? alerted an officer that a target had entered their house, and the bombing itself – leading to the killing of whole families even without hitting the army's target.