- April 8, 2024
From ‘Lavender’ to ‘Where’s Daddy?’: How Israel is using AI tools to hit Hamas militants – Times of India
NEW DELHI: Israeli military operations employing artificial intelligence (AI) to target Hamas militants in Gaza have come under intensified scrutiny as civilian casualties rise. A report by Israeli outlets +972 Magazine and Local Call highlighted the use of two AI systems, “Lavender” and “Where’s Daddy?”, in the conflict. “Lavender” is tasked with identifying suspected militants and their residences, while “Where’s Daddy?” tracks these individuals to their homes, facilitating strikes when they return.
The Lavender system is engineered to identify individuals suspected of being part of the military branches of Hamas and Palestinian Islamic Jihad (PIJ), targeting even those with lower ranks for potential aerial bombardments. According to reports by +972 and Local Call, in the initial stages of the conflict the military predominantly depended on Lavender, which led the system to flag as many as 37,000 Palestinians as suspected militants and mark them, along with their residences, for potential airstrikes.
The Lavender platform operates alongside another AI system, “The Gospel”. A key distinction lies in the nature of their targeting: while The Gospel identifies buildings and facilities purportedly used by militants, Lavender focuses on individuals, designating them for potential elimination. The “Where’s Daddy?” system then monitors these targets and alerts the military when they return to their family residences.
One intelligence officer revealed, “The IDF bombed [Hamas operatives] in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.” This approach has often resulted in civilian casualties, acknowledged as “collateral damage”, the report said.
However, the Lavender system is prone to misidentification: with an acknowledged error rate of around 10%, it sometimes marks individuals with no links to militant groups, or people who merely share a name or device with a militant, the report claimed.
Brianna Rosen, a senior fellow at Just Security, suggests the margin of error might be even higher. Rosen criticized Israel’s targeting criteria and the AI’s error rates, arguing that they exacerbate risks to civilians. Officers admitted that human oversight of the target identification process was minimal, often reduced to a mere “rubber stamp” of the AI’s selections.
The report also touches on Israel’s use of cheaper, unguided “dumb” bombs against junior operatives, a practice President Joe Biden warned could cost Israel international support, citing “indiscriminate bombing.” The IDF insists that its operations aim to reduce civilian harm “to the extent feasible” and denies using AI systems to predict whether someone is a terrorist.
Meanwhile, the Israeli military has announced its withdrawal from Khan Younis, a city in southern Gaza, signaling the completion of a crucial stage in its ground operation against Hamas militants. This move has reduced the Israeli troop presence in the area to one of the lowest levels since the six-month conflict began.
However, defense officials said on Sunday that the troops were simply regrouping in preparation for an advance into Rafah, Hamas’ final stronghold. Israel has been threatening a ground offensive in Rafah for weeks, but the city is home to approximately 1.4 million people, which is more than half of Gaza’s total population. The possibility of an offensive has raised international concern, including from the United States, Israel’s closest ally, which has insisted on seeing a viable plan to ensure civilian safety.
Despite the withdrawal, Israeli military officials, who spoke on condition of anonymity due to army policy, emphasized that a “significant force” remained in Gaza to continue targeted operations, including in Khan Younis, the hometown of Hamas leader Yehya Sinwar. The withdrawal marks a significant juncture in the conflict between Israel and Hamas, which has now passed the six-month mark.
(With inputs from agencies)