Lavender and Gospel: Israeli armed forces opened AI arsenal to destroy thousands of Hamas operatives
Photo courtesy: IDF X page

| @indiablooms | 28 Apr 2024, 03:29 pm

The Israeli military used advanced artificial intelligence (AI) systems, named Lavender and Gospel, in its bombing operations in Gaza, reports suggest.

These AI systems, developed by Israel's elite intelligence division Unit 8200, have been instrumental in the IDF's targeting strategy, leading to discussions about the ethical and legal implications of their use.

Lavender, specifically, functions as an AI-powered database designed to identify operatives of Hamas and Palestinian Islamic Jihad (PIJ). Built on machine learning algorithms, Lavender processes large volumes of data to pinpoint individuals categorized as "junior" militants in these groups.
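Lavender's internals have not been made public. Purely as an illustration of how a machine-learning target-scoring pipeline of this general kind could be structured, here is a minimal sketch; every name, feature, weight, and threshold below is invented for the example and reflects nothing about the actual system:

```python
# Illustrative sketch of a generic ML scoring pipeline; all names,
# features, weights, and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Record:
    name: str
    features: dict  # hypothetical per-person signals, each in [0, 1]

# Hypothetical learned weights for each feature signal
WEIGHTS = {"signal_a": 0.6, "signal_b": 0.3, "signal_c": 0.1}
THRESHOLD = 0.5  # hypothetical cut-off for flagging a record

def score(record: Record) -> float:
    """Weighted sum of the record's feature signals, clipped to [0, 1]."""
    s = sum(WEIGHTS[k] * record.features.get(k, 0.0) for k in WEIGHTS)
    return min(max(s, 0.0), 1.0)

def flag(records):
    """Return only the records whose score crosses the threshold."""
    return [r for r in records if score(r) >= THRESHOLD]
```

The key point the sketch conveys is that such a system outputs a ranked or thresholded list, not a verified judgment, which is why the reported error margin and the brevity of human review matter.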

According to reports from Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call, Lavender initially identified around 37,000 Palestinian men linked to Hamas or PIJ.

This use of AI for target identification marks a significant departure from the traditional methods employed by Israeli intelligence agencies Mossad and Shin Bet, which relied more heavily on human decision-making.

In practice, soldiers often took as little as 20 seconds to decide, based on Lavender's output, whether to strike an identified individual, in many cases checking only that the target was male.

Despite Lavender's error margin of up to 10 per cent, soldiers frequently acted on its recommendations without question.

However, according to the reports, the program often targeted individuals with minimal or no affiliation with Hamas.

Gospel, another AI system, automatically generates targets through AI-driven recommendations. In contrast to Lavender, which identifies human targets, Gospel is said to mark structures and buildings.

"This is a system that allows the use of automatic tools to produce targets at a fast pace and works by improving accurate and high-quality intelligence material according to the requirement. With the help of artificial intelligence, and through the rapid and automatic extraction of updated intelligence - it produces a recommendation for the researcher, with the goal being that there will be a complete match between the machine's recommendation and the identification performed by a person," the IDF said in a statement.

The specific data sources fed into Gospel have not been publicly revealed. However, experts say that AI-driven targeting systems of this kind typically analyze diverse data sources, including drone imagery, intercepted communications, surveillance data, and behavioural patterns of individuals and groups.
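Since Gospel's inputs are not public, the fusion step experts describe can only be sketched generically. The following hypothetical example shows the general shape of combining per-source confidence scores into a single recommendation that is then passed to a human analyst; the source names and the averaging rule are assumptions for illustration only:

```python
# Hypothetical sketch of multi-source data fusion into one recommendation.
# Source names and the simple averaging rule are invented; real systems
# are not publicly documented.
def fuse(sources: dict) -> dict:
    """Combine per-source confidence scores (0-1) into a recommendation.

    Missing sources (None) are ignored; the result is always flagged
    for human review, matching the stated goal of a person confirming
    the machine's recommendation.
    """
    scores = [v for v in sources.values() if v is not None]
    confidence = sum(scores) / len(scores) if scores else 0.0
    return {"confidence": confidence, "needs_human_review": True}
```

For instance, `fuse({"imagery": 0.8, "signals": 0.6})` averages the two available scores; the design choice of always requiring human review mirrors the IDF's stated aim of a "complete match" between machine recommendation and human identification.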
