The Washington Post has shed light on the Israeli army’s use of artificial intelligence (AI) tools to flood Gaza with bombs during its ongoing war on the besieged enclave.
In a report by Elizabeth Dwoskin, the newspaper revealed that “Israel established an AI factory and unleashed it during the current war on Gaza.” Over the past decade, Israel transformed an elite intelligence unit into a testing ground for AI, sparking debate among top military leaders over whether humans could retain control of the technology.
The Israeli army exploited a meticulously built database over the years, detailing the addresses of homes, tunnels, and other vital infrastructure associated with Hamas.
“The target bank was quickly depleted,” the report stated, “and to sustain the pace of the war, the Israeli army relied on a sophisticated AI tool called ‘Habsora,’ capable of generating hundreds of additional targets almost instantly,” according to two individuals familiar with the operation.
AI-Powered Target Generation
The use of artificial intelligence enabled the Israeli army to replenish its target bank continuously, ensuring an uninterrupted campaign. This reliance on AI is a prime example of how a decade-long program has placed advanced AI tools at the center of Israeli military intelligence operations.
Experts cited by The Washington Post acknowledged that the Israeli military openly admits to using such programs, which are among the most advanced AI initiatives globally. However, the report uncovers previously unmentioned details regarding the internal workings of the learning algorithms and the secretive history of their development.
The AI program was reportedly behind the rapid escalation of violence in Gaza, which led to the deaths of over 45,000 people, half of whom were women and children, according to figures from Gaza’s Ministry of Health.
Increased Civilian Casualties
Individuals familiar with the Israeli army’s practices, including soldiers who served in Gaza, stated that “the army increased the acceptable ratio of civilian casualties beyond the normal threshold.” Automation and AI significantly accelerated target generation and broadened the scope of operations.
The report is based on interviews with numerous individuals familiar with the AI systems, most of whom requested anonymity. Steven Feldstein, a senior fellow at the Carnegie Endowment for International Peace, commented: “What’s happening in Gaza represents a continued evolution in the way wars are being waged.”
The Israeli army dismissed accusations that AI usage endangered civilian lives, stating: “The more effectively information is gathered, the more precise the operations. If anything, these tools have reduced collateral damage while enhancing the precision of human-led operations.”
A military intelligence officer disclosed that officers must sign off on recommendations generated by the “big data processing systems,” including tools like Habsora, which do not make decisions autonomously.
AI in Intelligence Operations
Transformations within the Israeli intelligence unit, known as Unit 8200, accelerated in 2020 under the leadership of Yossi Sariel. Sariel revamped the unit’s intelligence-gathering methods, advocating for the development of AI tools like Habsora. These systems rely on predictive algorithms that let soldiers quickly extract information from a vast data repository referred to internally as “the pool.”
By analyzing intercepted communications, satellite imagery, and social media data, the algorithms identify coordinates for tunnels, missile launch sites, and other military targets.
Recommendations that pass scrutiny by intelligence analysts are added to the target bank by senior officers. Another AI tool, called Lavender, uses predictive modeling to estimate the likelihood of a Palestinian being a member of an armed group, enabling the rapid generation of human targets.
These algorithms, bearing names like Alchemist, Depth of Wisdom, Hunter, and Flow, have raised internal concerns among Unit 8200 officers. Some officers questioned the reliability of machine-learning decisions and the potential flaws inherent in such systems.
Automation’s Impact on Decision-Making
Internal reviews found that some AI systems processing Arabic-language data were error-prone, struggling to interpret key colloquial phrases. According to the Israeli army, machine learning technologies are used to predict the outcomes of attacks, estimate civilian casualties, and ensure adherence to international law.
However, concerns persist over the quality of AI-derived intelligence, a problem some attribute to a shift in the military’s operational culture. Historically, Israeli intelligence prized individual judgment, but according to three sources, that ethos has been overshadowed by a culture prioritizing technological ingenuity.
Under Sariel’s leadership, Unit 8200 was restructured to prioritize engineers, sidelining Arabic-language specialists and dismissing leaders seen as resistant to AI adoption. Some intelligence groups focused on traditional data analysis were disbanded entirely.
The AI Factory’s Role in Bombing Gaza
By Israel’s own admission, AI played a central role in targeting operations in Gaza. In the days following the October 7 attacks, U.S.-made Mark 84 bombs, each weighing 2,000 pounds, rained down on Gaza. In a press release on November 2, 2023, the Israeli army announced that “Habsora” had facilitated the bombing of 12,000 targets in Gaza.
Accompanied by dramatic music and footage of exploding buildings, the press release highlighted “unprecedented collaboration,” where ground, air, and naval forces were provided real-time intelligence from the AI-powered target factory, enabling “hundreds of attacks within moments.”
Israeli historian Adam Raz, who has interviewed soldiers and officers about Unit 8200’s use of AI, estimated that the army was striking two targets per minute at the height of the bombardment—”an astonishing rate,” he said.
In the early days of the war, the target factory operated at full capacity, with around 300 soldiers working around the clock. Analysts were tasked with verifying targets recommended by Habsora and Lavender, a process that could take anywhere from three minutes to five hours.
Reduced Verification Standards
At the start of the war, the requirement for two human intelligence confirmations to validate Lavender’s predictions was reduced to just one, according to two sources familiar with the process.
One soldier revealed that in some cases, poorly trained troops in the Gaza Division attacked human targets without verifying Lavender’s predictions. At times, the only required confirmation was that the target was male, according to another source.
The soldier stated, “It started with Lavender, then we did the intelligence.” To speed up the tracking of individuals flagged by Lavender as potential Hamas members, the army obtained real-time images of people in their homes through undisclosed methods. Custom-designed facial recognition tools then compared these images against a database of known Hamas members.
While matches appeared accurate, some soldiers expressed concern about the army relying solely on technology without confirming whether the individuals were still active Hamas members.