In late 2023, Israel was aiming to assassinate Ibrahim Biari, the top Hamas commander in the northern Gaza Strip, who had helped plan the October 7th massacre. But Israeli intelligence could not find Mr. Biari, whom they believed was hiding in the network of tunnels beneath Gaza.
So Israeli officers turned to a new military technology infused with artificial intelligence, Israeli and American officials said. The technology had been developed a decade earlier but had never been used in battle. The hunt for Mr. Biari provided new incentive to improve the tool, so engineers in Israel's Unit 8200, the country's equivalent of the National Security Agency, quickly integrated AI into it.
Shortly afterward, Israel listened in on Mr. Biari's calls and tested the AI audio tool. Using that information, Israel ordered airstrikes on the area on October 31, 2023, killing Mr. Biari. More than 125 civilians were also killed in the attack, according to Airwars, a London-based conflict monitor.
The audio tool was just one example of how Israel has used the war in Gaza to rapidly test and deploy AI-backed military technologies.
Over the past 18 months, Israel has combined AI with facial recognition software to match partially obscured or injured faces to real identities, relied on AI to compile potential airstrike targets, and created an Arabic-language AI model to power a chatbot that can scan and analyze text messages, social media posts and other Arabic-language data.
Many of these efforts grew out of partnerships between enlisted soldiers in Unit 8200 and reserve soldiers who work at tech companies such as Google, Microsoft and Meta. Unit 8200 set up what became known as "The Studio," an innovation hub and a place to match experts with AI projects, the officials said.
But even as Israel has raced to develop its AI arsenal, the deployment of the technologies has sometimes led to mistaken identifications and arrests, as well as civilian deaths, the Israeli and American officials said. Some officials have struggled with the ethical implications of the AI tools, which could result in increased surveillance and additional civilian killings.
European and American defense officials said no other country had been as active as Israel in experimenting with AI tools in real-time battles, offering a preview of how such technologies may be used in future wars.
The urgent need to respond to the crisis accelerated AI-powered innovation, said Hadas Lorber, head of the Institute for Responsible AI at Israel's Holon Institute of Technology and a former senior director at the Israeli National Security Council. "It led to game-changing technologies on the battlefield and advantages that proved critical in combat," she said.
But the technologies "also raise serious ethical questions," Ms. Lorber said. She warned that AI needs checks and balances, adding that humans should make the final decisions.
An Israeli military spokeswoman said she could not comment on specific technologies because of their "confidential nature." Israel is "committed to the lawful and responsible use of data technology tools," she said, adding that the military was investigating the strike on Mr. Biari and was "unable to provide any further information until the investigation is complete."
Meta and Microsoft declined to comment. Google said it has "employees who do reserve duty in various countries around the world. The work those employees do as reservists is not connected to Google."
Israel has previously used its conflicts in Gaza and Lebanon to experiment with and advance military technologies, including drones, phone hacking tools and the Iron Dome missile defense system.
After Hamas launched its cross-border attack on Israel on October 7, 2023, killing more than 1,200 people and taking 250 hostages, AI technologies were quickly cleared for deployment, four Israeli officials said. That led to cooperation between Unit 8200 and reserve soldiers in "The Studio" to swiftly develop new AI capabilities, they said.
Avi Hasson, chief executive of Startup Nation Central, an Israeli nonprofit that connects investors with companies, said reservists from Meta, Google and Microsoft had become crucial in driving innovation in drones and data integration.
"Reservists brought know-how and access to key technologies that weren't available in the military," he said.
The Israeli military quickly used AI to enhance its drone fleet. Aviv Shapira, founder and chief executive of XTEND, a software and drone company that works with the Israeli military, said AI-powered algorithms were used to build drones that lock onto and track targets from afar.
"In the past, homing capabilities relied on zeroing in on an image of the target," he said. "Now AI can recognize and track the object itself, whether it is a moving car or a person, with lethal precision."
Mr. Shapira said his main clients, the Israeli military and the U.S. Department of Defense, were aware of the ethical implications of AI in warfare and had discussed responsible use of the technology.
One tool developed by "The Studio" is an Arabic-language AI model known as a large language model, three Israeli officials familiar with the program said. (The large language model was earlier reported by +972 Magazine, an Israeli-Palestinian news site.)
Developers had previously struggled to create such a model because of a scarcity of Arabic-language data to train the technology on. What data was available was mostly in standard written Arabic, which is more formal than the dozens of dialects used in spoken Arabic.
The Israeli military did not have that problem, three officials said. The country had decades of intercepted text messages, transcribed phone calls and social media posts scraped in spoken Arabic dialects. So Israeli officials created the large language model in the first few months of the war and built a chatbot to run queries in Arabic. They integrated the tool with multimedia databases, allowing analysts to perform complex searches across images and videos, four Israeli officials said.
When Israel assassinated the Hezbollah leader Hassan Nasrallah in September, the chatbot analyzed responses from across the Arabic-speaking world, three Israeli officials said. The technology differentiated among different Lebanese dialects to gauge public reaction, helping Israel assess whether there was public pressure for a counterstrike.
At times, the chatbot could not identify some modern slang terms and words that had been transliterated from English into Arabic, two officials said. That required Israeli intelligence officers with expertise in different dialects to review and correct its work, one officer said.
The chatbot also sometimes gave wrong answers, for instance, returning a photo of a pipe instead of a gun, two Israeli intelligence officers said. Even so, they said, the AI tools significantly accelerated research and analysis.
After the October 7th attack, Israel began equipping cameras at temporary checkpoints set up between the northern and southern Gaza Strip with the ability to scan and send high-resolution images of Palestinians to an AI-backed facial recognition program.
That system, too, sometimes had trouble identifying people whose faces were obscured. It led to the arrest and interrogation of Palestinians who were mistakenly flagged by the facial recognition system, two Israeli intelligence officers said.
Israel also used AI to sift through the data that intelligence officials had accumulated on Hamas members. Before the war, Israel built a machine learning algorithm, code-named Lavender, that was trained on a database of confirmed Hamas members and was meant to predict who else might be part of the group. Though the system's predictions were imperfect, Israel used it to help select attack targets at the start of the war in Gaza.
Few goals loomed larger than finding and eliminating Hamas's senior leadership. Near the top of the list was Mr. Biari, the Hamas commander whom Israeli officials believed had played a central role in planning the October 7th attack.
Israeli military intelligence had intercepted Mr. Biari's calls with other Hamas members but could not pinpoint his location. So they turned to the AI-backed audio tool, which analyzed different sounds, such as sonic bombs and airstrikes.
After the tool deduced an approximate location for where Mr. Biari was making his calls, Israeli military officials were warned that the area, which included several apartment complexes, was densely populated, two intelligence officers said. An airstrike would need to target several buildings to ensure Mr. Biari was assassinated, they said. The operation was greenlighted.
Since then, Israeli intelligence has used the audio tool alongside maps and photos of Gaza's underground tunnel maze to locate hostages. Over time, the tool has been refined to find individuals more precisely, two Israeli officials said.