Artificial intelligence is leading to more deaths in Gaza
In April, Google fired 50 workers in a move that revealed a new and ominous dimension of U.S. complicity in Israel’s war in the Gaza Strip. The workers were protesting Project Nimbus, a $1.2 billion contract that Google and Amazon signed to provide the Israeli government — including its military — with cloud computing services.
U.S. support for Israel’s war in Gaza is now so multidimensional that one might well speak of a U.S.-Israel war on Hamas and Palestinian civilians. The digital aid comes in addition to an enormous and unconditional supply of U.S. military funding, the direct participation of U.S. intelligence and defensive weapons systems, and the provision of diplomatic support and cover for Israel’s behavior. Google’s contribution to the war, however indirect, merits examination.
As Israel conducts one of the most devastating and deadly military campaigns of the 21st century, its operations are a laboratory for strategic thinking. The war in Gaza is mostly an urban guerrilla war; Israel is seeking the destruction of an insurgency embedded in a civilian population.
The Pentagon is surely studying what is happening — how effective the weapons systems the U.S. supplies to Israel are, along with those Israel developed on its own. The horrific brutality and destructiveness of Israel’s campaign offer a model for one approach to repressing an urban insurgency and national liberation movement. Israel’s innovative use of artificial intelligence is surely drawing the interest of Pentagon strategists.
The Google protesters said the Nimbus contract “allows for further surveillance of and unlawful data collection on Palestinians.”
This surveillance provides the basis of what the dissident Israeli journalist Yuval Abraham (of +972 Magazine), with sources in the IDF, has exposed about the use of several artificial intelligence programs. These programs have enormously heightened the destruction, both human and physical, of Israel’s military campaign in Gaza. Abraham describes these systems as a “danger to humanity.”
Abraham first came to prominence last fall with revelations about an Israeli military program known as “the Gospel,” designed to destroy Gaza’s civilian infrastructure — universities, medical care facilities, libraries, banks, apartment complexes and so on — as a way to apply civilian pressure on Hamas. More recently Abraham has reported on two other AI systems that, together, offer further explanation for the immense suffering the war has produced.
In an interview with the Wall Street Journal in late 2023, Israeli Prime Minister Benjamin Netanyahu said a primary aim of the military campaign was the “complete destruction” of Hamas. Enter artificial intelligence targeting.
Abraham reports that an Israeli AI program called “Lavender,” using data points collected through Israel’s long-term and pervasive surveillance of Gaza’s 2.3 million residents — perhaps aided by Google, based on what the fired protesters have said — has generated a “kill list” of as many as 37,000 Palestinians targeted for assassination. This list is based on data pertaining to social media activity, contacts, employment, phone calls, places visited and so on. Generating the kill list required little human oversight; Lavender’s algorithm allows for an estimated 10% error rate (i.e., up to 10% of those on the kill list might have nothing to do with Hamas).
A second AI program, ironically named “Where’s Daddy?,” provides information about a target’s location and the best time and place for an assassination — usually at home at night, when targets are sleeping with their families. According to Abraham, Israel’s “extremely permissive” bombing policy has resulted in “entire Palestinian families being wiped out inside their houses.”
The Lavender program also assigns a certain level of importance within the Hamas command structure to the targets it generates. The Israeli military has developed a range of permissible “collateral” civilian deaths for various levels within Hamas — e.g., tens of civilian deaths for a basic soldier up to hundreds for a brigade commander.
In a further irony, the military has also decided that the many lower-level combatants do not merit the use of costly “smart” bombs in their killing. The military reserves these precise munitions for those higher in the Hamas command structure. The upshot: a great many cheaper, “dumb” bombs, with much greater collateral damage, are dropped on the dwellings of lower-level Hamas cadres — the very targets for whom permissible collateral civilian deaths are supposed to be most limited. Cost-effective economics trumps human life and the preservation of civilian infrastructure.
The U.S. and Israel seem to have adopted by a different route the same strategy as Russian President Vladimir Putin in places like Grozny and Mariupol — wars of maximum death and devastation. Artificial intelligence, finally, serves the same purpose. As a result — and with the complicity of the U.S. government and major digital corporations — Gaza may well soon be made largely uninhabitable.
Michael G. Baylor is a retired professor of history at Lehigh University.