[Image: Big serious faces on either side of a platform with unsuspecting people on it]

Part 4. Systematized Supremacy

Witnessing how tech is used to conquer and destroy

This essay is part of “Digitized Divides”, a multi-part series about technology and crisis. This part was written by Safa, and co-developed through discussions, research, framing, and editing by Liz Carrigan, Safa, Louise Hisayasu, Dominika Knoblochová, Christy Lange, Mo R., Helderyse Rendall, and Marek Tuszynski. Image by Liz Carrigan and Safa, with visual elements from Yiorgos Bagakis and La Loma.
The aggregation of personal data from wide groups of people can be used to benefit individuals and society, such as early cancer detection using machine learning192, wastewater monitoring to detect broad levels of COVID transmission in an area before hospitals get overloaded193, and satellite radar data used to monitor illegal logging and support forest conservation efforts194. But there have also been significant cases of personal data used under the guise of ‘safety and security’ to inform actions and policies that result in widespread oppression and harm. Data, and the tools it feeds, is not in and of itself good or bad, helpful or harmful – it depends on who is collecting it, what is being collected, how it is being collected, and how the information and results are used. Technology can be used to help people or to harm people, but it isn’t necessarily an either/or situation – it can benefit one person or group while simultaneously harming another.
While some may ask whether the benefits of using personal data to implement widespread policies and actions outweigh the harms, this piece posits that weighing benefits against harms in this balanced, binary, two-sided way is a misguided approach to critical assessment, especially when the harms include violence against civilians. After all, human suffering is never justified, and there is no way to sugarcoat negative repercussions in good faith. Technological bothsidesism attempts to tally the ‘goodness’ or ‘brownie points’ of technology, which is a distraction: technology itself isn’t good or bad; what matters is the humans behind it – the owners and operators behind the machines. Depending on the intentions and aims of those people, technology can be used for a wide variety of purposes.
In the 1985 novel “Ender’s Game”, the young protagonist Ender is recruited to Battle School to train as a child soldier in the fight against what appears to be an insect-like race, through what are presented to him as virtual simulation battles. The twist is that there were no simulations: the “enemy” within the game was a real group who were ultimately not the aggressors, and whose systematic dehumanization enabled Ender and his people to carry out their technologically-assisted annihilation. In this story, the people with the technology and oversight were well aware of what they were doing and, regardless of the facts, used their own biases to develop propaganda that radicalized generations of people who held the tools to commit atrocious acts. The technology in “Ender’s Game” served as a distraction, diverting readers away from the real issues until the end, when it was too late. Humanity has been living in this reality in one way or another for some time – and for the past few years it has been supercharged by AI.

Lucrative and lethal

Israel uses data collected from Palestinians to train AI-powered automated tools,195 including those co-produced by international firms, like the collaboration between Israel’s Elbit Systems and India’s Adani Defence and Aerospace196, that have been deployed in Gaza197 and across the West Bank198. Israeli AI-supercharged surveillance199 tools and spyware200 – including Pegasus201, Paragon202, QuaDream203, Candiru204, and Cellebrite205, as well as AI weaponry including the Smart Shooter206 and Lavender207 – have received both condemnation from human rights groups208 and interest from governments worldwide209. Graduates of the Israeli military’s elite intelligence unit, Unit 8200, are so coveted by surveillance and military tech companies that there is a term for it: the “8200-to-tech pipeline”210. Israeli surveillance is world-famous and exported to many places, including South Sudan211 and the United States212.
The United States is also looking into ways to use domestically developed and imported facial recognition technologies at the US-Mexico border to track the identities of migrant children, collecting data that can be used over time. Eileen Guo of MIT Technology Review wrote: “That this technology would target people who are offered fewer privacy protections than would be afforded to US citizens is just part of the wider trend of using people from the developing world, whether they are migrants coming to the border or civilians in war zones, to help improve new technologies.”213 In addition to facial recognition, the United States is also collecting DNA samples from immigrants for a mass registry with the FBI214.
In 2021, US-headquartered companies Google and Amazon jointly signed an exclusive billion-dollar contract with the Israeli government to develop ‘Project Nimbus’, meant to advance technologies in facial detection, automated image categorization, object tracking, and sentiment analysis for military use – a move that was condemned by hundreds of Google and Amazon employees215 in a coalition called No Tech for Apartheid216. One AI-powered system called “Where's Daddy” identifies targets based on various criteria, one of which is whether the person is in a WhatsApp group with another suspected individual.217 The AI-powered systems ‘Lavender’ and ‘The Gospel’ (‘Habsora’) have been referred to as a “mass assassination factory” in Gaza, operating with minimal human oversight where the “emphasis is on quantity and not on quality”.218 The Israeli army also has ties with Microsoft219 for machine learning tools and cloud storage220. These examples are included here to show the imbalance of power within the greater systems of oppression at play. These tools and corporate ties are not accessible to all potential beneficiaries – it would be inconceivable for Google, Amazon, and Microsoft to sign these same contracts with, say, the Islamic Resistance Movement.

‘Smart’ weapons, nightmare fuel

Former US President Barack Obama is credited with normalizing the use of armed drones in non-battlefield settings.221 The Obama administration described drone strikes as “surgical” and “precise,”222 at times even claiming that the use of armed drones did not result in “a single collateral death,”223 when that was patently false.224 After Obama took office in 2009, drone strikes became commonplace, and their use in US international actions (in battlefield and non-battlefield settings) expanded further under subsequent administrations.225 Critics say the use of drones in warfare gives governments the power to “act as judge, jury, and executioner from thousands of miles away” and that civilians “disproportionately suffer” in “an urgent threat to the right to life”.226 In one example, the BBC described Russian drones as “hunting” Ukrainian civilians227.
In 2009, Human Rights Watch reported on Israel’s use of armed drones in Gaza228 – although drone use was considered a “well-known secret” in Israeli society until more recently229. In 2021, Israel started deploying “drone swarms” in Gaza to locate and monitor targets.230 In 2022, Omri Dor, commander of Palmachim Airbase, said, “The whole of Gaza is ‘covered’ with UAVs that collect intelligence 24 hours a day.”231 In Gaza, drone technology has played a major role in increasing the scale of damage and the number of targets, with hybrid drones such as “The Rooster”232 and “Robodogs”233 able to fly, hover, roll, and climb uneven terrain. Machine gun rovers have been used to replace on-the-ground troops.234
The AI-powered Smart Shooter, whose slogan is “one-shot, one-hit”, boasts a high degree of accuracy: “The system locks in on the target and follows it using artificial intelligence to process the image in real-time. The bullet exits the gun only when a hit is ensured.”235 The Smart Shooter was installed during its pilot stage in 2022 at a Hebron checkpoint, where it remains active to this day. Israel also employs ‘smart’ missiles, like the SPICE 2000236, which was used in October 2024 to bomb a Beirut high-rise apartment building237.
The Israeli military is considered one of the 20 most powerful military forces in the world238, with a defense burden second only to Ukraine’s239. Israel has claimed that it conducts “precision strikes” and does not target civilians240, but civilian harm expert Larry Lewis has said Israel’s civilian harm mitigation strategies have been insufficient, with its campaigns seemingly designed in ways that create risk to civilians.241 The aforementioned technologies employed by Israel have helped its military use disproportionate force242 to kill Palestinians in Gaza en masse, as an IDF spokesperson described: “we’re focused on what causes maximum damage.”243 While AI-powered technologies reduce boots on the ground, and therefore potential injuries and casualties for the military that deploys them, they greatly increase casualties among those being targeted. The Israeli military claims AI-powered systems “have minimized collateral damage and raised the accuracy of the human-led process,”244 but the documented results tell a different story. According to the UN Office for the Coordination of Humanitarian Affairs in the Occupied Palestinian Territory, between October 7th, 2023 and May 7th, 2025, Israel killed 52,653 Palestinians and injured another 118,897 people.245 Documentation reveals that at least 13,319 of the Palestinians who were killed were babies and children between 0 and 12 years of age.246 The UN's reports of Palestinian casualties are said to be conservative by researchers, who estimate the true death toll to be double247 or even more than triple248. According to one report: “So-called ‘smart systems’ may determine the target, but the bombing is carried out with unguided and imprecise ‘dumb’ ammunition because the army doesn’t want to use expensive bombs on what one intelligence officer described as ‘garbage targets.’”249 Furthermore, 92% of housing units in Gaza have been destroyed or damaged, as have 88% of school buildings and 69% of all structures across Gaza.250
In 2024, UN experts deplored251 Israel’s use of AI to commit crimes against humanity in Gaza. Regardless of all of the aforementioned information, that same year Israel signed a global treaty on AI252 developed by the Council of Europe for safeguarding human rights. That Israel has killed such a large number of Palestinians using AI-powered tools connected to technologies used in daily life, such as WhatsApp, is seen by some as a warning sign of what could one day befall them, but is seen by others as a blueprint for efficiently systematizing supremacy and control.
This piece posits that the lack of human oversight over data and AI tools is not the only issue – who collects, owns, controls, and interprets the data, and what their biases are (whether implicit or explicit), is a key part of understanding the actual and potential harm and abuse. Furthermore, focusing exclusively on technology in Israel’s war on Gaza, or any war for that matter, risks a major mistake: absolving the perpetrator of responsibility for the crimes they commit using technology. When the tools are over-emphasized, it becomes all too easy to redefine intentional abuses as machine-made mistakes.
When looking at technology’s use in geopolitics and warfare, understanding the power structures is key to gaining a clear overview. Finding the ‘goodness’ in ultra-specific uses of technology does little to offset the ‘bad’. For the human beings whose lives have been made more challenging, and whose conditions have been made dire, by the use of technology in domination, warfare, and systems of supremacy, there is not much that can be rationalized for the better. And the same can be said of other entities who use their advantages (geopolitical, technological, or otherwise) to assert control over others in relatively more disadvantaged and vulnerable positions. To divorce the helpful from the harmful applications of technology is to lose sight of the bigger picture of not only how tech could be used one day, but how it is actually being used right now.
Notice: This work is licensed under a Creative Commons Attribution 4.0 International Licence.

Endnotes

192 Hunter, Benjamin; et al. “The Role of Artificial Intelligence in Early Cancer Diagnosis.” Cancers (Basel). 2022 Mar 16;14(6):1524.
193 U.S. Centers for Disease Control and Prevention. “National Wastewater Surveillance System.” 2025.
194 Weisse, Mikaela. “New Radar Alerts Monitor Forests Through the Clouds.” Global Forest Watch Blog, 2021.
200 Priest, Dana; et al. “Private Israeli spyware used to hack cellphones of journalists, activists worldwide.” The Washington Post, 2021.
203 Marczak, Bill; et al. “A First Look at Spyware Vendor QuaDream’s Exploits, Victims, and Customers.” The Citizen Lab, 2023.
204 Marczak, Bill; et al. “Hooking Candiru: Another Mercenary Spyware Vendor Comes into Focus.” The Citizen Lab, 2021.
205 Dowsett, James. “Amid Protests, Georgia Plans to Purchase Israeli Data Extraction Tech.” Organized Crime and Corruption Reporting Project (OCCRP), 2025.
206 Gault, Matthew. “Israel Deploys AI-Powered Turret in the West Bank.” Vice, 2022.
208 Pratt, Simon Frankel. “When AI Decides Who Lives and Dies.” Foreign Policy, 2024.
212 Feldstein, Steven. “Governments Are Using Spyware on Citizens. Can They Be Stopped?” Carnegie Endowment for International Peace, 2021.
213 Guo, Eileen. “The US wants to use facial recognition to identify migrant children as they age.” MIT Technology Review, 2024.
215 Anonymous Google and Amazon workers. “We are Google and Amazon workers. We condemn Project Nimbus.” The Guardian, 2021.
219 Abraham, Yuval. “Leaked documents expose deep ties between Israeli army and Microsoft.” +972 Magazine, 2025.
221 Zenko, Micah. “Obama’s Final Drone Strike Data.” Council on Foreign Relations, 2017.
222 Brady, James S. “Press Briefing by Press Secretary Jay Carney, 1/31/12.” White House, 2012.
223 Woods, Chris. “New questions over CIA nominee Brennan's denial of civilian drone deaths.” The Bureau of Investigative Journalism, 2013.
224 Purkiss, Jessica; et al. “Obama’s covert drone war in numbers: ten times more strikes than Bush.” The Bureau of Investigative Journalism, 2017.
225 Wargaski, Robert. “U.S. Drone Warfare and Civilian Casualties.” Eagleton Political Journal, Rutgers University-New Brunswick, 2022.
226 Doyle, Adriana. “Drone Warfare is Eroding the Right to Life.” Berkeley Political Review, 2024.
229 Bob, Yonah Jeremy. “IDF general: Drones crucial in almost all Israeli missions.” The Jerusalem Post, 2022.
230 Gross, Judah Ari. “In apparent world first, IDF deployed drone swarms in Gaza fighting.” The Times of Israel, 2021.
233 Gilead, Assaf. “Israel recruits robot dogs for Gaza fighting.” Globes, 2023.
237 Hussein, Bilal. “Images capture the exact moments an Israeli bomb strikes a building in Beirut.” The Associated Press, 2024.
238 Koronka, Poppy. “The 20 Most Powerful Military Forces in the World.” Newsweek, 2021.
239 Elmas, Dan Shmuel. “Second only to Ukraine: The cost of Israel's defense burden.” The Jerusalem Post, 2024.
240 Abdulrahim, Raja. “Israel Called Them ‘Precision’ Strikes. But Civilian Homes Were Hit, Too.” The New York Times, 2023.
243 McKernan, Bethan; et al. “‘We’re focused on maximum damage’: ground offensive into Gaza seems imminent.” The Guardian, 2023.
245 UN Office for the Coordination of Humanitarian Affairs (OCHA) in the Occupied Palestinian Territory. “Reported impact snapshot | Gaza Strip (7 May 2025).” 2025.
246 Palestine Datasets. “Killed in Gaza.” Accessed January 2025.
247 The Economist. “How many people have died in Gaza?” 2025.
248 Khatib, Rasha; et al. “Counting the dead in Gaza: difficult but essential.” The Lancet, 2024.
249 Goodfriend, Sophia. “Why human agency is still central to Israel’s AI-powered warfare.” +972 Magazine, 2024.
250 UN Office for the Coordination of Humanitarian Affairs (OCHA) in the Occupied Palestinian Territory. “Reported impact snapshot | Gaza Strip (11 March 2025).” 2025.
