[Illustration: measurements of a face in the clouds]

‘Smart’ (or Machiavellian?) Surveillance

Tracking how technology is used to supercharge monitoring and control

This essay is part of “Digitized Dilemmas”, a multi-part series about technology and crisis. This part was written by Safa, and co-developed through discussions, research, and framing by Safa, Louise Hisayasu, Dominika Knoblochová, Christy Lange, Mo R., Helderyse Rendall, and Marek Tuszynski. Illustration by Tactical Tech, with visual elements from Yiorgos Bagakis and Alessandro Cripsta.
The history of control and censorship is long and far-reaching, and it neither begins nor ends with the examples cited in this essay. Privacy, control, and surveillance have long been explored in popular culture, as a reflection of the societies those works were created in. Ovid’s Metamorphoses, from the year 8 CE, collected many stories on these themes, such as The House of Rumour, which concluded: “Rumour herself sees everything that happens in the heavens, throughout the ocean, and on land, and inquires about everything on earth.”54 This short story touched on privacy and surveillance, framed as a haunting gossip trap. Skip ahead nearly 1,600 years and these themes recur in Shakespeare’s plays, such as the spying on Hamlet by his boyhood friends55. “A lot of the literature [during Shakespeare’s time] is about how awful informers were and how you might not be able to talk in public without somebody overhearing you.”56 George Orwell’s novel 1984 follows Winston Smith, who lives in a controlling society led by ‘Big Brother’ and works at the ‘Ministry of Truth.’ Smith eventually gets in trouble when an informant turns him in. From the classics, to Elizabethan England, to the 20th century, to today, storytelling arts such as theater and writing have caught hold of imaginations by capturing people’s concerns about these timeless troubles.
Surveillance, monitoring, and control have long been justified, and still are, under the guise of protection and security, but as professor Hannah Zeavin explained: “[c]are is a mode that accommodates and justifies surveillance as a practice, framing it as an ethical “good” or security necessity instead of a political choice.”57

Digitizing one billion times more data

Tactical Tech is based in Berlin, the former capital city of international espionage.58 Residents and visitors are regularly confronted with remnants of Germany’s destructive past. Without a doubt, immense abuse was inflicted by Nazi Germany across German-occupied Europe. From 1933 until 1945 the Gestapo (the Secret State Police) “was a key element in the Nazi terror system.”59 But after World War II ended, the apparatus of control didn’t go away – it changed form and evolved. The Ministry for State Security (also referred to as the Stasi) was the state security and secret police service of former East Germany (the German Democratic Republic, or GDR) from 1950 until 1990. It is known as one of the most repressive police organizations ever to have existed. Upon the dissolution of the Stasi, thousands of protestors occupied its Berlin headquarters and prevented its agents from destroying the records.60 What survives includes nearly two million photos61 and so many files that, laid end to end, they would stretch more than 111 kilometers62.
The Stasi also conducted international operations that had lasting effects abroad. They extensively trained the Syrian Mukhabarat (secret police) under Hafez al-Assad: “methods of interrogation, infiltration, disinformation and brutal extraction of confessions were meticulously hammered into the minds of Syrian intelligence officials by senior Stasi agents.”63 With the fall of the Berlin Wall and the GDR, the Stasi was dissolved and East and West Germany reunified.
While Germany has taken some steps to reckon with its past, surveillance is still ever-present. In 2021, Human Rights Watch raised concerns over amendments to two laws that granted more surveillance powers to the federal police and intelligence services.64 While Germans have lived through a long and persistent history of surveillance, and have gained a reputation for taking privacy issues very seriously, this attitude has shifted over time. A 2017 study that surveyed over 5,000 Germans on various privacy-related topics found that “Germans consider privacy to be valuable, but at the same time, almost half of the population thinks that it is becoming less and less important in our society.”65 This is exemplified by the Google Street View debacle: Street View struggled to operate in Germany for years due to privacy concerns, but finally gained traction in 2023.66
Although the Stasi are world famous for their surveillance and data collection, today’s law enforcement landscape is a smorgasbord of data. The Stasi versus NSA visualization, developed in 2013, compares the data collected by the two entities – estimating that “the NSA can store almost 1 billion times more data than the Stasi”67. With modern technologies like algorithms and access to digitized data ranging from health conditions to search queries and private chats, it is easier than ever to get not just a glimpse but a full picture of the life of nearly anyone. As Amnesty International reported, “[T]he Stasi archive is a timely warning of the potential consequences of unchecked surveillance. It shows how quickly a system for identifying threats evolves into a desire to know everything about everyone.”68 Tactical Tech’s project The Glass Room has explored this topic through the years, describing: “There is a growing market for technologies that promise increased control, security, and protection from harm. At the same time, they can normalize surveillance at a macro and micro level – from the shape of a child’s ear to satellite images of acres of farmland. Often, those who need the most support may have the least control over how or when their data is being used.”69 The Glass Room’s ‘Big Mother’ exhibit adapts the Big Brother imagery into a more nurturing figure, a mother, exemplifying how easily people let their guards down when data tracking is framed as helpful and caring. This can be seen in advertisements for tech products such as devices that help people monitor elderly relatives via an app,70 fertility tracking apps,71 and refugee and asylum-seeker biometrics registries72. The US and Israel are among the world’s biggest suppliers of surveillance tech73 – including the US-based Palantir74 and Israel’s NSO Group75 and Elbit Systems76 – used by governments at the US-Mexico border,77 in Central America,78 and in Europe79.

Normalizing surveillance in daily life

The so-called ed-tech industry has been gaining traction for years, since even before the COVID-19 pandemic. Ed-tech describes the numerous technological innovations marketed to schools as benefiting students, teachers, and school administrators. Not all ed-tech is the same, and there are efforts to bring digitization to schools to reduce the digital divide, which is felt most acutely in rural and low-income areas80. That said, some of the digital tools used by school administrators can equally act as tools of surveillance. These include recording children at daycare81, using AI to analyze body and eye movements during exams82, and monitoring students’ social media83. So much monitoring is not without consequence, especially for traditionally marginalized groups. One study reported that student surveillance technologies put Black, Indigenous, Latine/x, LGBTQ+, undocumented, and low-income students, as well as students with disabilities, at higher risk.84 In 2023, the ACLU interviewed teens aged 14-18 to capture their experiences of surveillance in schools. One participant reflected: “...we treat kids like monsters and like criminals, then ... it’s kinda like a self-fulfilling prophecy.”85 In 2017 the Electronic Frontier Foundation warned: “Ed tech unchecked threatens to normalize the next generation to a digital world in which users hand over data without question in return for free services—a world that is less private not just by default, but by design.”86 Some students and parents have pushed back, and in some cases have successfully blocked certain technologies from being used in schools87.
Workers are also feeling watched. Between 2020 and 2022, the number of large employers using employee monitoring tools doubled.88 And it isn’t only the well-known control mechanisms Amazon uses on its warehouse workers89 – the average office worker may also be affected. A 2023 study of 2,000 employers found that over three-quarters of them were using some form of remote work surveillance on their workers90. Employers keep track of their employees using methods such as internet monitoring91, fingerprint scanners92, eye movement tracking93, social media scraping94, and voice analysis95, among others. “We are in the midst of a shift in work and workplace relationships as significant as the Second Industrial Revolution of the late 19th and early 20th centuries,” according to the MIT Technology Review. “And new policies and protections may be necessary to correct the balance of power.”96

Even cars can be turned into tools of surveillance. Getting to work and dropping off the kids at school may now take place in a data-mining automobile. In 2023, 84% of car brands were found to sell or share personal data with data brokers and businesses.97 That same year, news broke that Tesla employees had been sharing private camera recordings captured in customers’ cars among themselves in chat rooms. This didn’t happen just once or twice but many times between 2019 and 2022. The videos included nudity, crashes, and road-rage incidents – some were even “made into memes by embellishing them with amusing captions or commentary, before posting them in private group chats.”98 In 2024, Volkswagen was responsible for a data breach that left the precise locations of hundreds of thousands of vehicles across Europe exposed online for months.99 In the US, researchers found that some license plate reader cameras were live-streaming video and car data online.100

In early 2025, Tesla executives handed over dashcam footage to Las Vegas police to help find the person responsible for the Tesla Cybertruck that exploded outside the Trump International Hotel (the perpetrator had used ChatGPT to plan the attack101).102 While this particular case and the actions of Tesla executives were generally applauded in the media, it does raise questions about the broader issues of surveillance, the application of the law, and the limits of privacy. Researchers noted about data tracking more broadly that “tactics and tools already used by law enforcement and immigration authorities could be adapted to track anyone seeking or even considering an abortion.”103 Finding more ways to document and track people can also play out in ever more menacing ways under different political administrations and in contexts with even fewer protections for marginalized groups.


Oppressive systems targeting marginalized identities

Surveillance is a multi-tiered panopticon experienced in daily life, especially by people who live in cities with a high density of CCTV cameras. A November 2020 report by Netherlands-based company Surfshark ranked the most surveilled cities in the world by CCTV camera density per square kilometer. The top five cities listed were Chennai, India (657 cameras per km²), Hyderabad, India (480 per km²), Harbin, China (411 per km²), London, England (399 per km²), and Xiamen, China (385 per km²). However, they note that Beijing has the most cameras overall, with over 1 million CCTV cameras across the whole city.104 A 2023 report by Amnesty International mapped the visible Israeli surveillance system and found one or two CCTV cameras every five meters in Jerusalem’s Old City and Sheikh Jarrah in East Jerusalem105. Notably, although the Surfshark researchers mapped CCTV infrastructure in 150 countries, their report somehow did not cover the CCTV cameras in Israel, in Jerusalem’s Old City or East Jerusalem, or in other highly surveilled parts of the Occupied Palestinian Territories, such as Hebron. Israel’s surveillance industry is world famous106, 107, raising questions about why Surfshark did not include it in their report.
Since 2020, Israel’s military-run ‘Wolf Pack’ has been in use – a vast and detailed database profiling virtually all Palestinians in the West Bank, including their photographs, family connections, education, and more108. The ‘Wolf Pack’ includes ‘Red Wolf,’ ‘White Wolf,’ and ‘Blue Wolf’ tools:
  • Red Wolf: The ‘Red Wolf’ system is part of the Israeli government’s official CCTV facial recognition infrastructure used to identify and profile Palestinians as they pass through checkpoints and move through cities. It has been reported that Israel’s military uses ‘Red Wolf’ in the Palestinian city of Hebron.109 According to a project by B’Tselem and Breaking the Silence, the Israeli military has set up 86 checkpoints and barriers across the 20% of Hebron, referred to as ‘H2’, that is under Israeli military control.110 The checkpoints are hard to avoid in Hebron’s ‘H2’ – Palestinians living there “go through a checkpoint in order to buy groceries and again to bring them home”111 and 88% of children cross checkpoints on their way to and from school.112
  • White Wolf: Another app, called ‘White Wolf,’ is available to official Israeli military personnel113 guarding illegal settlements114 in the West Bank, allowing them to search the database of Palestinians115. Since Israel’s war on Gaza began after the October 7, 2023 attacks on Israelis by the Islamic Resistance Movement (aka Hamas), Israel has rolled out a similar facial recognition registry of Palestinians in Gaza116.
  • Blue Wolf: Using the app called ‘Blue Wolf,’ the Israeli military has been compiling a massive biometric registry of Palestinians, often at checkpoints and at gunpoint, sometimes at people's private homes in the middle of the night117. Israeli soldiers take pictures of Palestinians, including children, sometimes by force.118 Soldiers also note within the app any “negative impressions [they] have of a Palestinian’s conduct when encountering them.”119 One source added: “It’s not that the military has said, let’s make the Blue Wolf so [the Palestinians] can pass through more easily. The military wants to enter the people into its system for control.”120
A 2025 article also revealed how the Israeli military was using a large language model (the kind of AI model that underpins tools like ChatGPT) to surveil Palestinians. One Israeli intelligence source stated: “I have more tools to know what every person in the West Bank is doing. When you hold so much data, you can direct it toward any purpose you choose.”121 While the Israeli military is not the only government-sanctioned example of training AI tools on civilian data, it offers an important insight into how the latest technologies can be adopted for widespread monitoring and control.
Many of the aforementioned tools of monitoring and control, such as CCTV, rely on facial recognition technology, which automatically extracts unique facial data, including measurements such as the distance between the eyes, width of the nose, depth of the eye sockets, shape of the cheekbones, and length of the jawline.122 Facial recognition is used by governments, police, and other agencies around the world, and with significant results. Several cases of this technology’s benefits and harms emerge from the US alone. One unprecedented operation by US law enforcement resulted in hundreds of children and their abusers being identified in just a three-week period123. This technology has also been used to find missing and murdered Indigenous people (MMIP), helping 57 families find answers in just three years124. While these results are indeed remarkable and reveal the ways in which technologies can be applied to help people, there have also been numerous cases of facial recognition being used by US law enforcement in ways that have harmed people. An app called CBP One, which asylum seekers at the US-Mexico border are required to use, obliges people to register themselves in a facial recognition system. But that system “[fails] to register many people with darker skin tones, effectively barring them from their right to request entry into the US.”125 The systems centralizing the data of asylum-seekers and migrants make longitudinal tracking of children possible126. Facial recognition technologies are also used by ICE (the US’s Immigration and Customs Enforcement agency) to monitor and surveil people awaiting deportation hearings127.
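To make the measurement-based idea described above more concrete, here is a minimal, purely illustrative Python sketch. The landmark names, coordinates, the normalization by inter-eye distance, and the comparison function are all invented for this example – real facial recognition systems today typically rely on learned neural-network embeddings rather than hand-picked distances – but it shows how a handful of facial measurements can be reduced to a comparable numerical ‘signature’:

```python
# Illustrative sketch only: turning a few invented facial landmarks into a
# comparable "faceprint". Real systems use learned embeddings, not these toy distances.
from itertools import combinations
from math import dist

# Hypothetical 2D landmark positions (in pixels) detected on one face image.
landmarks = {
    "left_eye": (102.0, 140.0),
    "right_eye": (168.0, 138.0),
    "nose_tip": (135.0, 190.0),
    "left_jaw": (80.0, 250.0),
    "right_jaw": (190.0, 248.0),
    "chin": (136.0, 300.0),
}

def faceprint(points):
    """All pairwise distances between landmarks, normalized by the inter-eye distance."""
    eye_dist = dist(points["left_eye"], points["right_eye"])
    return [dist(points[a], points[b]) / eye_dist
            for a, b in combinations(sorted(points), 2)]

def similarity_gap(a, b):
    """Sum of absolute differences between two faceprints (smaller = more alike)."""
    return sum(abs(x - y) for x, y in zip(faceprint(a), faceprint(b)))

if __name__ == "__main__":
    # A second, hypothetical face whose nose tip sits slightly lower.
    other = dict(landmarks, nose_tip=(135.0, 200.0))
    print(similarity_gap(landmarks, landmarks))  # 0.0 – identical measurements
    print(similarity_gap(landmarks, other))      # small positive gap – similar but not identical
```

The point of the sketch is simply that once a face is reduced to numbers like these, it can be stored, indexed, and compared against millions of other records at machine speed – which is what allows the databases and checkpoint systems described above to scale so easily.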
In one study on facial recognition systems, MIT researcher Joy Buolamwini found that “darker-skinned females are the most misclassified group (with error rates of up to 34.7%). The maximum error rate for lighter-skinned males is 0.8%.”128 Harvard researcher Alex Najibi described how “Black Americans are more likely to be arrested and incarcerated for minor crimes than White Americans. Consequently, Black people are overrepresented in mugshot data, which face recognition uses to make predictions,” explaining how Black Americans are more likely than White Americans to become trapped in cycles and systems of racist policing and surveillance.129 This sentiment is echoed in a report by the project S.T.O.P. - The Surveillance Technology Oversight Project130. The UK131 and China are also among the countries that practice ‘predictive policing.’ One researcher focusing on China argues that it is “a more refined tool for the selective suppression of already targeted groups by the police and does not substantially reduce crime or increase overall security.”132 So the issue here is not simply flawed datasets – it is the discrimination that already exists in society, in which people who hold positions of power or command police or military force can use technology to intensify their oppression of certain groups. Larger datasets will not remedy or negate the problem of people acting on discrimination, racism, or other types of bias and hatred.
Algorithms are made by people (who inherently have their own biases133) and are built using our data134... and the tools trained on our data can be used to harm other people. Algorithms, too, are used by governments, police, and other agencies around the world. Tools and services from Google, Amazon, and Microsoft have all been used by Israel in its war on Gaza135. In the United States, algorithms have been used to score how likely people who have committed crimes are to commit future crimes. But researchers have found these algorithms to be “remarkably unreliable” and to carry significant bias in their design and implementation136. In Spain, an algorithm was used to predict how likely a domestic abuse survivor would be to be abused again, with the intention of directing support and resources to the people who need them most urgently in an overburdened system. But the algorithm isn’t perfect, and over-reliance on such flawed tools in high-stakes situations has had dire consequences. In some cases, survivors mislabelled as “low risk” have been murdered by their abusers despite their best efforts to seek help and report the abuse to authorities137. In the Netherlands, tax authorities used an algorithm to help them identify child care benefits fraud; tens of thousands of lower-income families were wrongly penalized, many fell into poverty, and more than a thousand children were wrongfully put into foster care. “Having dual nationality was marked as a big risk indicator, as was a low income [...] having Turkish or Moroccan nationality was a particular focus.”138 Algorithmic bias has been covered in depth by researchers such as Cathy O’Neil139.
Amid all the obscurity, there are some who are trying to shed light. WikiLeaks, founded by Julian Assange140, publishes leaked classified documents online141. The largest such leak in US history happened via WikiLeaks, when Chelsea Manning shared hundreds of thousands of military and diplomatic documents that exposed serious abuses perpetrated by the US in Iraq and Afghanistan142. Frances Haugen exposed thousands of problematic internal Facebook files143 to The Wall Street Journal and the US Securities and Exchange Commission144. The Facebook files included internal discussions and reports about how Facebook’s algorithm was negatively affecting societies around the world145. Edward Snowden shared over 1.7 million highly classified files146 with journalists at The Washington Post and The Guardian, illustrating the extent to which the US National Security Agency (NSA) spies on its own citizens as well as foreign leaders, including US allies such as Germany147. Whistleblowers who expose details about these technology companies, agencies, and systems give the public more information about how such mechanisms affect our everyday lives and wider society – but they are also met with controversy and debate.
Resources have been developed to document surveillance apparatuses, such as EFF’s Atlas of Surveillance148 and Decode Surveillance NYC149. Guides like EFF’s Surveillance Defense for Campus Protests150 provide university students with practical tips and an understanding of their legal rights, giving them information as a means of resistance. “Facial recognition for identification is mass surveillance and is incompatible with the rights to privacy, equality, and freedom of assembly [...] Banning facial recognition is a first step toward dismantling racist policing.”151
The terms used to describe technologies can shape how we think about them. The word “smart” has a positive connotation in most cases – but when it comes to technology, “smart” is usually used interchangeably with “efficient”. Imagine if, instead of calling systems of surveillance “smart”, we called them “Machiavellian” – how might that change our discourse around them, and our acceptance and adoption of them? As researcher Carlos Delclós said: “Privacy is not merely invaded; it is obliterated, as human lives are fragmented into datasets optimised for corporate gain,”152 and the same message can be extended to political gain. Regardless of whether we describe technology in positive or negative terms, at the end of the day, the technology itself cannot be separated from the operators (i.e. the humans) who deploy it. If the people who use these technologies also inhabit societies and work within systems with documented patterns of discrimination and/or control, it seems quite possible that the tech will be used to cause harm. We don’t even need to imagine it. We can simply look around with both eyes open.
Notice: This work is licensed under a Creative Commons Attribution 4.0 International Licence.
Endnotes
54 Ovid (transl. by Kline). “Metamorphoses: Bk XII:39-63 The House of Rumour.” Published on The University of Virginia E-Text Center.
55 Shakespeare. “Hamlet: Act 2, Scene 2.” Folger Shakespeare Library.
56 Massey News. “Surveillance in Shakespeare - the plot thickens.” Massey University, New Zealand, 2017.
57 Zeavin, Hannah. “How Parenting Tech Opens the Door to State Surveillance.” WIRED, 2023.
58 German Spy Museum Berlin. “The capital of Spies in Cold War Berlin.” Accessed January 18, 2025.
59 McDonough, Frank. “Careless whispers: how the German public used and abused the Gestapo.” The Irish Times, 2015.
61 The Federal Archives. “Stasi Records Archive: Tasks and Structure.” Accessed January 2025.
62 Marsh, Sarah. “Stasi files still cast shadow for Germans.” Reuters, 2009.
63 Iskandar, Marwan. “Rafiq Hariri and the fate of Lebanon.” (page 201). Saqi Books, 2006.
64 Fischer, David. “Germany’s New Surveillance Laws Raise Privacy Concerns.” Human Rights Watch, 2021.
65 Masur, Philipp K.; et al. “Privacy attitudes, perceptions, and behaviors of the German population.” Hohenheim University, 2017.
67 “Stasi versus NSA.” Accessed January 2025.
71 The Glass Room. “Ovia.”
72 The Glass Room. “UNHCR & CSIR.”
76 Del Valle, Gaby. “DHS wants $101 million to upgrade its border surveillance towers.” The Verge, 2024.
77 Elbit America. “Next-Gen Border.”
78 García, Sussan. “Central American and Palestinian Liberation Struggles are Intertwined.” Contra Corriente, 2024.
79 Roussi, Antoaneta. “Pegasus used by at least 5 EU countries, NSO Group tells lawmakers.” Politico, 2022.
81 The Glass Room: San Francisco. “Live School.” Tactical Tech, 2019.
86 Gebhart, Gennie; et al. “Spying on Students: School-Issued Devices and Student Privacy.” Electronic Frontier Foundation, 2017.
88 Turner, Jordan. “The Right Way to Monitor Your Employee Productivity.” Gartner, 2022.
89 Oxfam. “Is Amazon a good place to work?” 2024.
90 Hendry Parsons, Lauren. “78% of employers engage in remote work surveillance, ExpressVPN survey finds.” ExpressVPN, 2023.
93 West, Darrell M. “How employers use technology to surveil employees.” Brookings, 2021.
94 McGregor, Jena. “This software start-up can tell your boss if you’re looking for a job.” The Washington Post. 2021.
96 Ackermann, Rebecca. “Your boss is watching.” MIT Technology Review, 2025.
97 Caltrider, Jen; et al. “It’s Official: Cars Are the Worst Product Category We Have Ever Reviewed for Privacy.” Mozilla Foundation, 2023.
98 Stecklow, Steve; et al. “Tesla workers shared sensitive images recorded by customer cars.” Reuters, 2023.
106 Hempel, Jonathan. “The watchful eye of Israel’s surveillance empire.” +972 Magazine, 2022.
110 B’Tselem and Breaking the Silence. “The settlement in Area H2, Hebron.” 2020.
111 Gessen, Masha. “A Guided Tour of Hebron, from Two Sides of the Occupation.” The New Yorker, 2019.
112 The United Nations Relief and Works Agency for Palestine Refugees in the Near East (UNRWA). “Hebron H2 - Background and Key Protection Issues.” 2022.
113 Interview with an Israeli lieutenant of The Civil Administration unit. “Testimony › They scan the face.” Breaking the Silence, 2021.
115 Seibt, Sébastian. “How Israel uses facial recognition to monitor West Bank Palestinians.” France 24, 2021.
116 Frenkel, Sheera. “Israel Deploys Expansive Facial Recognition Program in Gaza.” The New York Times, 2024.
120 Interview with an Israeli first sergeant of the Nahal unit, 50th Battalion. “Testimony › The military wants to enter the people into its system for control.” Breaking the Silence, 2020.
122 What the Future Wants. “The Real Life of Your Selfie.” Tactical Tech, 2022.
124 Thompson, Darren. “Retired Police Officer Launches Nonprofit to Search for Missing Indigenous People.” Native News Online, 2023.
126 Guo, Eileen. “The US wants to use facial recognition to identify migrant children as they age.” MIT Technology Review, 2024.
129 Najibi, Alex. “Racial Discrimination in Face Recognition Technology.” Harvard Graduate School of Arts and Sciences, Science in the News, 2020.
130 Manis, Eleni; et al. “Seeing is Misbelieving.” S.T.O.P. - The Surveillance Technology Oversight Project, 2024.
131 Liberty. “Predictive policing.”
132 Sprick, Daniel. “Predictive Policing in China: An Authoritarian Dream of Public Security.” NAVEIÑ REET: Nordic Journal of Law and Social Research, 2019.
133 Turner Lee, Nicol; et al. “Algorithmic bias detection and mitigation: best practices and policies to reduce consumer harms.” Brookings Institution, 2019.
134 Morrison, Sara. “The tricky truth about how generative AI uses your data.” Vox, 2023.
135 Abraham, Yuval. “Leaked documents expose deep ties between Israeli army and Microsoft.” +972 Magazine, 2025.
136 Angwin, Julia; et al. “Machine Bias.” ProPublica, 2016.
137 Satariano, Adam; et al. “An Algorithm Told Police She Was Safe. Then Her Husband Killed Her.” The New York Times, 2024.
139 O’Neil, Cathy. “Do algorithms perpetuate human bias?” NPR, 2018.
141 “WikiLeaks.”
145 Hao, Karen. “The Facebook whistleblower says its algorithms are dangerous. Here’s why.” MIT Technology Review, 2021.
148 Electronic Frontier Foundation (EFF). “Atlas of Surveillance.” Accessed March 1, 2025.
150 Mir, Rory; et al. “Surveillance Defense for Campus Protests.” Electronic Frontier Foundation, 2024.
151 Amnesty International. “Inside the NYPD’s Surveillance Machine.” Accessed March 1, 2025.
152 Delclós, Carlos. “Countering Digital Colonialism.” CCCB LAB, 2025.