The quest to understand influencing machines
Marek Tuszynski, Executive Director and co-founder of Tactical Tech. September 2024
As much as we live in the polycrisis, where things change on a whim (global temperatures, pollution, land degradation, economic precariousness, expanding possibilities for regional and global conflict, increasing polarisation and confusion about where the problems are and what solutions we can apply to them), we also live in the digital realm of influence, which, in short, may be much less helpful in this polycrisis than one might think. And we all want to influence something in one way or another, or so we hope.
Influence in the context of crises isn’t new, of course. Since ancient Egypt, those in control of information and knowledge have mastered various ways of using difficult-to-understand events (floods as an anomaly) or inexplicable occurrences (solar eclipses as a phenomenon) to manipulate and control public opinion - with one purpose in mind: to maintain the status quo of power. It may seem that we live in times when this should be harder to do - but perhaps it is not, regardless of how atomised power has become. A polycrisis, where multiple fears overlap, is a fertile environment for taking these ancient skills to new levels.
We - the users of the digitised world - seem to be pursuing two different directions when trying to understand the waxing and waning of various crises. The first path is to follow some sort of mainstream media, or what's left of it, on platforms that are deteriorating before our very eyes. This might apply to those in the middle age range who started using the web when it promised to be a step up from the media sphere of the early nineties - who are now thinking: this information revolution isn’t as democratising as we thought it would be. Some of this middle generation are nostalgic, occasionally buying a print version of the media that they can browse with their flat latte whilst trying to remember how to read and fold such a large set of papers. When on the internet, they look for favourite corners to park themselves in, surrounded by other people like them, discussing the same points of view.
Then there is the other way - the rabbit hole, where the rabbit is actually a cat, and the hole is actually an endless sparkly rainbow. This seems to apply to both younger and older people – one group often characterised as native, the other as naive. Both are somehow too late or too early, with too little or too much trust in the new formats through which information about the crises they are trying to navigate is delivered. Generalisations really do not make sense, especially generational generalisations. Still, there is one thing to say here - this group would rather stitch together their own narratives around the crises they are experiencing out of whatever they trust more - distributed, decentralised, uncensored, recommended and random mixes of sources – which they think is good because it is against the mainstream. It is the new mainstream. The thing is, they own it – it is their curation, and it increasingly becomes their voice. For all their creativity, somehow, all three approaches - nostalgic, native and naive - are broken. In the end, things make less sense than before the (reading, looking, watching, listening) journey started.
Rather than following the stories coming out of the tech industry (which has monopolised narratives about technology and crises – think scary or impossible stories such as man on Mars or AI killing us all), we should not lose sight of other, less headline-grabbing developments and allow ourselves to be a little less trendy and less focused on the here and now. Instead, we need to look at what is being normalised in the information sphere while we deal with all these imposed and competing crises, and what the long-term implications are for how we live as inhabitants of the world in different contexts with different privileges and limitations. We need to examine what tools we use as societies to understand these crises - what's good for us and what's not - what tools we use to discuss and deliberate our options, and what tools we should be trying to use to come up with viable solutions.
We need to rethink how we deal with information in the context of crises.
Conflict, climate, economic catastrophe, global public health meltdowns - how we get information, what we believe and what we don’t, and how we sort things out together matters. This goes for people whose understanding of technology might have stopped at the last time they used a tape recorder or perhaps a Walkman, as well as those who have never encountered one. These individuals have lost the ability to verify information during evolving crises - either because anything can now imitate what they associate with quality and trust (like news desks, graphs, or authoritative titles) or because they never had the opportunity to develop strong media literacy or cognitive judgment, being new to the world and still navigating the landscape before them. In a world where everything becomes news, we exist in a state of constant urgency, leading to new ways of consuming this urgency while our capacity to assess what is truly important withers away. The perception of the world, shaped by these bursts of urgency, pushes us to seek quick solutions and extreme expressions of our views. This, of course, benefits those who profit from disorientation and chaos.
We need to rethink how we deal with information in the context of crises. All of us, really. We, the digital users of the digitised information world, need to get a grip on things. A simple fact that all generations - nostalgic, native and naive - seem to agree on. How we explore the world, what and who we trust, and why, in the context of crises, has gotten too far out of our hands and needs to change. We need a place to come together. We need to reinvent how we talk about and solve difficult things while respecting any fundamental differences we might come to the table or screen with.
A citizens' situation room is one of the ways in which we can address some of these issues (watch out for this project, currently being developed by Tactical Tech). We need to get together and, on the one hand, gain a better perspective on how narratives are shaped and promoted around the things we care about in our communities, societies and homes; and, on the other, work out how to deal with issues and problems directly without falling down rabbit holes, conspiracy fiction and unsatisfactory repetitions of over-washed and shrink-wrapped stories – all the while disagreeing about whose way of seeing things is right.
The advent of generative AI and synthetic media dramatically worsens the information sphere.
But there is another layer to this story: how influence moves through this digital mediascape, now supercharged by AI.
To untangle this mess, we need to get some more things out on the table. First, polarisation, confusion and mistrust are highly profitable and extremely fertile if you are in the business of influence. This is particularly the case when things change quickly and are unpredictable, as in a crisis. This is, by now, an often repeated and widely recognised problem, but somehow – despite all the bad press (ironically) and all the regulators looking at the question – it is not being fixed. Instead, more fuel is being added to the fire. The advent of generative AI and synthetic media dramatically worsens the information sphere, deepening the current state of information disorder - where everything we perceive with our vital knowledge-building senses can be faked.
It is not only that fake is bad. It is also that what is not fake can be accused of being fake - if both look the same. You believe what you want to believe because the work of separating the fake from the real is futile. That in itself is a huge problem. In the context of crises, it’s an even bigger problem, because these crises are things that nobody wants to go through anyway. They disrupt the order of things, they change the way things need to be done, and they challenge our comfort zones – triggering fight, flight or freeze responses. Each new technology seems to be equally damaging to our information universe and requires scrutiny, as well as new strategies for figuring out what to do about it. We are all slowly being trained to become intuitive thinkers instead of perfecting our analytical thinking (1), which does not help us come together or bridge any divides.
Second, we need to talk about the influence industry, because so much of the information that flows to and through us arrives by design - often through advertising, but not only. We at Tactical Tech know from our work on the Influence Industry (2) that the nature of influence has changed and accelerated in recent years. We've already mapped its core, but for now, just imagine hundreds of clones of Cambridge Analytica. If you can't remember what they did wrong: apart from accessing the personal data of millions of users taken from Facebook, they pioneered a set of micro-targeting, manipulative influence tactics for political campaigning with no regard for ethics, consent or credibility. We have taken a close look at the methods used by this industry (also known as ‘martech’).
The technologies are often the same as those designed to sell us things; here, however, they are applied to ideas, opinions and influence around the things that connect and divide us, such as politics and big issues like climate or abortion. The scale and scope of influence industry operations is vast. We at Tactical Tech have looked at them in the context of local, regional, parliamentary and presidential elections. We have indexed and verified their claims, looked at who their clients are, recorded their spending and built a network of partners to monitor their work during elections in almost 40 countries. Since we started working on the influence industry and trying to understand these ‘influence machines’, we have also seen a massive acceleration in the tools available to it, in particular the emergence of ubiquitous machine learning solutions, or so-called AI tools. This matters because crises are used not only to divide and sort individuals and communities but also, increasingly, to attract votes or drive them away.
There is a desire for a better understanding of how information about crises comes to the fore and why.
In this context, there is another need for reconnection. Our experience of working in public spaces and engaging with all kinds of audiences in all sorts of places, in hundreds of cities large and small around the world, is that there is no shortage of desire for a better understanding of how information about crises comes to the fore and why. We are divided in many ways – firstly, by the way we use digital technologies to access information and form opinions, and secondly, by the underlying business models that provide us with all these tools - tools that monetise our attention, an attention that grows as crisis and confusion intersect. This is not neutral territory but the battleground of sophisticated influence machines fighting for two gains - influence (who controls politics) and money - because, in the end, it always seems to be about money.
As dependent as we are on digital tools and toys, we should persist in trying to do at least two things: first, regain control over what tools we want and what we want them for - including who makes them, how they make them and what the makers get out of it - quite basic, really; and second, experiment with the tools we want and the spaces we need for the things we care about, rather than visiting dodgy halls of mirrors where everything is twisted and upside down and not even funny after a while.
All images by Marek, from the series “concrete”, 2024