Programme EVENT
SAT 18 SEP, 7:00–9:00 pm

The Art of Composition

Collaborative investigations against weaponized AI


Marit Seyer and worldwide members of the Stop Killer Robots Coalition, Júlia Nueno Guitart, Forensic Architecture, Rebecca Mignot-Mahdavi, Interest Group on International Law and Technology (ESIL), MC: Siavash Eshghi, Heinrich Böll Foundation.

Performance: Daphna Horenczyk, Soundscape: Brando Zores Szely, Tribunal song: Volkmar Klien, Special appearance: Soulcat


AI has entered the realm of armed conflict. Behind the glossy PR about how AI facilitates and simplifies our lives and businesses, darker experiments are under way. They happen in front of our eyes, yet we cannot see them. What is being initiated is a shift from AI gamification to AI weaponization.
Together with our expert collaborators, we ask: What does it take to resist abuses of power that are escalated to the scale of populations and nation states? How can counter-investigations and counter-alliances operate in order to successfully expose the use of AI models and infrastructures in conflicts?

Conflict resolution is replaced by armed conflict loaded with AI. Even games are not just for fun: besides their gamification – the integration of game-design elements to motivate and engage users – their deeper layers can conceal instructions for war and other atrocities, be they military or terrorist. Ever more people, and other beings, are victims of yet another contingency, another assault, another crime against life. Unwittingly or brazenly, social and other media reinforce such sociopathic tendencies.

AI weaponization has become a global issue, as our partner, the Stop Killer Robots Coalition, can testify. AI-supported repression by governmental, military, and corporate forces is directed against individuals as well as entire populations, including a state’s own citizens, as research in human rights and international law shows. In some cases, AI has already been weaponized to extremely worrying degrees, as investigations such as those of Forensic Architecture into cases of individuated mass killings and genocide demonstrate.

But even when we cannot see how AI is made to operate, there are ways to uncover and visualize it. Indeed, evidence-based investigations are the basis of public knowledge in non-transparent contexts. As such, counter-investigative alliances are a necessary means of challenging powers that manipulate, exclude and even eliminate people.

Regardless of one’s political beliefs or ideological leanings, the notion that AI is on the verge of making life-and-death decisions without human oversight – or may already have crossed that threshold – should be profoundly unsettling to anyone still committed to rational and humane solutions to conflicts. Such a commitment stands in stark contrast to transhumanist visions of unfettered control, in which technological elites steer society toward a corporate-driven future governed by anarcho-authoritarian ideals.

AI weaponization must be countered with all the means available to a democratic society. We must finally realise how powerful we are if we act together and use our institutions properly. No one is above the law – and neither are machines.