How Disinformation Gets Under Our Skin

Author: Ali Bilgic (Professor of International Relations and Middle East Politics, Loughborough University)

Over the past few years, I’ve increasingly found myself grappling with these stubborn questions: why do obviously false stories online capture people’s attention so powerfully? Why do some narratives linger long after fact‑checkers have disproved them? These questions stayed with me because, in my work on political psychology and security, I kept seeing the same recurring theme. People weren’t always drawn to disinformation because they trusted its accuracy. They were drawn to how it made them feel, how it seemed to speak to their anxieties and offer a comforting sense of clarity when the world around them felt messy or unstable. This realisation eventually grew into my latest research article, produced with the support of the Norwegian Research Council’s FAKENEWS project.

Article background

What I argue in the article is that disinformation behaves less like a faulty claim and more like a tool that works through emotion. When people feel uncertain or anxious about politics, cultural change, economic pressures, or simply the speed at which life now moves, disinformation steps in and offers an emotional anchor. It provides stories that tell people who is responsible for their frustrations, who is threatening their values, and who can restore a sense of order. The emotional pull is often stronger than any factual doubt. Something that feels true can be more persuasive than something that is true.

In this project I drew on elements of psychoanalysis, particularly Lacanian ideas about how individuals form a sense of self in relation to others. But the underlying insight is easy to put into everyday language: emotions shape our beliefs far more deeply than we tend to admit. When a person encounters an online story that mirrors their fears or frustrations, the emotional resonance arrives instantly. It is only later—if at all—that more deliberate thinking follows.

To ground this in something concrete, I looked at the so‑called “Great Replacement” conspiracy theory. Variations of this idea have circulated across Europe and North America for years, pushed by far‑right politicians, commentators, and online networks. The theory falsely claims that elites are engineering a demographic “replacement” of white populations by racialised minorities. What fascinated me is not the content—which is simply untrue—but the way it gives shape to a broad sense of unease. For people who already feel anxious about cultural change, economic insecurity or identity, the theory offers a story that transforms that anxiety into a clear emotional target. It names an enemy, explains the supposed “danger”, and promises a path to restoring stability.

Examples and implications

This pattern can be seen in many other examples: anti‑vaccine rumours, climate change denial, stories about stolen elections. All turn vague feelings of uncertainty into emotionally charged narratives. Once anger, fear, pride or resentment attaches itself to a story, the story becomes difficult to dislodge. People hold onto it not because every detail is convincing, but because it helps them make sense of their emotional world.

This has important implications for anyone concerned about healthy public debate. If disinformation spreads because it meets emotional needs, then our usual responses, such as correcting the facts, debunking falsehoods, or urging people to “think critically”, cannot be the whole answer. Those tools remain essential, but they don’t address the underlying emotional landscape in which disinformation thrives. If the emotional gap remains unfilled, the misleading narrative simply adapts and resurfaces somewhere else.

For me, this is where the broader impact of the research lies. It encourages us to approach disinformation not just as a technical or informational problem but as a deeply human one. The stories that spread most easily are those that offer comfort, coherence or a sense of belonging in moments of uncertainty. Understanding this does not excuse the harmful consequences of these narratives, but it does help us see why certain ideas take hold so quickly and why others—no matter how accurate—struggle to resonate. We need to take emotions seriously and explore how they interact with broader racialised (as in the case of the “Great Replacement”), gendered, sexualised and ableist power relations.

Emotional dynamics

My hope is that this work invites a shift in our conversations. Instead of assuming people are simply “gullible” or “misled”, we might ask what kinds of emotional pressures and insecurities make certain narratives appealing in the first place. If we can understand that, then we can begin to imagine responses that acknowledge people’s concerns without giving ground to harmful or exclusionary stories.

In a world shaped by rapid change and constant digital noise, these emotional dynamics will only grow more important. By recognising how disinformation taps into our deeper fears and desires, we can start to build societies that are not only better informed, but also more emotionally resilient.

Publication details: Bilgic, A. (2026). Ontological (in)security after truth: Disinformation as affective technology. Cooperation and Conflict, online first. https://doi.org/10.1177/00108367261422466

Bio: Ali Bilgic is a Professor of International Relations and Middle East Politics at Loughborough University and the Political Communication theme lead of CRCC. He is the author of Rethinking Security in the Age of Migration: Trust and Emancipation in Europe (Routledge, 2013; 2nd edition 2018), Turkey, Power and the West: Gendered International Relations and Foreign Policy (Bloomsbury, 2016) and Positive Security (co-authored with Prof Gunhild Hoogensen Gjorv, 2022; 2nd edition 2024).
