Imagine waking up to the news that a deadly new strain of flu has emerged in your city. Health officials are downplaying it, but social media is flooded with contradictory claims from “medical experts” debating its origin and severity.
Hospitals are overwhelmed by patients with flu-like symptoms, preventing others from accessing care and ultimately leading to deaths. It gradually emerges that a foreign adversary orchestrated the panic by planting false information – for instance, that the strain has a very high death rate. Yet despite the casualties, no rules define this as an act of war.
This is cognitive warfare, or cog war for short, in which the cognitive domain is exploited both on the battlefield and in hostile attacks below the threshold of war.
A classic example of cog war is the concept of “reflexive control” – an art refined by Russia over many decades. It involves shaping an adversary’s perceptions to your own benefit without them realising that they have been manipulated.
In the context of the Ukraine conflict, this has included narratives about historical claims to Ukrainian land and portraying the west as morally corrupt.
Cog war serves to gain advantage over an adversary by targeting attitudes and behaviour at the individual, group or population level. It is designed to modify perceptions of reality, making “human cognition shaping” into a critical realm of warfare.
It is therefore a weapon in a geopolitical battle fought through interactions across human minds rather than in physical domains.
Because cog war can be waged without the physical damage regulated by the current laws of war, it exists in a legal vacuum. But that doesn’t mean it cannot ultimately incite violence based on false information or cause injury and death by secondary effects.
Battle of minds, bodily damage
The notion that war is essentially a mental contest, where cognitive manipulation is central, harks back to the strategist Sun Tzu (fifth century BC), author of The Art of War. Today, the online domain is the main arena for such operations.
The digital revolution has enabled ever more tailored content – known as “microtargeting” – that plays into biases mapped through our digital footprints. Machine intelligence can even feed us targeted content without anyone ever taking a picture or recording a video.
All it takes is a well-designed AI prompt, supporting bad actors’ predefined narrative and goals, while covertly misleading the audience.
Such disinformation campaigns increasingly reach into the physical domain of the human body. In the war in Ukraine, we see continued cog war narratives. These include allegations that the Ukrainian authorities were concealing or purposefully inciting cholera outbreaks.
Allegations of US-supported bioweapons labs also formed part of false-flag justifications for Russia’s full-scale invasion.
During Covid, false information led to deaths when people refused protective measures or used harmful remedies to treat it. Some narratives during the pandemic were driven as part of a geopolitical battle.
While the US engaged in covert information operations, Russian and Chinese state-linked actors coordinated campaigns that used AI-generated social media personas and microtargeting to shape opinions at the level of communities and individuals.
Microtargeting may evolve rapidly as methods for brain-machine coupling become more proficient at collecting data on patterns of cognition. Ways of interfacing machines with the human brain range from advanced electrodes placed on the scalp to virtual-reality goggles that add sensory stimulation for a more immersive experience.
DARPA’s Next-Generation Nonsurgical Neurotechnology (N3) program illustrates how these devices may become capable of reading from and writing to multiple points in the brain simultaneously. However, these tools might also be hacked or fed poisoned data as a part of future information manipulation or psychological disruption strategies.
Directly linking the brain to the digital world in this way will erode the line between the information domain and the human body as never before.
Legal gap
Traditional laws of war assume physical force such as bombs and bullets as the primary concern, leaving cognitive warfare in a legal grey zone. Is psychological manipulation an “armed attack” that justifies self-defence under the UN charter?
Currently, no clear answer exists. A state actor could potentially use health disinformation to create mass casualties in another country without formally starting a war.
Similar gaps exist in situations where war, as we traditionally see it, is actually ongoing. Here, cog war can blur the line between permitted military deception (ruses of war) and prohibited perfidy.
Imagine a humanitarian vaccination programme that secretly collects DNA, which military forces then covertly use to map clan-based insurgent networks. This exploitation of medical trust would constitute perfidy under humanitarian law – but only if we start recognising such manipulative tactics as part of warfare.
So, what can be done to protect us in this new reality? First, we need to rethink what “threats” mean in modern conflict. The UN charter already outlaws “threats to use force” against other nations, but this makes us stuck in a mindset of physical threats.
When a foreign power floods your media with false health alerts designed to create panic, isn’t that threatening your country just as effectively as a military blockade?
While this issue was recognised as early as 2017 by the group of experts who drafted the Tallinn Manual on cyberwarfare (Rule 70), our legal frameworks haven’t caught up.
Second, we must acknowledge that psychological harm is real harm. When we think about war injuries, we picture physical wounds. But post-traumatic stress disorder has long been recognised as a legitimate war injury – so why not the mental health effects of targeted cognitive operations?
Finally, traditional laws of war might not be enough – we should look to human rights frameworks for solutions. These already include protections for freedom of thought, freedom of opinion and prohibitions against war propaganda that could shield civilians from cognitive attacks. States have obligations to uphold these rights both within their territory and abroad.
The use of increasingly sophisticated tactics and technologies to manipulate cognition and emotion poses one of the most insidious threats to human autonomy in our time. Only by adapting our legal frameworks to this challenge can we foster societal resilience and equip future generations to confront the crises and conflicts of tomorrow.
David Gisselsson Nord is a professor in the Division of Clinical Genetics, Faculty of Medicine, Lund University, and Alberto Rinaldi is a postdoctoral researcher in human rights and humanitarian law, Lund University
This article is republished from The Conversation under a Creative Commons license. Read the original article.