ACTIVE PARTICLES IN THE GREY ZONE: THE SYSTEMS VIEW OF INFORMATION


Every time you share a post, you become part of the weapon system.


That is not a metaphor. It is a description of how cognitive warfare actually operates in the contemporary information environment. The dominant framing of information manipulation—which treats individuals as passive recipients of propaganda who can be protected through better content moderation and fact-checking—fundamentally misunderstands the nature of the threat. People are not passive targets. They are active participants in complex adaptive systems, and their micro-level behaviours—sharing, liking, commenting, ignoring—collectively generate the macro-level vulnerabilities that adversary operations exploit.


This article argues that we need to replace the broadcast model of influence with a systems model. The implications for counter-strategy are profound, and they are uncomfortable: the vulnerability is not in the content but in the dynamics, and the dynamics are generated by us.


Why The Broadcast Model Fails

The traditional model of propaganda is sender-receiver: an adversary crafts a message, transmits it and the target audience receives it. Defence, on this model, consists of intercepting the message (content moderation), discrediting the sender (attribution and exposure), or inoculating the receiver (media literacy and prebunking).


Each of these has value. None of them addresses the structural problem. The contemporary information environment does not work like a broadcast system. It works like a complex adaptive system—a network of interconnected agents whose individual behaviours generate emergent properties that no single agent controls or intends.

Social media platforms are complex adaptive systems by any reasonable definition. They consist of millions of agents (users) operating according to local rules (engagement incentives, social norms, identity commitments, emotional responses), connected through dynamic network structures (follows, shares, algorithmic recommendations) and producing emergent phenomena (viral cascades, filter bubbles, polarisation spirals, moral panics) that no individual user planned or desired.


This matters for cognitive warfare because adversary operations do not need to convince anyone of anything in particular. They need to introduce perturbations into the system—inject content, amplify signals, fabricate social proof—and let the system's own dynamics do the rest. The users themselves, through their entirely rational individual responses to engagement incentives and identity pressures, amplify the manipulation into strategic-scale effects. The adversary does not need a million operatives. It needs a few hundred bots to seed content into the right network nodes and millions of genuine users will do the amplification for free.


Individuals As Active Nodes

The shift from passive victims to active nodes is not a blame-the-victim argument. It is a structural observation about how information systems work. When someone shares an emotionally provocative post—even to condemn it—they are amplifying its reach. When they engage with outrage content, they are training the platform's algorithm to serve more of it, not just to them but to their network. When they sort themselves into ideologically homogeneous communities, they are constructing the filter bubbles that adversary operations subsequently exploit.


None of these individual actions is irrational. Each makes sense from the perspective of the individual actor: the post was genuinely outrageous, the community shares their values, the engagement feels like participation. But the aggregated effect of millions of individually rational decisions is a system-level vulnerability that no individual created and no individual can fix.


This is the hallmark of complex adaptive systems. The behaviour of the whole cannot be predicted from the behaviour of the parts. The properties that matter for cognitive warfare—polarisation spirals, trust erosion, narrative fragmentation—are emergent properties that arise from the interaction of individual behaviours with system architecture. They are not caused by any single actor, including the adversary.

The adversary's role is more like a catalyst than a cause. A catalyst does not create a chemical reaction; it lowers the activation energy required for a reaction that is already thermodynamically favourable. Russian operations during the 2016 US elections did not create American political polarisation. They lowered the activation energy for polarisation dynamics that were already latent in the system—dynamics generated by platform architectures, media business models, political sorting and genuine social grievances. The operations were effective not because the content was persuasive but because the system was primed.


The Dynamics That Matter

Four system-level dynamics are particularly relevant for cognitive warfare practitioners—and, therefore, for those who would counter them.


The first is preferential attachment. In network terms, the most connected nodes attract the most new connections—the rich get richer. On social media, the most-shared content gets shared more, creating cascades that can transform a marginal piece of disinformation into a dominant narrative in hours. Adversary operations exploit preferential attachment by seeding content into highly connected nodes—influencers, popular accounts, community leaders—knowing that network effects will handle the distribution.
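The rich-get-richer mechanism can be made concrete with a toy simulation in the style of the Barabási–Albert preferential attachment model. This is an illustrative sketch, not anything from the article itself; the network size and random seed are arbitrary:

```python
import random

def preferential_attachment(n_nodes=1000, seed=42):
    """Grow a network in which each new node links to an existing node
    with probability proportional to that node's current degree."""
    rng = random.Random(seed)
    # `stubs` lists each node once per edge endpoint, so sampling from
    # it uniformly is automatically degree-proportional.
    stubs = [0, 1]
    degree = {0: 1, 1: 1}
    for new in range(2, n_nodes):
        target = rng.choice(stubs)   # the rich get richer
        degree[new] = 1
        degree[target] += 1
        stubs.extend([new, target])
    return degree

degree = preferential_attachment()
top = max(degree.values())
median = sorted(degree.values())[len(degree) // 2]
print(f"Best-connected node: {top} links; median node: {median}.")
```

Even in this minimal version, a handful of hubs end up with dozens of connections while the typical node has one or two—which is exactly why seeding a few well-chosen nodes is enough for distribution.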


The second is homophily—the tendency for similar individuals to cluster together. Political homophily on social media platforms creates what network scientists call echo chambers: densely connected sub-networks within which information circulates rapidly but between which it flows slowly or not at all. These echo chambers are not imposed by the platform; they are constructed by users pursuing their preference for like-minded interaction. But once constructed, they create the structural conditions for polarisation, because information that circulates within a homophilous cluster is not subjected to the corrective friction of opposing viewpoints.
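How a mild individual preference for like-minded links produces segregated clusters can be sketched with a simple rewiring model. Everything here—group sizes, degree, the bias value—is a hypothetical illustration:

```python
import random

def homophily_rewiring(n=200, degree=5, steps=2000, bias=0.9, seed=1):
    """Two groups of agents start with random links; each step, one agent
    replaces a random link, choosing a same-group partner with
    probability `bias`. Returns cross-group link fraction before/after."""
    rng = random.Random(seed)
    group = [i % 2 for i in range(n)]
    links = {i: set(rng.sample([j for j in range(n) if j != i], degree))
             for i in range(n)}

    def cross_fraction():
        cross = sum(1 for i in links for j in links[i] if group[i] != group[j])
        return cross / sum(len(links[i]) for i in links)

    before = cross_fraction()
    for _ in range(steps):
        a = rng.randrange(n)
        old = rng.choice(sorted(links[a]))
        same = rng.random() < bias          # preference for one's own group
        pool = [j for j in range(n) if j != a and j not in links[a]
                and (group[j] == group[a]) == same]
        links[a].discard(old)
        links[a].add(rng.choice(pool))
    return before, cross_fraction()

before, after = homophily_rewiring()
print(f"Cross-group links: {before:.0%} before, {after:.0%} after rewiring")
```

No platform imposes the segregation: the cross-group links collapse purely because each agent, individually, slightly prefers its own side.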


The third is emotional contagion. Experimental evidence demonstrates that emotional states propagate through social networks: exposure to emotional content makes users more likely to produce emotional content themselves. Anger and outrage propagate particularly effectively—they generate more engagement, which generates more algorithmic amplification, which generates more exposure. This creates feedback loops in which the emotional temperature of a network cluster can escalate rapidly without any external input. An adversary that injects emotionally charged content into such a loop is adding fuel to a fire that is already burning.
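The feedback loop described here (engagement drives amplification, which drives exposure, which drives more emotional content) behaves very differently depending on whether the loop gain sits above or below one. A minimal sketch with entirely illustrative numbers:

```python
def run_loop(gain, initial=0.05, steps=40):
    """Each round, the network's 'emotional temperature' is multiplied
    by the loop gain (engagement times algorithmic amplification),
    capped at 1.0 to represent saturation. All values are illustrative."""
    level = initial
    for _ in range(steps):
        level = min(1.0, level * gain)
    return level

print(run_loop(gain=1.2))    # self-amplifying: escalates to saturation
print(run_loop(gain=0.95))   # friction pushes gain below 1: it decays
```

The point of the toy model is the asymmetry: the same loop that saturates at gain 1.2 dies away at gain 0.95. Interventions that add friction do not need to block content; they only need to nudge the gain below one.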


The fourth is threshold dynamics. Complex systems often exhibit phase transitions: gradual changes in underlying conditions produce sudden, discontinuous changes in system behaviour. A society can absorb increasing levels of polarisation without visible crisis—until a threshold is crossed and the system shifts rapidly to a new state. The Romanian election crisis of 2024 is a plausible illustration: years of gradually intensifying information manipulation produced a sudden, system-level failure when the constitutional court annulled an election result. The transition was not gradual. It was catastrophic in the mathematical sense—a sudden reorganisation of the system around a new equilibrium.
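A classic way to make threshold dynamics concrete is Granovetter's threshold model: each agent acts once enough others have, and a small difference in the initial shock separates a fizzle from a system-wide cascade. The population size, threshold distribution and shock sizes below are illustrative assumptions, not data:

```python
import random

def cascade_size(shock, n=1000, seed=3):
    """Granovetter-style threshold model: an agent activates once the
    active fraction of the population exceeds its personal threshold.
    Thresholds are uniform on [0.1, 0.6] (illustrative values)."""
    rng = random.Random(seed)
    seeds = int(n * shock)                              # externally activated
    thresholds = [0.1 + rng.random() * 0.5 for _ in range(n - seeds)]
    active = seeds
    while True:
        frac = active / n
        recruits = sum(1 for t in thresholds if t <= frac)
        if seeds + recruits == active:                  # fixed point reached
            return active / n
        active = seeds + recruits

print(cascade_size(shock=0.05))   # below the tipping point: fizzles
print(cascade_size(shock=0.12))   # past it: near-total cascade
```

The underlying conditions barely change between the two runs; only the shock does. That discontinuity is the "catastrophic in the mathematical sense" behaviour the paragraph describes.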


For cognitive warfare practitioners, threshold dynamics are the strategic prize. The operational objective is not to produce a specific political outcome but to push the target system toward its tipping point—to degrade the conditions of collective decision-making until a triggering event produces a phase transition. The triggering event itself does not need to be adversary-generated; it needs only a system that has been pushed close enough to the threshold that any shock will tip it over.


What This Means For Counter-Strategy

If the threat operates at the system level, counter-strategy must operate at the system level. This has four implications that current policy largely fails to address.

First, content-level interventions are necessary but radically insufficient. Removing individual pieces of disinformation from a complex adaptive system is like removing individual molecules from a river. The system will route around the intervention because the dynamics that generate the content are structural, not content-specific. Content moderation matters—it raises the cost of adversary operations and reduces the volume of manipulative material in circulation. But it cannot address the underlying dynamics that make the system vulnerable.


Second, the most effective interventions target system architecture rather than content. Platform design choices—the engagement metrics that drive algorithmic amplification, the recommendation algorithms that construct filter bubbles, the notification systems that exploit attention—are the structural conditions that generate vulnerability. Changing these design choices changes the dynamics of the entire system. The EU's Digital Services Act represents an early attempt to regulate at this level, requiring platforms to assess and mitigate systemic risks. Whether it will prove effective remains to be seen, but the regulatory logic is sound: address the architecture, not just the content.

Third, individual behaviour change is valuable but must be understood as a system-level intervention. When media literacy programmes teach individuals to pause before sharing, to evaluate sources and to resist emotional manipulation, they are introducing friction into the system—slowing the propagation dynamics that adversary operations exploit. The value of this friction is not that any individual becomes immune to manipulation; it is that enough individuals behaving slightly differently changes the aggregate dynamics of the system. The intervention is individual; the effect is systemic.

Fourth, resilience in a complex adaptive system is not about preventing disruption. It is about maintaining the system's capacity to absorb disruption without undergoing a catastrophic phase transition. This means investing in the structural properties that make systems resilient: diversity of information sources, density of cross-cutting social connections, institutional redundancy and the capacity for self-correction. A resilient information ecosystem is not one in which disinformation does not circulate—that is impossible. It is one in which the system's dynamics prevent disinformation from cascading into system-level failure.


The Uncomfortable Implication

The systems view of information leads to an uncomfortable conclusion: we are the vulnerability. Not because we are stupid, gullible, or morally weak—but because we are nodes in a complex system whose emergent properties we neither control nor fully understand. Our individual decisions, made for entirely understandable reasons, collectively generate the macro-level conditions that adversary operations exploit.

This is uncomfortable because it distributes responsibility. It is much more satisfying to blame the adversary, the platform, or the algorithm than to recognise that the vulnerability is in the dynamics—and the dynamics are generated by the interaction of all participants.


But the discomfort is also the beginning of an answer. If individuals are active participants in the system, they are also potential agents of systemic change. Every decision to pause before sharing, to seek out disconfirming information, to engage with nuance rather than outrage, introduces friction that marginally shifts the system's dynamics. Individually, these actions are trivial. Collectively, they are the difference between a system that absorbs perturbation and one that amplifies it into crisis.

The question for security professionals, policymakers and citizens is whether we can build institutional architectures that make the resilient choice the easy choice—that align individual incentives with systemic health. The current architecture does the opposite: it rewards engagement over accuracy, outrage over nuance, speed over reflection. Changing that architecture is not a technical problem. It is a political one and it requires us to confront what we are willing to demand from the platforms that mediate our collective cognition.


The adversary did not build the system. The adversary learned to exploit it. The defence is not to fight the adversary within the system. It is to change the system.

 
 
 
