
Architecture, Not Agency

A Position Paper on UK Cognitive Resilience Architecture

Dr Paul Wood

Chief Executive Officer, Emerging Risks Global

April 2026

 

Executive Summary

The Foreign Affairs Committee’s Disinformation Diplomacy report, published on 27 March 2026, recommends that the United Kingdom establish a National Counter Disinformation Centre on a statutory footing, modelled on bodies such as Sweden’s Psychological Defence Agency. The institutional impulse is correct: the current architecture is fragmented and would benefit from consolidation. The design template is misleading. Cyber and cognitive operations differ in three properties — attribution structure, perimeter logic and the operational relevance of the foreign and domestic boundary — that determine what kind of institution can effectively respond to each. Additionally, the Swedish institutional model imports a constitutional foundation — Total Defence — that the UK does not possess.


This position paper argues that an NCSC-style consolidated centre will reproduce, at greater cost, the structural problems it is intended to solve. It draws attention to two features of the Committee’s own evidence base. The founding chief executive of the National Cyber Security Centre, giving evidence to the inquiry, characterised the NCSC’s institutional configuration as accidental rather than deliberate. The Foreign, Commonwealth and Development Office Minister with responsibility for the brief told the Committee that consolidation into a single agency would risk the threat being seen as one body’s problem when it is, in fact, every department’s problem. The Committee nonetheless recommended the model that both witnesses had cautioned against.


This paper sets out an alternative. Rather than a new public-facing centre, it proposes a Cabinet Office Cognitive Resilience Authority with statutory cross-departmental tasking power; a published UK Cognitive Resilience Strategy analogous to the National Cyber Strategy; statutory thresholds defining when foreign information manipulation triggers operational national-security response; the extension of Intelligence and Security Committee oversight to the relevant operational elements; and a funded research and evaluation function anchored in human-factors science. The recommendations are sequenced for completion before the next general election cycle, on the basis that election integrity is the highest-confidence pressure test the architecture will face.


The argument rests on a foundation that is largely absent from current public discourse: the human-factors dimension of cognitive resilience. Cognitive operations succeed because they exploit predictable failure modes in how individuals and institutions process information under stress. Any architecture that does not begin from that premise is solving the wrong problem.


Findings

1. The UK’s counter-FIMI architecture is fragmented across at least seven bodies with no single authority possessing operational cross-departmental tasking power.

2. The Foreign Affairs Committee’s recommendation for a National Counter Disinformation Centre, while addressing a real problem of fragmentation, applies an analogy from cyber defence that does not transfer cleanly to the cognitive domain.

3. The institutional seam between the FCDO Hybrid Threats Directorate and the DSIT National Security Online Information Team is the operational target of adversary networks, not an artefact of organisational history. Russian operations, including the Social Design Agency and the Doppelganger network, are explicitly designed to route foreign-tasked content to domestic audiences via international platforms.

4. The UK lacks a published cognitive-domain doctrine analogous to the National Cyber Strategy. Operational thresholds, ethical limits and oversight boundaries are currently set by case-by-case ministerial judgement.

5. Parliamentary oversight of the relevant capabilities remains constrained. The absence of Intelligence and Security Committee oversight of the National Security Online Information Team and its predecessor has led to recurrent political contestation, materially reducing operational tempo.

6. The evidence base for what counter-FIMI interventions actually work is thin and not systematically collated. No UK body is currently funded to evaluate operational effect across the architecture.


Recommendations

a. Establish a Cabinet Office Cognitive Resilience Authority with statutory cross-departmental tasking power, reporting through the Deputy National Security Adviser and integrated with the Defending Democracy Taskforce — a coordinating spine, not a new public-facing agency.

b. Publish a UK Cognitive Resilience Strategy analogous to the 2022 National Cyber Strategy, anchoring cross-government activity in shared definitions and thresholds.

c. Legislate statutory thresholds defining when foreign information manipulation triggers an operational national-security response, removing case-by-case ambiguity.

d. Bring the relevant operational elements within Intelligence and Security Committee oversight, resolving a contestation that will otherwise recur with each public controversy.

e. Fund a cognitive-domain research and evaluation function, Dstl-led with academic partnership, mandated to assess what interventions across the architecture demonstrably work.

f. Sequence implementation for completion before the next general election cycle, on the basis that election integrity is the highest-confidence pressure test the architecture will face.


1. Scope of this paper

1.1 This paper addresses the institutional architecture of the United Kingdom’s response to foreign information manipulation and interference (FIMI). It is written in response to the Foreign Affairs Committee’s report Disinformation Diplomacy: How Malign Actors Are Seeking to Undermine Democracy, ordered by the House of Commons to be printed on 19 March 2026 and published on 27 March 2026. It does not address content moderation policy, the regulatory framework established by the Online Safety Act 2023, platform liability, media literacy programmes, or the specific question of disinformation as it relates to elections, save where these matters bear directly on the institutional design question.

1.2 The paper concentrates on the architectural question for two reasons. The first is that the architectural question is, in the author’s view, both insufficiently addressed in current public discourse and the precondition for effective action across the other domains. The second is that the government response to the Foreign Affairs Committee’s report is now being drafted and the architectural recommendation in that report is the single most consequential institutional choice in this space for the remainder of the present Parliament. The window for influencing that choice is narrow.

1.3 The argument is offered as a contribution to ongoing policy formulation, drawing on operational experience in UK security architecture, structured engagement with the publicly available analysis of cognitive warfare and FIMI, and the author’s doctoral research into the human factors that determine institutional and individual resilience under crisis conditions. References are at the end of the paper; all factual claims are sourced to public material.


2. The current UK counter-FIMI architecture

2.1 UK counter-FIMI activity is currently distributed across at least seven bodies. Each operates under a distinct departmental remit, is accountable to a different ministerial line, and is authorised under different legal frameworks. The architecture has evolved incrementally, primarily in response to specific events — Russia’s invasion of Ukraine, the COVID-19 pandemic, electoral cycles — rather than as part of a deliberate institutional design.

2.2 FCDO Hybrid Threats Directorate

Formerly the Cyber, Information and Tech Threats Directorate, established following the 2023 Integrated Review Refresh. Holds the international FIMI mandate. Led the October 2024 sanctions designation of the Social Design Agency. Operates the Open Source Unit and the Counter Information and Manipulation Department. Reports to the Foreign Secretary; gave its most recent oral evidence to the Foreign Affairs Committee on 6 January 2026.

2.3 DSIT National Security Online Information Team (NSOIT)

Formerly the Counter Disinformation Unit. Holds the domestic-environment mandate, focused on risks to UK national security and public safety arising in the UK information environment. Subject of recurrent political contestation since 2023; reported operational tempo materially reduced from earlier levels. Reports to the Secretary of State for Science, Innovation and Technology. Currently led by Talitha Rowland as Director for Security and Online Harms, who gave evidence to the FAC follow-up session and previously, as Deputy Director, to the Culture, Media and Sport Committee’s Trusted Voices inquiry.

2.4 Cabinet Office Defending Democracy Taskforce

Coordinating body led by the Security Minister, with cross-departmental membership. Owns electoral integrity coordination. Convening rather than tasking authority.

2.5 Joint Election Security and Preparedness (JESP) Unit

Cross-Whitehall coordination function for election-period activity, referenced in FCDO evidence as an integration point during the 2024 General Election.

2.6 Defence Science and Technology Laboratory (Dstl)

Holds the science-and-technology brief, including foundational doctrinal work. Cowles and Verrall’s The Cognitive Warfare Concept: A Short Introduction (2023) is the principal UK doctrinal text. Reports to the Secretary of State for Defence.

2.7 77 Brigade (British Army)

Operational information-manoeuvre capability, headquartered at Denison Barracks. Reservist-heavy structure reflecting the requirement for media, behavioural-science and linguistic expertise. Operates under standard military command authorities.

2.8 National Cyber Security Centre (NCSC)

Increasingly engaged on the synthetic-media, platform-integrity and AI-enabled influence operation components of the threat. Authority derived from its position within GCHQ. The ‘NCSC for disinformation’ proposal addressed in this paper would extend the NCSC model into the cognitive domain; for present purposes, NCSC’s existing role is technical-adjacent rather than primary.

2.9 Two observations follow. First, the architecture as it stands has no single body with cross-departmental operational tasking authority. Coordination depends on goodwill between bodies whose ministerial accountabilities lie in different departments. Second, the foreign and domestic boundary is the most significant structural feature of the architecture, and, as Section 4 develops, the structural feature most directly exploited by the threat the architecture is intended to address.


3. The Foreign Affairs Committee’s recommendation

3.1 The Foreign Affairs Committee’s report, published on 27 March 2026, recommends the establishment of a National Counter Disinformation Centre on a statutory footing, modelled on bodies including Sweden’s Psychological Defence Agency. The Committee’s chair, Dame Emily Thornberry MP, framed the recommendation in the language of warfare: “It is the new warfare and open liberal democracies are sitting ducks.” The recommendation reflects an emerging policy consensus, articulated most fully in the September 2025 RUSI Commentary, which argued that the UK now needs a National Disinformation Agency modelled on the NCSC. The RUSI argument rests on three claims that this paper accepts: that the UK requires a single, named, credible public-facing voice on FIMI; that response cycles must be compressed; and that the public-private partnership patterns developed in the cyber domain provide a template for engagement with platforms, fact-checkers, and academic researchers. Section 7 sets out which of those claims should be retained in any alternative design. The disagreement is with the institutional shape proposed to deliver them.

3.2 The institutional impulse is correct. Architectural fragmentation is real, response cycles are too slow and the absence of a public-facing voice on FIMI is a meaningful operational disadvantage. This paper does not contest those findings.

3.3 The design template is the problem. Two pieces of the Committee’s own evidence base point in the opposite direction to the recommendation.


3.4 The founding chief executive of NCSC on the NCSC analogy

Ciaran Martin CB, founding chief executive of NCSC, gave oral evidence to the inquiry on 13 January 2026. Asked about the institutional configuration of NCSC and its applicability as a model, he characterised the arrangement as historically contingent: NCSC’s location within GCHQ but with Foreign Office ministerial accountability “is a bit weird” and “accidental, like a lot of things that are a bit weird. It is all to do with history.” He noted that the rationale for NCSC’s creation was a strategic shift towards more active government intervention in the cyber domain, not the institutional shape itself. The witness most qualified to assess the transferability of the NCSC model told the Committee that the model’s structure was an artefact of historical accident rather than a deliberate design choice.


3.5 The responsible Minister on consolidation

Stephen Doughty MP, then Minister of State at FCDO with responsibility for the brief, told the Committee on 6 January 2026: “There is a challenge in that sometimes if you stick all things into one agency, people go ‘That is their business; they are dealing with that.’ The scale and pervasiveness of this threat is so significant that we need every part of Government and every department to play a role.” The Minister whose department holds the international FIMI mandate publicly resisted single-agency consolidation in the same evidence session that produced the inquiry’s central recommendation.

3.6 The Committee’s recommendation is therefore in tension with the testimony of two of its most authoritative witnesses on the institutional design question. This paper takes that tension seriously and develops the alternative that the witnesses’ testimony implies.


3.7 The Swedish institutional model imports a constitutional foundation that the UK does not possess

The Foreign Affairs Committee modelled its recommendation in part on Sweden’s Psychological Defence Agency (Myndigheten för psykologiskt försvar), established in January 2022. The PDA’s authority and public legitimacy derive in significant part from its embedding within Sweden’s Total Defence (Totalförsvar) framework: a constitutional doctrine integrating civil and military preparation under a single mandate, with broad public consent reflected in successive Defence Bills since the 1940s. The UK has no equivalent doctrine. The institutional shell of the PDA can be transferred without difficulty; the constitutional foundation that gives it both political latitude and public acceptance cannot. Building a UK body on the Swedish institutional template, without the constitutional underpinning, produces a hollowed-out version that inherits the original’s political exposure without its political durability. The relevant lesson from the Swedish example is not that the UK should establish an analogous agency; it is that effective psychological defence requires constitutional as well as institutional foundations — a precondition the UK has not yet established.


4. Cognitive operations do not share cyber’s properties

4.1 The proposal for a National Counter Disinformation Centre modelled on NCSC has gathered support across government, civil society and parliamentary discourse. The institutional impulse — consolidation — is correct. The design template is misleading. Cyber and cognitive operations differ in three properties that determine what kind of institution can effectively respond to each. Building a cognitive defence body on the cyber template will, at greater cost, reproduce the structural problems it is intended to solve.


4.2 Attribution is structurally different

Cyber attribution rests on technical artefacts: tactics, techniques and procedures, infrastructure overlap, code reuse, operational tradecraft. These are imperfect but observable. NCSC’s authority — and the public-private partnership on which its model depends — is partly built on its ability to attribute with technically grounded confidence and to share that attribution with industry partners through trusted channels. The model works because TTPs exist and can be pointed to.


Cognitive attribution has no equivalent technical floor. Identifying that a piece of content has been produced or amplified by a hostile actor requires answering three distinct questions: intent (what the actor seeks to achieve), context (whether the content fits a coordinated narrative campaign), and effect (whether the operation moved audiences in measurable ways). Each is interpretive rather than technical. None is amenable to the kind of standardised, declassifiable indicator that NCSC can share with critical national infrastructure operators. A centralised cognitive-attribution authority modelled on NCSC would face an unresolvable choice: it would either issue confident attributions that are politically contested precisely because they rest on interpretation, or issue qualified attributions that are operationally too soft to act on. Neither serves the function that the NCSC serves in the cyber domain.


4.3 The threat is constitutive, not asset-based

Cyber defence has a coherent theory of the protected asset. NCSC’s remit is definable in concrete terms: government systems, critical national infrastructure, designated supply chains and the digital security of UK citizens and organisations. These categories admit of operational thresholds, sectoral standards and demonstrable defensive outcomes.

Cognitive operations target sensemaking — the processes by which individuals and institutions convert information into decisions. Sensemaking has no perimeter. It is distributed across citizens, mediated by platforms that the UK does not and should not regulate as critical national infrastructure and shaped by trust relationships between citizens and institutions that are themselves the target. An NCSC-equivalent remit for cognitive defence faces the inverse of the attribution problem: it must be defined either so broadly (the integrity of the UK information environment) that it cannot be operationalised, or so narrowly (HMG communications channels) that it leaves the actual battlespace undefended. The missing concept is the protected asset itself. Without it, the institutional model has nothing to organise around.


4.4 The foreign and domestic boundary is the operational target

This is the most important difference for institutional design. Cyber operations admit of a meaningful foreign and domestic distinction at the technical level: foreign threat actors operate from foreign infrastructure and even when effects are domestic, the technical evidence produces a tractable jurisdictional split between agencies operating abroad and at home.

Cognitive operations do not have this property. The same content, produced by the same foreign-tasked entity, can target both foreign and domestic audiences simultaneously through the same platforms and vectors. The foreign-domestic boundary is not a property of the operation; it is a property of the UK institutional architecture, which adversaries observe and exploit. Section 5 develops this point through the SDA and Doppelganger case. The general claim is structural: any institutional architecture organised primarily around the foreign-domestic split — including, on present design, a National Counter Disinformation Centre sitting alongside but separate from FCDO and Home Office equities — will reproduce the seam that current adversary operations are designed to penetrate.

4.5 The implication

The right unit of analysis for institutional design is the adversary operation, not the defended asset and not the jurisdictional boundary. The function the UK requires is not a public-facing centre of expertise in the NCSC mould, but a coordinating authority capable of directing the existing distributed capabilities to operate concurrently against a single adversary on a single timeline. Section 8 develops this proposal in detail. The point of this section is the same as that of the previous one: the NCSC analogy is shaping policy in a direction that cannot succeed against the threat as it actually presents itself.


5. The seam in practice: the SDA and Doppelganger case

5.1 The institutional seam described in Section 4 is not theoretical. It can be observed in the UK response architecture’s engagement with a single, publicly documented adversary network: the Social Design Agency (SDA), which is tasked and funded by the Russian state and is operationally responsible for the Doppelganger campaign.

5.2 The basic facts of the operation are on the public record. Meta has characterised Doppelganger as the largest, most aggressively persistent Russian-origin influence operation it has tracked and as an advanced persistent threat. The network operates by cloning legitimate Western news domains — including UK outlets — and seeding amplification through coordinated inauthentic accounts on Meta platforms, X and Telegram. In October 2024, the United Kingdom sanctioned the SDA and several members of its leadership for seeking to undermine Ukraine; the FCDO designation explicitly identified the SDA’s tasking by the Russian state. The targeting set, on publicly reported evidence, includes UK domestic political discourse, with content shaped to inflame existing divisions on immigration, support for Ukraine and trust in UK institutions.

5.3 This single operation engages the remits of, at a minimum, four UK bodies:

•      FCDO Hybrid Threats Directorate, which holds the international FIMI mandate and led the October 2024 sanctions designation.

•      DSIT NSOIT, whose mandate covers risks to UK national security and public safety arising in the domestic information environment.

•      NCSC, increasingly engaged on the synthetic-media and platform-integrity components.

•      Cabinet Office Defending Democracy Taskforce, in its electoral-integrity coordinating role.

5.4 No single body holds primary operational authority across the operation as a whole. The FCDO can attribute and sanction; it cannot direct the domestic response. NSOIT can engage with platforms on UK-domestic content; it cannot direct foreign attribution or sanctions. The Cabinet Office can convene; it cannot task. Each body acts within its own remit, on its own decision cycle, against an operation that does not respect that division because it was designed not to.

5.5 The practical consequences are visible in the public record. The interval between Meta’s public exposure of Doppelganger and the UK sanctions designation in October 2024 was over two years. That delay does not reflect institutional failure by any single body. It reflects the lack of a coordinating authority that can compress the cycle by assigning multiple departments to a shared timeline. Adversary operations of this kind are persistent and adaptive; a multi-year cycle from exposure to coordinated response is not a viable steady state.

5.6 A counterfactual illustrates the institutional point. Had a Cabinet Office Cognitive Resilience Authority of the kind recommended in this paper existed in 2022, a single tasking instruction could have directed: FCDO to develop the attribution and sanctions case; NSOIT to coordinate platform engagement on UK-targeted content; NCSC to assess synthetic-media indicators; Dstl to evaluate behavioural impact on UK audiences; and the Defending Democracy Taskforce to integrate findings into election-integrity preparation. The component capabilities already exist within the current architecture. What is missing is the authority to direct them concurrently against the same operation on the same timeline.

5.7 The SDA and Doppelganger case is illustrative rather than exceptional. The same institutional pattern is observable in the response to RT’s covert influence operations, exposed in a joint action with the United States and Canada in September 2024, and in publicly reported elements of Chinese cognitive domain operations affecting UK-resident diaspora communities. The general pattern is that adversary networks operate across the foreign and domestic boundary by design; the UK response operates within it by structure; and the gap between the two is where adversary operations achieve their effects.


6. The missing foundation: human factors and crisis resilience

6.1 The architectural argument in Sections 4 and 5 is necessary but not sufficient. Even an optimally designed Cognitive Resilience Authority will fail if it does not rest on a foundation that is largely absent from current UK policy discourse: a working understanding of the human factors that determine how individuals and institutions process information under stress.

6.2 Cognitive operations succeed because they exploit predictable failure modes in human cognition. These failure modes are not new; they have been documented across half a century of research in cognitive science, behavioural economics, organisational decision-making and crisis management. They include: the systematic narrowing of attention under perceived threat; the substitution of in-group identity signals for evidentiary reasoning under conditions of information overload; the asymmetric persistence of first-encountered narratives even after correction; the degradation of probabilistic reasoning under time pressure; and the institutional tendency to prioritise the avoidance of false positives over the detection of low-base-rate true positives in threat-monitoring environments. The relevant literatures are mature and convergent: Tversky and Kahneman (1974) and Kahneman (2011) on probabilistic reasoning under load and the dual-system structure of judgement; Lewandowsky et al. (2012) on the continued-influence effect and the limits of correction; Pennycook and Rand (2021) on the role of inattention rather than ideology in susceptibility to false content; Klein (1998) on naturalistic decision-making under time pressure; and Weick (1995) on the conditions under which institutional sensemaking degrades. The absence of this literature from current counter-FIMI policy discourse is itself a finding.

6.3 Adversary cognitive operations target these mechanisms with operational sophistication. Russian reflexive control doctrine (Thomas, 2004; Bugayova and Stepanenko), Chinese cognitive domain operations doctrine (Beauchamp-Mustafaga, 2021) and Iranian information operations as observed during the February 2026 Iran-related disinformation surge all converge on the same operational principle: degrading the capacity for coherent judgement is more economical than persuading audiences to a particular view. The objective is not to make targets believe a specific falsehood. It is to leave them confused, paralysed and incapable of coordinated response. That condition is often sufficient from a strategic standpoint.

6.4 This framing has direct implications for institutional design:

•      Resilience is not a property of institutions. It is a property of the populations and decision-makers that institutions serve. An architecture optimised for detection and attribution but not for the cognitive resilience of its target populations is solving a measurable but secondary problem.

•      Detection metrics measure exposure, not effect. The fact that a piece of foreign-tasked content has reached UK audiences does not establish that it has degraded their decision-making. The fact that it has not reached them does not establish their resilience. The metrics that matter are behavioural, not exposure-based.

•      Counter-messaging is operationally limited. Empirical evidence on the effectiveness of debunking, counter-narratives and prebunking is mixed and condition-dependent. Roozenbeek and van der Linden (2019) and Roozenbeek et al. (2022) report measurable inoculation effects from prebunking interventions delivered at scale; Lewandowsky et al. (2012) document the conditions under which corrective messaging persists or backfires; the literature on counter-narrative campaigns is more equivocal still. An institution funded to produce counter-messaging at scale, without a feedback mechanism evaluating effect, will optimise for output rather than outcome.

•      Institutional decision-making under information attack is itself a target. The same failure modes that make individuals susceptible to cognitive operations also make institutions susceptible during crises. An architecture that does not include explicit institutional resilience measures — protected decision processes, structured red-teaming, deliberate slowness on contested attribution calls — leaves its own decision-making vulnerable to the operations it is designed to counter.

6.5 Recommendation 8.6 of this paper — funding a cognitive-domain research and evaluation function — addresses this gap directly. Without a research function grounded in human-factors science, the architecture will produce activity but cannot demonstrate effect. The case for the research function is therefore not subordinate to the case for the coordinating authority; it is a precondition for the coordinating authority’s effectiveness.


7. What the NCSC analogy gets right

7.1 The NCSC analogy is being made by serious people for serious reasons. Four elements of the NCSC model are genuinely transferable to a strengthened cognitive-domain response and should be retained in any alternative design.

7.2 Consolidated public-facing identity

NCSC’s effectiveness rests substantially on being a single, named, credible point of contact for industry, civil society and the public. The current counter-FIMI architecture has no such public face, and this is a real cost — particularly in incident response, where the absence of an authoritative public voice cedes the narrative to whichever commentator is fastest. The Cabinet Office Cognitive Resilience Authority, as proposed in Section 8, should have a designated public-facing function, even if its primary mode is coordination rather than direct operational activity.

7.3 Declassification mechanism

NCSC has developed an institutional capacity to declassify intelligence material into operationally usable form for industry partners at speed. This is non-trivial and is one of the underrated reasons NCSC functions effectively. Any cognitive-domain architecture requires an analogous mechanism — particularly for attribution material that would otherwise sit at classifications too high to share with platforms, civil society researchers, or Five Eyes partners on routine timescales.

7.4 Public-private partnership

NCSC’s Industry 100 programme and its broader engagement with the cyber security community provide a structural template for engaging with platforms, fact-checking organisations, academic researchers, and Five Eyes partners that any effective cognitive-domain response requires. The institutional habit of partnership is genuinely transferable.

7.5 Research function

NCSC funds and shapes research that informs operational practice. The cognitive domain currently lacks a comparable mechanism. Recommendation 8.6 below addresses this directly, with the additional requirement that the research function be grounded in human-factors science as set out in Section 6.

7.6 What the NCSC analogy gets wrong, addressed in Section 4, is the fundamental institutional shape. What it gets right, addressed in this section, are the supporting institutional features. Both should inform the alternative architecture proposed in Section 8.


8. An alternative architecture: recommendations

8.1 The recommendations below are stated in operational terms — what is being proposed, why, what scope it has, what implementation pathway exists and what trade-offs it carries. The aim is to produce recommendations that can be lifted into government policy with minimal redrafting.

8.2 A Cabinet Office Cognitive Resilience Authority

The proposed Authority would sit within the Cabinet Office, reporting through the Deputy National Security Adviser and integrated with the Defending Democracy Taskforce for ministerial accountability. It would operate as the coordinating spine of the existing architecture rather than as a new public-facing body. Its core function would be the issuance of single-tasking instructions across FCDO Hybrid Threats Directorate, DSIT NSOIT, NCSC, Dstl, the Defending Democracy Taskforce and JESP against identified adversary operations, with statutory authority sufficient to compress current multi-month coordination cycles to days. The model is closer to the Joint Intelligence Organisation than to NCSC: a small, senior, integrative body that directs distributed capabilities rather than duplicating them. The trade-off is that it does not produce a visible public-facing capability; this should be regarded as a feature, not a limitation, given the political volatility documented at NSOIT’s predecessor.

8.2.1 The constitutional vehicle for the Authority’s tasking power is a short statutory instrument modelled on the frameworks under which the National Security Adviser and the Joint Intelligence Organisation operate, conferring on the Authority the power to direct departments holding designated counter-FIMI functions against an identified adversary operation on a single timeline. Statutory recognition matters because it converts a coordinating relationship that currently depends on interdepartmental goodwill into one with the durability of law. The Defending Democracy Taskforce, in its present form, illustrates the limit case of coordination without statutory tasking: it can integrate work that other bodies bring forward, but cannot direct work that other bodies have not prioritised. The proposed Authority differs in that — and in that alone. Its size, seniority and integrative posture should otherwise resemble JIO rather than any operational agency.

8.3 A published UK Cognitive Resilience Strategy

A cross-government doctrine document, structured analogously to the 2022 National Cyber Strategy, should be published openly within twelve months of the Authority’s establishment. The Strategy should define the threat, set out the architecture, articulate operational thresholds and ethical limits, and commit to measurable outcomes. The publication itself does substantive work: it anchors cross-government activity in shared definitions, provides Parliament and civil society with a benchmark for scrutiny and signals to allies the maturity of the UK approach. The trade-off is that publication forecloses some operational ambiguity; this is appropriate, given that the absence of public doctrine has been a recurring source of political contestation rather than operational advantage.

8.4 Statutory thresholds for operational response

The current case-by-case ministerial determination of operational thresholds creates two distinct risks: under-response when ministerial attention is elsewhere and over-response when political pressure is acute. Statutory thresholds — defining, for example, the volume, attribution confidence and target-population characteristics that move an operation from monitored to actioned — would constrain both risks. The legislation should be modelled on the Investigatory Powers Act’s approach to graduated authorisation: clear thresholds, clear authorisations, clear oversight. The trade-off is reduced ministerial flexibility; the gain is reduced political volatility and improved practitioner clarity.

8.5 Intelligence and Security Committee oversight

The absence of ISC oversight of NSOIT and its predecessor has been a recurring source of political contestation, contributing directly to the 2023 to 2024 controversy that produced a reported substantial reduction in flagging activity. Operational capability that swings by an order of magnitude with each public criticism is not a steady-state capability. Bringing the relevant elements within ISC purview resolves the legitimacy question that constrains operational latitude. The trade-off is the additional reporting burden on a small directorate; this is a modest cost for substantial political durability.

8.6 A funded research and evaluation function

The UK currently invests in counter-FIMI activity without systematically evaluating what works. Dstl’s existing cognitive warfare research provides a foundation, but nobody is currently funded to evaluate operational effect across the architecture: which interventions move audiences, which produce backfire, which are noticed by adversaries and adapted around. A dedicated evaluation function — Dstl-led, with academic partnership through King’s College London, Oxford and one or two regional universities — would produce the evidence base on which credible long-term investment can rest. As Section 6 sets out, this function must be grounded in human-factors science: outcome metrics, not exposure metrics. The trade-off is the time horizon: evaluation outputs will lag investment by two to three years. This is a feature of evaluation, not a defect, and it is the principal reason the function should begin now rather than later.

8.7 Sequencing

Election integrity is the highest-confidence pressure test the UK counter-FIMI architecture will face. The institutional maturity required to respond to a sustained, multi-vector cognitive operation against a UK general election is substantially greater than the maturity required for current steady-state activity. Working backwards from that benchmark sets a hard timeline for the recommendations above:

• Authority established and operational within twelve months of the decision.

• Strategy published within eighteen months.

• Statutory framework enacted within twenty-four months, subject to the availability of government bill time.

• Evaluation function producing first outputs within thirty-six months.

This is achievable. The most exposed item on the schedule is the statutory framework, which depends on bill time and on cross-party support that cannot be guaranteed; where slippage is unavoidable, the order of priority should be Authority and Strategy first — both achievable through executive action — with statutory thresholds and ISC oversight extension on the next available legislative window and the evaluation function on a parallel track that does not depend on legislation. Implementing the executive elements first establishes the operational capability against which the legislative elements can then be calibrated. The principal residual risk is that sequencing slips into the post-election period, at which point the lessons of inadequate response will have been learned at a higher cost than necessary.


9. Risks and trade-offs

9.1 Free expression

Any architecture for state response to information operations carries a structural risk to free expression. The history of the Counter Disinformation Unit — including the documented flagging of social media content from sitting parliamentarians, journalists and academics whose statements were critical of government policy but not factually false — illustrates the risk in concrete terms. The recommendations in Section 8 address this risk in three ways: by separating the domestic information environment from the principal scope of the Cognitive Resilience Authority, which is foreign-tasked operations; by introducing statutory thresholds that constrain operational response to material with attestable adversary tasking; and by extending ISC oversight to the activities of NSOIT, which is the body with the most direct exposure to the free-expression boundary. These measures will not eliminate the risk. They are intended to constrain it within tolerances that a democratic polity can sustain.

The strongest principled objection to any state architecture in this domain is that no design, however well-constructed, can be trusted to remain within those tolerances over time, and that the prudent default is therefore to keep the state at a distance from the information environment altogether. This paper takes that argument seriously. The empirical answer is that the alternative is not a state-free information environment; it is an information environment shaped by adversary states whose tolerances are demonstrably outside democratic limits. The choice is not between state activity and its absence. It is between bounded, statutorily constrained, parliamentarily overseen state activity and an information environment ceded to actors whose objective is the degradation of democratic decision-making. Recognising that the choice is structured this way does not resolve the principled objection. It does set out the trade-off against which the architecture must be judged.

9.2 Mission creep

A coordinating authority with statutory cross-departmental tasking power could, in time, expand its remit beyond foreign-tasked information operations into adjacent areas — domestic political contestation, lawful but disagreeable speech, and content deemed harmful by the ministerial judgment of the day. This is the risk that produced the original Counter Disinformation Unit’s problems, and it cannot be assumed away by good intentions. The principal mitigations are the statutory definition of remit, ISC oversight and the published Strategy at Recommendation 8.3 — which, by anchoring scope in public doctrine, makes mission creep observable and contestable.

9.3 Politicisation

A capability operating in the cognitive domain will be subject to political contestation regardless of its design. The history of NSOIT illustrates the volatility — operational capacity reportedly fell substantially following sustained criticism from civil society and parliamentarians. A capability that can be reduced by an order of magnitude through public criticism is not, in any meaningful sense, a steady-state capability. The recommendations in Section 8 are designed in part to reduce this volatility by anchoring the capability in published doctrine, statutory thresholds, and parliamentary oversight — converting political contestation into structured argument over identifiable instruments rather than recurring crises over the capability's existence.

9.4 The risks identified here are real and the mitigations proposed are partial. The alternative — the current fragmented architecture, operating without published doctrine, statutory thresholds, or independent oversight and now potentially supplemented by a National Counter Disinformation Centre that does not address the architectural problem — does not avoid these risks. It distributes them among multiple bodies and decision-makers, making them harder to address. The honest argument for the recommendations in this paper is not that they eliminate the structural risks of state activity in the cognitive domain. It is that they make those risks identifiable, contestable and constrained, in a way that the current architecture and the Foreign Affairs Committee’s proposed addition to it do not.


10. Conclusion

10.1 The institutional question is the precondition for the operational, regulatory and ethical questions that will dominate the next phase of UK policy on foreign information manipulation. Getting the institutional design right does not guarantee that the UK response will succeed; getting it wrong guarantees that the response will fail at a greater cost than necessary, against an adversary whose capabilities are already mature and adapting faster than the UK architecture can move.

10.2 The Foreign Affairs Committee has identified the right problem. Its recommendation, however, applies an analogy from cyber defence that does not transfer cleanly to the cognitive domain, is in tension with the testimony of two of its own most authoritative witnesses and rests in part on a Swedish institutional model whose constitutional foundation the UK does not possess. The alternative this paper sets out — a Cabinet Office Cognitive Resilience Authority, a published Strategy, statutory thresholds, ISC oversight and a research function grounded in human-factors science — is offered in the conviction that the design choice is open, that it is being made now and that it is more important than the dominant public proposal acknowledges.

10.3 Emerging Risks Global will continue to contribute to this conversation through engagement with government, parliament, academic partners and the international community working on these questions. We welcome dialogue with parliamentarians, officials, allied governments and civil society organisations whose work intersects with the questions this paper raises.


About the Author

Dr Paul Wood is Chief Executive Officer of Emerging Risks Global. His research examines the human factors that determine institutional and individual resilience under crisis conditions. He has published widely on emerging risks, security architecture and crisis resilience.


This paper is written in the author’s capacity as Chief Executive of Emerging Risks Global. The views expressed are the author’s own and do not represent those of any other organisation with which he is affiliated.
