Weaponized Dissonance: Russian and Iranian Disinformation Targeting the MAGA Movement (2022–2025)
Disinformation as hybrid warfare has become a hallmark of both Russian and Iranian foreign policy. In the United States, the MAGA movement—a heterogeneous coalition of nationalists, populists, libertarians, and Christian conservatives—has become a strategic target due to its cultural influence and political volatility. Recent campaigns by Russia and Iran have exploited online factionalism, amplified conspiracy theories, and hijacked digital narratives to deepen divisions within MAGA communities (Lemmon, 2024).
Russian Strategy: Sophisticated Fragmentation and Narrative Hijacking
Since 2022, the Russian Federation has intensified its digital interference strategy targeting the U.S. far-right, particularly factions within the MAGA movement. This new phase of influence is not characterized by crude meme warfare or overt Kremlin messaging. Instead, it is defined by psychologically attuned fragmentation techniques, with the primary goal of degrading ideological cohesion through deception, mimicry, and engineered contradiction.
A key tactic in this phase is the deployment of “mirrored personas,” detailed by Müller (2024). These are synthetic accounts—often AI-assisted—that convincingly emulate the rhetorical patterns, iconography, and grievances of real MAGA supporters. Rather than reinforcing the movement’s unity, however, these personas subtly subvert core beliefs: some raise doubts about Donald Trump’s commitment to isolationism, while others question the GOP’s loyalty to conservative Christian values or suggest betrayal on Second Amendment rights. The psychological manipulation here is sophisticated: by imitating “insiders,” Russia is able to spark distrust from within, making disinformation appear organic and thus more persuasive.
These tactics are deployed across a multi-platform ecosystem, with TikTok emerging as a key battleground. As shown by Ishchuk et al. (2024), a documented network of over 1,700 Russian-linked accounts has been used to flood conservative algorithmic feeds with emotionally charged, culturally adapted content. The goal is not simply to provoke outrage, but to divide factions within the MAGA base using precision targeting.
This strategy involves cultural and linguistic tailoring, where bots adopt localized speech, region-specific slang, and thematic appeals designed to resonate with distinct MAGA demographics:
Texan militia groups receive hyper-localized messages about border sovereignty, immigration “invasions,” and Second Amendment absolutism.
Christian homeschoolers are targeted with narratives alleging satanic influence in public education, and questioning Trump’s alignment with biblical values.
Rust Belt populists are fed economic nostalgia, laced with conspiracy theories suggesting Trump has abandoned working-class Americans to serve foreign oligarchs or Israeli interests.
These messages are intentionally contradictory across subgroups, sowing suspicion about loyalty and ideological purity within the movement. For instance, one set of content might glorify Trump as the last defender of Western civilization, while another might question his association with “globalists” or accuse him of capitulating to Big Pharma.
What makes this strategy especially effective is algorithmic amplification. Russian bots not only post content but engage in coordinated liking, commenting, and reposting, triggering TikTok’s recommendation system to elevate divisive materials to real users. This gamifies factionalism, encouraging users to publicly declare loyalty to one MAGA interpretation over another—Trump loyalist vs. populist purist, Christian nationalist vs. libertarian constitutionalist.
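How do researchers surface this kind of coordinated boosting? A minimal sketch follows, using invented engagement records and arbitrary thresholds (the WINDOW and MIN_SHARED values are illustrative assumptions, not parameters drawn from any cited study): accounts that repeatedly engage with the same posts within seconds of one another are flagged as candidate coordination pairs, one common heuristic in botnet analysis.

```python
# Minimal sketch with invented data: flag pairs of accounts that repeatedly
# engage with the same posts within seconds of each other. Tight, repeated
# co-engagement is one heuristic signal of coordinated amplification.
from collections import defaultdict
from itertools import combinations

# Each record: (account_id, post_id, timestamp in seconds)
engagements = [
    ("acct_a", "post_1", 100), ("acct_b", "post_1", 104), ("acct_c", "post_1", 109),
    ("acct_a", "post_2", 500), ("acct_b", "post_2", 503), ("acct_c", "post_2", 508),
    ("acct_d", "post_1", 9000),  # an organic-looking latecomer, not flagged
]

WINDOW = 30      # seconds: engagements this close together count as co-engagement
MIN_SHARED = 2   # flag pairs that co-engage on at least this many posts

by_post = defaultdict(list)
for account, post, ts in engagements:
    by_post[post].append((account, ts))

pair_counts = defaultdict(int)
for post, items in by_post.items():
    for (a1, t1), (a2, t2) in combinations(items, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW:
            pair_counts[tuple(sorted((a1, a2)))] += 1

suspicious = {pair: n for pair, n in pair_counts.items() if n >= MIN_SHARED}
print(suspicious)  # {('acct_a', 'acct_b'): 2, ('acct_a', 'acct_c'): 2, ('acct_b', 'acct_c'): 2}
```

In practice, researchers combine signals like this with account-creation patterns and content overlap before attributing activity to a coordinated network; no single heuristic is conclusive.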
Moreover, the visual grammar of this content is platform-native. Instead of overt political logos or formal arguments, Russian-linked videos use:
Patriotic music overlays with slow-motion footage of burning flags or weeping veterans.
Glitch aesthetics that imply suppressed truths and secret knowledge.
First-person “testimony” formats, often scripted and performed by actors or deepfake personas.
This strategy leads to several cascading effects:
Narrative Overload – The sheer volume of conflicting interpretations of MAGA ideology erodes any shared definition or cohesion.
Paranoia Feedback Loops – Users become increasingly suspicious of one another, accusing fellow MAGA supporters of being "grifters," "Zionist plants," or “controlled opposition.”
Radical Fringe Migration – Isolated or disillusioned users drift into more extreme ideological camps, including white nationalism, accelerationism, or Q-adjacent conspiracy forums.
The result is a MAGA ecosystem that is ideologically dense but structurally brittle—high in emotional engagement but low in coordination potential. From a geopolitical standpoint, this serves Russia’s objective of weakening U.S. domestic unity and complicating foreign policy consensus within the Republican base.
Thus, Russia’s strategy post-2022 is best understood not as a direct assault on U.S. elections, but as a long game of ideological sabotage, executed through digital mimicry and cultural fragmentation. The goal is not to convince MAGA supporters to love Moscow, but to render them incapable of trusting each other, much less organizing around shared political goals.
Iranian Influence: Moral Framing and Ideological Exploitation
While Iran's cyber influence operations have historically received less attention than Russia’s, recent evidence shows a marked escalation post-2022, especially in operations targeting the U.S. political right. Iran's approach to disinformation is subtler but no less potent: rather than flooding channels with overt propaganda, it strategically leverages moral framing and ideological mimicry to exploit fissures within the MAGA movement.
According to Akbarzadeh et al. (2025), Iran’s flagship digital influence initiative—the “Web of Big Lies” program—has relied on the creation of fake Christian fundamentalist personas, complete with AI-generated avatars, Bible-verse memes, and anti-globalist talking points. These personas infiltrate right-wing digital spaces with tailored content that praises nationalism and Christian values while seeding suspicion toward pro-Israel factions within MAGA ranks.
The core narrative is crafted to frame U.S. support for Israel as betrayal, not only of American interests but of Christian moral values—suggesting that Zionist agendas exploit the American religious right. Such framing plays on long-standing fringe conspiracy tropes about dual loyalties and foreign influence, but it is delivered in a tone that mimics MAGA vernacular: patriotism, Christian eschatology, and anti-globalist fear.
Importantly, this tactic weaponizes ideological authenticity—posing moral questions like:
“Can you be truly pro-America and blindly support Israel’s wars?”
“Does real Christian faith mean unquestioning support for Zionism?”
“Why do our taxes fund a foreign state while American borders collapse?”
Rather than making direct claims, Iranian accounts often pose rhetorical questions, share unverifiable “testimonies,” or post manipulated clips of MAGA influencers appearing to contradict conservative orthodoxy. The goal is not persuasion, but moral destabilization—creating a sense that parts of the MAGA base have strayed from the "true path."
This operation is amplified by covert partnerships with Russian influence campaigns, particularly following the 2022 escalation of the war in Ukraine. Sánchez (2025) notes that Iranian cyber units have adopted Russian-style tactics such as coordinated “content swarming” across alternative platforms like Truth Social, Gab, Gettr, and Rumble. These campaigns involve synchronized posts by seemingly unrelated accounts to create the illusion of widespread consensus around divisive issues—such as U.S. military aid to Israel, religious infiltration by Zionists, or moral decay attributed to globalist elites.
This cross-platform symmetry—with near-identical memes, hashtags, and talking points appearing across Iranian-backed and Kremlin-linked networks—suggests operational convergence. While the two states may pursue different geopolitical ends, their disinformation goals in the U.S. converge: fracture MAGA from within, delegitimize political institutions, and promote populist distrust.
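One rough way to quantify such symmetry is to measure textual overlap between posts collected from different platforms. The sketch below is illustrative only, with invented example posts; it uses word-shingle Jaccard similarity, a generic near-duplicate heuristic rather than a method attributed to any of the cited studies.

```python
# Illustrative only (invented example posts): estimate cross-platform content
# symmetry with word-shingle Jaccard similarity, a generic near-duplicate check.
import re

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles (overlapping word n-grams) in a post."""
    words = re.sub(r"[^\w\s#]", "", text.lower()).split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Share of shingles the two posts have in common."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

post_platform_x = "Our taxes fund a foreign state while American borders collapse #AmericaFirst"
post_platform_y = "our taxes fund a foreign state while American borders collapse!! #WakeUp"

score = jaccard(shingles(post_platform_x), shingles(post_platform_y))
print(f"similarity: {score:.2f}")  # 0.80, a near-duplicate despite cosmetic edits
```

High pairwise similarity across nominally unrelated accounts and platforms is suggestive of coordinated seeding, though it can never establish attribution on its own.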
Iran’s approach is also more theologically attuned than Russia’s. Messaging often includes Quranic or Biblical juxtapositions, historical references to the Crusades or colonialism, and subtle antisemitic dog whistles—designed to resonate with Christian nationalist audiences while deflecting overt blame. This “faith-first” strategy is highly effective in low-moderation zones like Telegram channels, decentralized podcasting, and anonymous newsletter circuits.
In sum, Iran’s influence model combines:
Moral mirroring – Adopting MAGA language about faith, family, and freedom.
Narrative distortion – Reframing support for traditional MAGA allies (e.g., Israel) as betrayal.
Platform asymmetry – Operating in unregulated digital enclaves.
Operational synergy with Russia – Coordinating themes, content drops, and amplification tools.
This strategy has intensified since 2023, particularly during the lead-up to the 2024 U.S. elections, when MAGA-aligned communities became increasingly polarized over foreign policy, Christian Zionism, and what constitutes “true” conservatism. Iranian disinformation has helped accelerate this drift into factional infighting, often without its hand ever becoming visible.
Convergent Tactics: Joint Influence and Identity War
The evolving cyberstrategies of Russia and Iran reveal a deepening convergence in disinformation methodology, even as their geopolitical agendas diverge. Both states increasingly engage in identarian warfare—a form of cognitive manipulation that weaponizes group identity, not just ideology. Their joint objective is to destabilize political cohesion within the MAGA movement by amplifying its internal contradictions.
Investigative analyses and recent empirical studies show that Russian and Iranian operatives have adopted microfactional targeting techniques to splinter MAGA from within. They create and disseminate tailored messages to distinct subgroups—anti-interventionists, evangelicals, white working-class nationalists, libertarian crypto-enthusiasts, and others—using memetically adapted content that speaks to each group’s values but contradicts other subgroups' beliefs.
For example, anti-interventionist audiences are fed isolationist content that depicts foreign aid—especially to Israel or Ukraine—as evidence of GOP betrayal and neoconservative globalist agendas. Simultaneously, evangelical MAGA users are bombarded with pro-Israel, End Times prophecy-laced narratives, framing U.S. military support for Israel as biblically mandated. Both campaigns are labeled “patriotic,” but their ideological foundations are mutually exclusive, creating cross-factional animosity and cognitive dissonance within MAGA communities.
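As a concrete illustration of how an analyst might catalogue this kind of engineered contradiction, the sketch below uses hypothetical annotations (the network and segment labels are invented) and simply flags issues on which content attributed to the same network pushes opposite stances to different audience segments.

```python
# Hypothetical analyst annotations (network and segment labels are invented):
# flag issues on which content attributed to the same network pushes opposite
# stances to different audience segments.
from collections import defaultdict

# Each record: (network, target_segment, issue, stance)
observations = [
    ("network_z", "anti_interventionists", "aid_to_israel", "oppose"),
    ("network_z", "evangelicals",          "aid_to_israel", "support"),
    ("network_z", "rust_belt_populists",   "tariffs",       "support"),
]

stances = defaultdict(set)
for network, segment, issue, stance in observations:
    stances[(network, issue)].add((segment, stance))

for (network, issue), pairs in stances.items():
    if len({stance for _, stance in pairs}) > 1:  # more than one stance = contradiction
        print(f"{network} pushes contradictory stances on '{issue}': {sorted(pairs)}")
```

The point of such an exercise is not precision but pattern recognition: a single network arguing both sides of the same issue to different audiences is a hallmark of fragmentation campaigns rather than organic advocacy.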
This coordinated polarizing strategy is what Burchell et al. (2025) define as the “weaponization of identity under marketcraft”—the idea that identity traits (religion, class, race, veteran status, and regional affiliation) are now monetizable vectors for information warfare. Platforms function like marketplaces for cultural symbols, and Russia and Iran exploit these systems by injecting divisive content that appears indigenous but is algorithmically seeded to provoke conflict.
Importantly, this manipulation transcends policy disagreement. It focuses on emotional triggers and cultural belonging:
Veteran betrayal: Russian and Iranian actors have shared fabricated claims that GOP leaders abandoned combat veterans, intentionally spread diseases via VA hospitals, or redirected VA funding to refugees. These messages often include emotionally charged visuals—crying soldiers, empty flag-draped coffins, or altered footage of political figures mocking military service.
Race-based narratives: These include conspiracies alleging MAGA figures are compromising “white heritage” through alliances with multiculturalism or foreign-born donors. These messages are deployed both subtly (e.g., nostalgic memes about “lost America”) and overtly (e.g., AI-generated white nationalist quotes attributed to GOP leaders to stoke intra-party racial tension).
Religious purity tests: This strategy is especially potent within evangelical MAGA groups. Disinformation agents post AI-edited clips or fake tweets showing MAGA influencers making theologically suspect statements—questioning the Bible’s inerrancy, embracing Catholicism, or softening positions on abortion or LGBTQ issues. The aim is to delegitimize figures as "false prophets", sowing doctrinal mistrust that undermines religious-political cohesion.
The 2024 U.S. election cycle served as an operational crescendo for these efforts. As Lemmon (2024) documented, MAGA-aligned spaces on platforms like Rumble, Truth Social, and Telegram saw a dramatic spike in cross-faction accusations—calling out others as “controlled opposition,” “Zionist stooges,” or “Putin apologists.” What many users believed were grassroots ideological feuds were in fact engineered skirmishes, guided by foreign actors exploiting platform algorithms to maximize divisiveness.
Furthermore, disinformation operatives increasingly use AI-generated personas—believable yet synthetic characters, often claiming to be veterans, single mothers, pastors, or rural small-business owners. These personas engage in thread hijacking on discussion forums and livestream chats, posing as authentic voices of MAGA disillusionment or radical orthodoxy, thereby redrawing the movement’s boundaries from within.
The strategic convergence between Russian and Iranian operators is not merely tactical but thematic. Despite differing regional goals, both states benefit from a weakened and internally incoherent American right. By using identity as the terrain of contestation, they bypass traditional defenses like fact-checking or platform bans, embedding their influence within the very architecture of belief in MAGA digital subcultures.
This emerging battlefield is cultural, emotional, and intra-tribal—and it is arguably more destabilizing than overt ideological propaganda.
The Weaponization of Identity: Disinformation as Psychological Fragmentation
Recent investigations confirm that Russian and Iranian disinformation architects have shifted their strategies from mass persuasion to precision fragmentation, using a method increasingly referred to as identarian warfare. This tactic doesn't just manipulate information—it manipulates identity affiliations themselves, targeting specific factions within the MAGA movement to turn them against one another in an effort to splinter cohesion and trust.
At the core of this tactic is demographic microtargeting, whereby disinformation operatives push isolationist and anti-globalist propaganda—often drawing from paleoconservative rhetoric—to anti-interventionist MAGA subsets, such as libertarians or “America First” nationalists. Simultaneously, they deliver pro-Israel, anti-Iran, and pro-military messaging to evangelical Christian factions, exploiting eschatological themes and aligning U.S. foreign policy with biblical prophecy. These contradictory narratives are not merely ideological—they are designed to generate internal mistrust and emotional dissonance, weakening the MAGA coalition from within (Burchell et al., 2025).
Burchell and colleagues frame this as the "weaponization of identity under marketcraft"—a process by which cultural beliefs are commodified into units of information warfare. Through this lens, religious belief, regional affiliation, veteran status, and racial identity are not just audience attributes but attack surfaces—vulnerabilities to be exploited by content tailored to provoke shame, fear, or betrayal.
During the 2024 U.S. election cycle, these strategies reached a new level of intensity. As Lemmon (2024) documented, botnets and microinfluencer farms pushed narratives of veteran betrayal, suggesting that MAGA politicians had abandoned U.S. troops overseas or ignored the plight of veterans at home. Simultaneously, race-based identity tropes were deployed—such as memes accusing MAGA leaders of selling out white working-class Americans to appease "globalist elites" or minorities.
A particularly insidious tactic involved the deployment of “religious purity tests.” These campaigns questioned whether prominent MAGA figures were “true Christians,” using out-of-context quotes, digitally altered videos, and AI-generated sermons to create intra-faith suspicion. This sowed division within Christian nationalist factions that had previously been reliable Trump supporters.
What distinguishes identarian warfare from traditional political disinformation is its emphasis on fragmentation over conversion. The goal is not to sway MAGA supporters toward a Russian or Iranian worldview—but to fracture their internal solidarity and amplify paranoia toward adjacent factions. This process is self-reinforcing: the more the movement fragments, the easier it is for foreign actors to push increasingly extreme content to isolated subgroups.
In essence, identity becomes the payload, not just the target, and disinformation becomes a tool not of persuasion but of engineered collapse from within.
Conclusion
From TikTok botnets to Christian nationalist persona farms, Russia and Iran are deploying high-resolution, adaptive influence operations that exploit every available cultural, emotional, and technological vulnerability within the MAGA movement. These campaigns are not merely informational—they are psychosocial weapons, designed to fracture identity groups, provoke ideological cannibalism, and render a once-cohesive political bloc incapable of collective action. What makes these operations especially dangerous is their invisibility to traditional defenses: they do not always spread lies but rather distort truths, manipulate sentiment, and exploit internal contradictions.
By combining identity polarization, financial opacity through cryptocurrency and shell groups, and emotional targeting (veterans, religion, race, betrayal), these campaigns operate as a kind of information-based sabotage. Their goal is not to convert but to corrode—to take a powerful political movement and turn its strength against itself. And so far, they’re succeeding.
If we do not develop the analytical and emotional literacy to detect these tactics in real time, we risk watching powerful social and political coalitions implode from within—not because of ideological collapse, but because their belief systems were hijacked and slowly weaponized against themselves.
Learning to spot these tactics before they metastasize is not optional. It is the new baseline of digital citizenship.
References (2022–2025)
Lemmon, J. (2024). The Weaponization of Fiction and Truth: Disinformation in the 2024 Election.
Akbarzadeh et al. (2025). The Web of Big Lies: State-Sponsored Disinformation in Iran.
Müller, M. M. (2024). Looking Doppelganger: Evolving State-Sponsored Disinformation Tactics.
Ishchuk et al. (2024). Activism in Cyberspace as a Hybrid Threats Counter.
Sánchez, J. L. M. (2025). How Disinformation Ruins Public Diplomacy.
Burchell et al. (2025). Beyond Disinformation: Identarian Narratives Meet Lawfare.
Soulé, C. (2025). Prevention or Provocation? Disinformation in the Israel-Iran Context.
Latif et al. (2025). Digital Diplomacy and Disinformation: Reshaping Global Public Opinion.