
TikTok, Algorithmic Emotion, and the New Propaganda Battlefield

Michael Morgan
January 27, 2026

TikTok has quietly become one of the most influential platforms in the contemporary information environment, not because it promotes a particular ideology, but because it systematically prioritizes emotional engagement. More than a venue for entertainment or cultural trends, the platform operates as an algorithmic environment that reinforces affective responses through repeated exposure and feedback. During the Israel–Hamas war, TikTok’s design offered a clear illustration of how modern propaganda often functions less through explicit persuasion than through the cultivation and amplification of emotionally primed attention.

This analysis draws on an original survey of 193 U.S. college-aged TikTok users, recruited in person from four major universities in Michigan during the fall of 2024. All respondents were between 18 and 30 years old and reported using TikTok as a primary source of news and information. Rather than measuring ideological alignment or partisan identification, the survey focused on how participants emotionally interpreted short-form political and conflict-related content, capturing affective reactions such as empathy, anger, fear, and grief.

The results suggest that classical “conversion” — the adoption of new ideological identities or partisan positions — via TikTok remains relatively uncommon, even during periods of intense conflict. What changed most consistently across respondents was emotional orientation rather than ideological belief. Participants who viewed TikTok as a credible source of news reported significantly stronger affective reactions to content about the Israel–Hamas war, without corresponding shifts in their declared political allegiance. By treating perceived credibility, emotional response, and political orientation as analytically distinct dimensions, the survey allows these associations to be examined without assuming direct causation.

TikTok, therefore, does not function primarily as an ideological persuader. It does not instruct users what to think so much as condition how they feel. In the contemporary information environment, that distinction matters: Emotional orientation increasingly precedes belief, shaping the terrain on which political judgments are later formed. While a sample drawn from four universities in one state cannot support claims about TikTok’s effects across all user populations, it does illuminate how emotionally engaged college-aged users experience and interpret conflict content, an analytically relevant group given their outsized presence on the platform.

A Changed Information Environment

Among Americans under 30, TikTok has become one of the dominant entry points to world events, as recent surveys from the Pew Research Center (“8 facts about Americans and TikTok”) and the Reuters Institute for the Study of Journalism demonstrate. Short-form social video now rivals or surpasses television and print as a primary source through which many younger users encounter major international stories.

TikTok strips nearly all of those traditional editorial filters out of its “For You” feed, which is optimized for real-time engagement. According to TikTok’s own explanation of its recommender system, the platform tracks viewer behavior at the micro level — replays, likes, comments, and other interaction signals — to infer interests and adjust recommendations. These behavioral micro-signals allow the platform to approximate emotional responses with remarkable granularity and to recalibrate content delivery instantaneously.
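To make the mechanism concrete, the sketch below shows, in Python, how a recommender of this general type might fold micro-signals into a ranking score and a running interest profile. The signal names, weights, and update rule are illustrative assumptions, not TikTok’s actual implementation; the point is only that small behavioral cues, aggregated continuously, are enough to steer what a user sees next.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    """Per-video micro-signals of the kind the platform describes tracking."""
    watch_fraction: float  # share of the clip actually watched (0.0-1.0)
    replays: int           # number of re-watches
    liked: bool
    commented: bool
    shared: bool

# Hypothetical weights -- illustrative only, not TikTok's actual values.
WEIGHTS = {"watch_fraction": 1.0, "replays": 0.8, "liked": 0.5,
           "commented": 0.7, "shared": 1.2}

def engagement_score(s: EngagementSignals) -> float:
    """Collapse micro-signals into one score used to rank similar content."""
    return (WEIGHTS["watch_fraction"] * s.watch_fraction
            + WEIGHTS["replays"] * s.replays
            + WEIGHTS["liked"] * s.liked
            + WEIGHTS["commented"] * s.commented
            + WEIGHTS["shared"] * s.shared)

def update_interest(profile: dict, topic: str, score: float, lr: float = 0.1) -> None:
    """Nudge the inferred interest in a topic toward the latest engagement score."""
    prior = profile.get(topic, 0.0)
    profile[topic] = prior + lr * (score - prior)

# Example: one strongly engaged-with conflict clip shifts the profile immediately.
profile: dict = {}
clip = EngagementSignals(watch_fraction=1.0, replays=2, liked=True, commented=False, shared=True)
update_interest(profile, "conflict_footage", engagement_score(clip))
```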

During the initial days of the Israel–Hamas war, emotionally charged video clips — many unverified or stripped of strategic context — spread through TikTok feeds within hours. Graphic depictions of civilian suffering circulated globally well before professional journalism reached a consensus on verification. Complex geopolitical events were transformed into easily consumable moral narratives, frequently framed as simplified binaries. Those short clips seldom delivered factual depth, but their visual immediacy established emotional baselines. By the time detailed reporting gained visibility, many viewers had already accepted emotional framing that shaped how subsequent facts were interpreted.

In this ecosystem, influence may no longer require centralized coordination, sustained ideological messaging campaigns, or disciplined propaganda networks. Influence flows through platform design. Content succeeds when it stimulates emotion. Visibility follows engagement. Users themselves become distribution nodes, amplifying emotionally resonant narratives in pursuit of identity expression and social validation. Today, propaganda emerges less from organized state broadcasting and more from engagement architectures that monetize emotional arousal.

Algorithmic Conditioning of Emotion

Research across psychology, behavioral economics, and political communication has long established that emotional arousal increases heuristic judgment and reduces reflective evaluation. These dynamics are not new. What remains underexplored, however, is how platform-level recommendation systems operationalize these well-known cognitive tendencies at scale — and how users experience that process subjectively during real-world geopolitical crises.

The contribution of this study lies not in demonstrating that emotion shapes judgment, but in showing how algorithmic feedback loops condition emotional orientation independently of ideological belief. TikTok’s recommender system does not merely expose users to emotional content: It repeatedly confirms and amplifies specific affective responses through engagement-based personalization. Emotional reactions serve both as input and as output in content selection.

Unlike traditional propaganda or persuasion models, which assume intentional messaging and belief change, the mechanism observed here is structurally agnostic to ideology. Emotional responses — empathy, anger, grief, outrage — are reinforced without reference to their political direction. Over repeated cycles, this produces affective convergence: Feeds become emotionally homogeneous even when users do not seek ideological reinforcement.
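The toy simulation below illustrates what affective convergence of this kind could look like in the abstract. It assumes a single emotional-intensity score per item, an engagement probability that rises with intensity, and a recommender that shifts its sampling toward whatever was engaged with; all three assumptions are mine, not a description of TikTok’s system. Run repeatedly, the feed drifts toward emotionally intense content without any reference to political direction.

```python
import random

def simulate_feed(rounds: int = 20, feed_size: int = 50, seed: int = 0) -> list:
    """Toy model: engagement-based reinforcement drifts a feed toward intensity."""
    rng = random.Random(seed)
    bias = 0.5  # feed's current center of emotional intensity (0 calm, 1 intense)
    history = []
    for _ in range(rounds):
        # The feed samples items whose intensity clusters around the current bias.
        feed = [min(1.0, max(0.0, rng.gauss(bias, 0.2))) for _ in range(feed_size)]
        # Engagement probability rises steeply with intensity -- a stand-in for the
        # well-documented pull of emotionally arousing content on attention.
        engaged = [x for x in feed if rng.random() < x ** 2]
        if engaged:
            # The recommender updates its estimate toward what was engaged with,
            # with no reference to the items' political direction.
            bias = 0.8 * bias + 0.2 * (sum(engaged) / len(engaged))
        history.append(bias)
    return history

print(simulate_feed())  # the bias climbs: the feed grows emotionally homogeneous
```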

Crucially, users do not report consciously curating these environments. Emotional narrowing emerges passively through algorithmic optimization rather than active preference signaling. This distinguishes algorithmic emotional conditioning from classical echo chambers or selective exposure models, which rely on deliberate choice. In this context, emotion functions less as a response to belief and more as a pre-cognitive organizing principle for information exposure.

The novelty of this dynamic lies in sequencing rather than intensity. Emotional orientation is conditioned before ideological alignment and without persuasive intent. This study, therefore, reframes emotional engagement not as a byproduct of influence but as an early-stage environmental condition that shapes how later political information is interpreted, evaluated, and circulated.

Research Design and Findings

This study was designed to examine emotional engagement as a distinct analytical dimension rather than as a proxy for ideology or persuasion. Rather than asking whether TikTok changes what users believe, the survey isolates how users feel in response to political and conflict-related content, and how those affective responses interact with perceived credibility, engagement behavior, and narrative exposure.

The central empirical contribution is the identification of emotional pre-alignment as a measurable phenomenon. Across respondents, ideological self-placement remained largely stable regardless of time spent on the platform. What varied consistently was emotional intensity. Participants who perceived TikTok as a credible source of information reported significantly stronger affective reactions to conflict content, even in the absence of ideological change.

This distinction matters because much existing research assumes emotional engagement either reflects or produces belief change. The findings here complicate that assumption. Emotional response, credibility perception, and ideological identity emerge as analytically separable variables. Emotional intensification was associated with increased attention, discussion, and content sharing — but not with conversion or radicalization.
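Analytically, keeping these dimensions separable amounts to estimating the credibility–affect association while holding ideology constant. A minimal sketch of that kind of check, using hypothetical variable names and a hypothetical data file rather than the study’s actual instrument, might look like this:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical column names standing in for the survey's measures:
#   affect      -- self-reported emotional-reaction intensity to conflict content
#   credibility -- perceived credibility of TikTok as a news source
#   ideology    -- self-placed political orientation (categorical)
df = pd.read_csv("survey_responses.csv")  # hypothetical file, not the study's data

# Regressing affect on credibility while holding ideology constant keeps the three
# dimensions analytically distinct: a credibility coefficient that remains significant
# with ideology controlled is an association, not evidence of ideological conversion.
model = smf.ols("affect ~ credibility + C(ideology)", data=df).fit()
print(model.summary())
```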

Notably, engagement followed emotion rather than ideology. Participants who experienced stronger emotional reactions were more likely to seek additional information, discuss the conflict with peers, and amplify emotionally resonant narratives. These behaviors occurred across ideological categories, suggesting that emotional conditioning operates prior to and independently from partisan alignment.

Taken together, these findings shift the analytical focus from persuasion outcomes to exposure conditions. The study does not demonstrate that emotional engagement causes later ideological change. It demonstrates that emotionally reinforced attention shapes the informational environment in which political judgments are later formed.

Over time, many respondents reported increasingly repetitive or emotionally consistent exposure to content. Despite awareness of personalization bias and active efforts to encounter alternative perspectives, emotional reinforcement loops reduced narrative diversity. Competing viewpoints became algorithmically less visible. Emotional consistency functioned as a form of informational continuity, in which repeated affective signals served as a substitute for evidentiary validation. Across these findings, the dominant observed mechanism was not persuasion but emotional pre-alignment: a measurable shift in affective orientation that occurs prior to, and independently from, ideological commitment. Moral sympathies, grievance frameworks, and outrage attribution were conditioned psychologically before ideological positions solidified.

These findings do not demonstrate a causal pathway from emotional engagement on TikTok to political behavior, radicalization, or strategic outcomes. They do, however, identify a consistent pattern of emotionally intensified attention that precedes ideological change and shapes how users interpret, evaluate, and circulate political information.

The Emerging Homeland Security Challenge

The findings presented here do not demonstrate that TikTok exposure leads to radicalization, coordinated influence operations, or measurable shifts in political behavior. Nor do they establish the presence of adversarial information operations during the period studied. What the survey does identify is a recurring pre-ideological condition: emotionally intensified attention to conflict narratives that emerges independently of ideological commitment.

From a homeland security perspective, this distinction matters. Much existing analysis of online influence focuses on overt persuasion, recruitment, or mobilization. The present findings suggest that an earlier and more diffuse process may be unfolding in parallel — one in which emotional orientation is conditioned prior to, and separately from, ideological alignment. This does not constitute evidence of malign activity, but it does describe a structural vulnerability within engagement-optimized platforms.

Emotional pre-alignment does not itself threaten democratic stability or national security. However, when emotionally resonant narratives — particularly those emphasizing grievance, moral absolutism, or victimhood — become familiar and repeatedly reinforced, they may shape how subsequent political information is interpreted and circulated. The risk, therefore, lies not in emotion as such, but in the possibility that emotionally conditioned audiences may be more receptive to later persuasive efforts should coordinated influence campaigns seek to exploit these dynamics.

This study does not claim that such exploitation is occurring, nor that emotional engagement reliably produces downstream political effects. It does suggest that emotionally intensified attention may represent an early-stage influence condition that is analytically distinct from persuasion, recruitment, or mobilization, and therefore not well captured by frameworks focused exclusively on ideological outcomes.

The Intelligence Blind Spot

Contemporary intelligence and security monitoring systems are primarily designed to identify ideological signals: explicit extremist narratives, recruitment rhetoric, mobilization cues, and operational indicators. These systems are effective at detecting overt persuasion and coordinated messaging, but they are less well suited to observing shifts in emotional orientation within otherwise legitimate public discourse.

The survey does not establish that intelligence organizations are overlooking active influence operations, nor does it suggest that emotional engagement should be treated as an indicator of malign intent. Instead, it highlights a potential analytical gap between when influence-related dynamics begin and when they become visible to existing detection frameworks. Emotional intensification, manifesting through affective framing, grievance amplification, or moral absolutism, may precede ideological articulation by weeks or months, if it develops into ideology at all.

Importantly, emotional pre-alignment is not inherently pathological. It emerges through lawful expression, authentic user behavior, and platform-level engagement optimization. As such, it does not register cleanly as a security signal and should not be treated as one. Its relevance lies in timing rather than threat classification: By the point at which explicit ideological or mobilization indicators appear, emotionally reinforced narrative communities may already be well established.

This temporal sequencing suggests a limitation, rather than a failure, of existing monitoring approaches. Systems optimized to detect ideology may identify influence effects later in their development, after emotional orientations have already shaped how information is evaluated and shared. Recognizing this sequencing does not require expanding surveillance or redefining dissent as risk. It requires conceptual clarity about what current tools are designed to see, and what they are not.

Strategic Risk at Scale

Algorithmic amplification may reduce the cost and barriers associated with influence operations, enabling smaller actors to reach large audiences through emotionally optimized content. Studies of TikTok’s personalization architecture, including recent computer science analyses of its For You feed, indicate that engagement engines tend to privilege intensity and virality over accuracy or nuance. The survey does not directly test these dynamics. Nevertheless, its findings suggest that emotionally personalized ecosystems may limit the effectiveness of traditional counter-influence tools, such as message rebuttal or official narrative competition, when persuasion flows through individualized affective pathways rather than centralized information channels.

Building Democratic Resilience

If emotional pre-alignment shapes how political information is encountered and interpreted, then democratic resilience efforts must move beyond correcting factual inaccuracies alone. The findings presented here do not call for expanded surveillance, content suppression, or ideological regulation. Instead, they point toward modest but meaningful adjustments in how analysts, institutions, and platforms conceptualize and respond to emotionally conditioned information environments.

For analysts and intelligence practitioners, the implication is not to treat emotional expression as a security indicator, but to recognize affective escalation as contextual information. Open-source intelligence workflows could incorporate non-ideological affective signals — such as sudden surges in emotionally saturated sharing, grievance amplification, or moral absolutist framing — as background indicators of emerging narrative environments. These signals would not constitute evidence of malign activity, but they could help analysts understand when emotionally reinforced attention shapes information exposure before explicit ideological articulation.
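As a purely illustrative example of what a non-ideological affective background indicator could look like in an open-source workflow, the sketch below flags days on which sharing of emotionally saturated posts surges well above its recent baseline. The input data, the upstream affect labeling, and the threshold are all assumptions; the sketch performs the surge detection and nothing more.

```python
from statistics import mean, stdev

def affect_surge_days(daily_counts, window: int = 7, threshold: float = 3.0):
    """
    Flag days on which sharing of emotionally saturated posts jumps well above its
    recent baseline. `daily_counts` is a hypothetical series of daily counts of posts
    on a monitored topic that an upstream classifier labeled high-arousal; this sketch
    handles only the surge detection, not the labeling.
    """
    flagged = []
    for day in range(window, len(daily_counts)):
        baseline = daily_counts[day - window:day]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue
        # Simple z-score rule: flag days more than `threshold` standard deviations
        # above the trailing-week average.
        if (daily_counts[day] - mu) / sigma > threshold:
            flagged.append(day)
    return flagged

# Example with made-up counts: a quiet baseline followed by a sharp spike on day 9.
print(affect_surge_days([12, 10, 11, 13, 12, 9, 11, 10, 12, 140]))
```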

At the institutional level, resilience efforts would benefit from integrating emotional literacy into existing media and information literacy programs. Current initiatives often emphasize source verification and the detection of misinformation. Complementing these approaches with training on affective manipulation — such as how validation-seeking behavior, outrage amplification, and algorithmic reinforcement operate — would equip users to recognize emotional conditioning without framing emotion itself as suspect. The objective is not emotional detachment, but emotional awareness.

Platforms likewise need not adjudicate political content or intent to mitigate emotional amplification effects. Design interventions that introduce reflective friction — such as prompts encouraging users to consider context before resharing emotionally charged material or greater transparency around why particular content is being recommended — could modestly disrupt reinforcement loops while preserving user autonomy. Independent auditing of recommendation systems, focused specifically on affective amplification patterns rather than ideological outcomes, would further support accountability without expanding content moderation mandates.

Taken together, these measures do not promise to neutralize influence or eliminate emotional engagement from political discourse. Nor should they. Democratic politics is inherently emotional. What they offer is a way to reduce the extent to which algorithmically optimized emotional intensity shapes political attention without users’ awareness. Strengthening democratic resilience, in this sense, means preserving citizens’ capacity to recognize when how they feel is being systematically conditioned, even when what they believe has not yet changed.

The New Propaganda Domain

TikTok illustrates a broader transformation in the mechanics of propaganda. Persuasion in the digital age appears increasingly emotional rather than doctrinal. Algorithms do not instruct beliefs; they cultivate affective orientations. Drawing on original survey data collected from U.S. college-aged TikTok users during the Israel–Hamas war, this research suggests, rather than proves, that emotional conditioning may function as an early-stage influence mechanism that existing counterpropaganda and intelligence models are not well designed to observe. Emotional alignment may shape the psychological terrain on which later persuasion operates, even in the absence of immediate ideological change.

Democratic stability, therefore, depends not only on protecting informational accuracy but on recognizing how emotional environments condition political judgment. The emotional domain of public discourse has become increasingly contested, and understanding its dynamics is essential to safeguarding democratic cognition in the contemporary information landscape.


Michael Morgan, Ph.D., is an independent researcher whose work focuses on political communication, algorithmic media systems, and influence dynamics in contemporary conflict. His research bridges social science and national security, with particular attention to how emotional engagement shapes political judgment online.

Image: Solen Feyissa via Wikimedia Commons
