Law enforcement officials and academic researchers are warning about a fast-moving threat in which online notoriety, rather than any specific ideology, is the motivation. In corners of Discord, Telegram and extremist forums, young users swap spreadsheets filled with the home addresses of journalists they hate alongside favorite memes, turning edgelord posturing into real-life attacks. Experts and scholars have described the phenomenon as nihilistic violent extremism (NVE), and even the FBI has started paying attention to it.
The equation is simple: clout is the reward, violence is the content and virality is the metric of success. What starts as shock-posting often escalates into performative cruelty intended to win approval from peers, accrue followers or gain status within semi-private groups.
Inside Nihilistic Violent Extremism on Social Media
NVE, as researchers at the NYU Center for Business and Human Rights (including Luke Barnes and Mariana Olaizola Rosenblat) describe it, is a violence-first subculture: generally young, typically male, and unmoored from coherent politics. Followers prize transgression, worship former attackers and treat harm as a kind of content-creation challenge.
The FBI has begun citing NVE in investigations and court filings, casting it as a non-ideological threat vector that frustrates traditional counterextremism playbooks.
In the absence of a coherent manifesto or group hierarchy, the "why" is simpler: attention, accrued from communities that gamify cruelty.
From Memes to Crimes: Clout-Chasing Turned Violent
Recent cases illustrate the pipeline from clout-chasing to violence. Investigators have connected school attacks and public assaults to chats in which members kept score of pledges to perform ever more extreme stunts. One network, operating under the rubric "764" and detailed in law enforcement records as well as media reports, reportedly used Discord and Telegram to run a coordinated campaign of harassment and exploitation in which members would "level up" by demonstrating their willingness to cross moral and legal lines.
Nor is this confined to the U.S. From Israel to Europe, security agencies have documented small cells encouraging random street attacks, filmed for in-group adulation.
The footage works as proof-of-work, traded like tokens in a reputation economy that values escalation.
How Clout Becomes a Weapon in Online Extremism
Attention is the currency of the social web, and the most shocking material often travels furthest. Short-form video, livestreams and repost features can turn a dangerous provocation into a wave that reaches millions in minutes. Semi-private spaces handle the organizing; public feeds provide the stage and the applause.
Adolescent exposure is broad. Pew Research Center reports that 95% of American teens use YouTube, 67% are on TikTok and, among those ages 14 to 17, about a third say they are online "almost constantly." That saturation does not cause violence, but it creates fertile soil for copycat dynamics, with spectacle and infamy framed as paths to belonging.
Many mass attackers telegraph their intent, or seek to glorify past perpetrators by posting on the web.
The U.S. Secret Service’s National Threat Assessment Center has cited this as a key point in several reports about prevention. And in NVE spaces, that “leakage” becomes content strategy — an opportunity for peers to egg on the next act.
What the Data Shows About Clout-Driven Violence Online
The scale of the problem is hard to miss, even though individual shooters act for different reasons. The Gun Violence Archive has counted more than 600 mass shootings annually in the United States over the past several years. The Center for Strategic and International Studies has reported growth in plots by lone actors and small cells across the ideological spectrum, a trend reflected in Europol's terrorism situation reports, which identify "ideology-light" violence incubated online.
Three common NVE signatures, according to NYU researchers:
- Rapid radicalization without extensive offline cues
- Status systems that reward boundary-pushing acts of transgression
- Migration to encrypted or semi-private channels where conventional moderation is hobbled
Even arrest or incarceration can be turned into prestige, strengthening the feedback loop.
What Platforms and Policymakers Can Do to Reduce Harm
Safety-by-design measures can blunt the clout reward without turning platforms into censors. Practical steps include throttling sudden virality for brand-new, low-trust accounts; de-amplifying reposts that praise or gamify violence; adding friction screens to searches for attackers' names; and surge-staffing trust and safety teams during crisis windows to catch copycat attempts.
Encrypted and semi-private spaces present real challenges, but businesses can still act on metadata signals, network activity and credible user reports while upholding privacy. Independent audits and transparency reporting — particularly of takedown speed for violent glorification content — would enable researchers to measure the impact.
Public agencies can share clearer NVE typologies and threat indicators with schools and community groups. Red flags include idolizing past attackers as "icons," vying for rank within online forums and escalating stunts staged to impress an audience. Families need clear reporting avenues and crisis support when they flag dangerous content, and educators should incorporate media literacy that demystifies virality and deglamorizes notoriety.
The throughline is simple: in these cliques, memes are choreography and clout is currency. Unless we recalibrate the incentives that turn violence into a road to fame, digital performance will keep spilling over into physical harm. The fix is not a single policy lever; it requires platforms, researchers and law enforcement rowing in the same direction, moving quickly with transparency and accountability.