The Crank Singularity: Why Human Nature and Social Media Don’t Mix
When the Spotlight Always Wins, Truth Loses Without a Fight
Here’s my working theory: a certain percentage of the human population are just cranks. Not in a moral sense—just temperamentally disposed to obsessive patterns, conspiratorial frameworks, or a kind of performative contrarianism that thrives on attention. These people have always existed. In every village, every workplace, every family gathering, there’s been that person who sees patterns others miss, who knows what “they” don’t want you to know, who has the real story behind the official story.
The difference now is that the filtering systems that once kept them at the fringes have collapsed—and social media turned their dysfunction into content. What we’re living through isn’t just a media failure or an information crisis. It’s what I call the Crank Singularity: the point where cranks stopped being background noise and became the dominant signal in our epistemic environment.
And the uncomfortable truth is: cranks are entertaining. They’re confident. They offer clarity in a world of uncertainty. They generate a kind of narrative heat that institutional discourse, with all its caution and contingency, simply cannot match. In an attention economy where engagement is everything, the crank becomes the epistemic lighthouse—not because they’re right, but because they’re bright.
The Great Filter Collapse
To understand how we got here, we need to recognize what we’ve lost. For most of human history, cranks were naturally filtered by the friction of communication itself. To spread ideas, you needed institutions—publishers, broadcasters, universities, professional networks. These weren’t perfect filters, and they could fail in dangerous ways, excluding important voices and protecting powerful interests. But they offered ballast. They created friction that separated signal from noise, expertise from opinion, reasoned argument from emotional manipulation.
The collapse happened in stages, each one punching a bigger hole in the filtering tent.
Talk radio was the first breach. In the 1980s, the elimination of the Fairness Doctrine combined with AM radio’s search for profitable content created space for hosts like Rush Limbaugh to build audiences around pure opinion and emotional provocation. Suddenly you didn’t need institutional credentials or editorial oversight to reach millions of people. You just needed to be entertaining enough to hold attention through commercial breaks.
Cable news widened the breach. CNN’s 24-hour format created an infinite demand for content, which Fox News learned to fill with opinion and outrage rather than expensive reporting. The model was brilliant in its simplicity: instead of gathering news, manufacture the news by having strong opinions about it. Instead of informing audiences, build audiences by confirming what they already believed.
Social media blew the tent wide open. Platforms like Facebook and Twitter eliminated the last barriers to mass communication. Suddenly anyone with a smartphone could reach global audiences. The algorithms that determined what people saw weren’t programmed to promote truth or accuracy—they were programmed to maximize engagement. And engagement, it turned out, was maximized by content that provoked strong emotional reactions: outrage, fear, tribal solidarity, and the intoxicating feeling of possessing secret knowledge.
Each stage of this collapse followed the same logic: remove friction from communication, and the most emotionally stimulating content wins. The cranks didn’t get smarter or more numerous—they just got louder and more visible, until their signal drowned out everything else.
The Psychology of Crank Appeal
But why are cranks so compelling in the first place? Why do people gravitate toward conspiracy theories and fringe explanations when more credible information is readily available?
The answer lies in understanding what cranks offer that institutional discourse cannot: complete narrative coherence. Academic papers hedge their conclusions. Journalists qualify their reporting. Scientists acknowledge uncertainty. But cranks offer total explanations for complex phenomena. They provide heroes and villains, clear cause-and-effect relationships, and the satisfying sense that everything connects to everything else.
Consider how QAnon functioned as a meaning-making system. It didn’t just explain one event or phenomenon—it provided a unified theory of everything. Politics, entertainment, business, religion, international relations—all of it connected to a single master narrative about good versus evil, truth versus deception, patriots versus traitors. For people overwhelmed by the complexity and uncertainty of modern life, this kind of comprehensive explanation is psychologically irresistible.
Cranks also offer something else that institutional discourse struggles to provide: social belonging through exclusive knowledge. Being part of a conspiracy theory isn’t just about believing something—it’s about joining a community of people who know something that others don’t. It’s about having access to hidden truths that make you special, important, part of an elect group that sees through the lies everyone else accepts.
This creates what researchers call “echo chambers” in the strong sense: not just epistemic bubbles where people happen to hear the same opinions repeated, but sealed meaning-systems where questioning the fundamental framework becomes unthinkable. The crank theory doesn’t just explain events; it explains why other explanations are wrong, why institutions lie, why experts can’t be trusted. It’s self-reinforcing and immune to external correction.
Finally, cranks offer cognitive efficiency in an information-overloaded world. Instead of having to evaluate complex evidence, weigh competing claims, and live with uncertainty, you get simple answers that make everything clear. Instead of having to think about systemic problems that might require difficult solutions, you get specific villains who can be defeated. Instead of having to grapple with the tragic complexity of human nature, you get morally pure heroes fighting obvious evil.
The Attention Economics Problem
The rise of the cranks isn’t just about psychology—it’s about economics. Specifically, the economics of attention in digital environments where content creators compete for a finite resource: human focus.
In this environment, truth has a fundamental disadvantage. Truth is often complicated, uncertain, and emotionally unsatisfying. It requires nuance, context, and the intellectual humility to admit what we don’t know. Lies, by contrast, can be simple, certain, and emotionally compelling. They can tell people exactly what they want to hear, confirm their existing beliefs, and provide the psychological satisfaction of having enemies to hate and problems with clear solutions.
The algorithms that govern social media platforms aren’t neutral. They’re designed to maximize engagement—likes, shares, comments, time spent on platform. And engagement is maximized by content that provokes strong emotional reactions. Outrage performs better than nuance. Fear spreads faster than reassurance. Tribal solidarity generates more engagement than calls for understanding across difference.
This creates what I call the “entertainment exploit” in democracy’s structure. Democratic discourse is supposed to be about citizens reasoning together about complex problems, weighing evidence, and making collective decisions. But social media platforms have turned democratic discourse into entertainment, where the most emotionally stimulating content always wins.
Cranks are perfectly adapted to this environment. They’re not trying to inform or educate—they’re trying to entertain and engage. They understand, intuitively or explicitly, that in an attention economy, being interesting is more important than being right. They generate emotional energy that makes content spread, regardless of whether it’s true.
Consider how anti-vaccine content spreads on social media. The most viral posts aren’t carefully researched analyses of vaccine safety data—they’re emotional stories about children allegedly harmed by vaccines, dramatic testimonials from parents, and compelling videos of charismatic speakers making confident claims about government cover-ups. These posts generate massive engagement not because they’re accurate, but because they’re emotionally satisfying to people who already distrust institutions.
Meanwhile, public health officials posting actual vaccine safety data get a handful of likes and shares. Their content is accurate but boring, nuanced but emotionally flat, true but not entertaining. In the attention economy, they lose by default.
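To make the mechanism concrete, here is a minimal sketch of an engagement-ranked feed. Everything in it is invented for illustration; no platform publishes its ranking code, and real systems are vastly more complex. The point it demonstrates is structural: accuracy never appears in the objective function, so the ranker cannot prefer the true post even in principle.

```python
# A toy engagement-ranked feed. The fields, weights, and posts are all
# hypothetical; this illustrates the shape of the objective, not any
# platform's actual ranking system.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int
    accuracy: float  # 0.0-1.0; note that the ranker below never looks at this

def engagement_score(post: Post) -> float:
    # Shares and comments are weighted above likes because they propagate
    # content further. Truth value is simply not an input.
    return post.likes + 3 * post.comments + 5 * post.shares

posts = [
    Post("Careful thread on vaccine safety data", likes=40, shares=2, comments=5, accuracy=0.95),
    Post("SHOCKING cover-up THEY don't want you to see", likes=900, shares=400, comments=700, accuracy=0.05),
]

# Sort the feed exactly the way the objective says to.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>6.0f}  {post.text}")
```

Run it and the outrage post comes out on top every time, not because anyone decided it should, but because reactions are the only thing the score can see.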
The Mimetic Acceleration
The spread of crank theories follows patterns that have less to do with rational persuasion than with what researchers call social contagion—the way behaviors and beliefs spread through social networks like viruses.
People don’t typically adopt conspiracy theories because they’ve been presented with compelling evidence. They adopt them because people they know and trust have adopted them, because believing them provides social benefits, because rejecting them carries social costs. The content of the theory is often less important than its social function.
This is why fact-checking is largely ineffective against crank theories. When someone’s belief system is providing social belonging, emotional satisfaction, and cognitive coherence, presenting them with contradictory facts doesn’t address any of the underlying needs the belief system serves. In fact, it often backfires by reinforcing the persecution narrative that’s built into most crank theories.
QAnon followers didn’t abandon their beliefs when Q’s predictions failed to materialize. Election deniers didn’t change their minds when courts rejected their claims. Anti-vaccine activists don’t modify their positions when presented with safety data. These aren’t really beliefs about facts; they’re membership signals for social groups, sources of meaning and identity that transcend any particular factual claim.
Social media accelerates this process by creating artificial intimacy between content creators and their audiences. Parasocial relationships—one-sided emotional connections people form with media figures—make audiences more likely to adopt the beliefs of influencers they feel personally connected to. The algorithm then amplifies this effect by showing people more content from sources they’ve engaged with, creating feedback loops that strengthen these artificial relationships.
The result is a kind of mimetic acceleration—a rapid spread of beliefs and behaviors through social networks, driven not by their truth value but by their social utility and emotional appeal. Cranks aren’t just spreading misinformation—they’re creating new forms of social identity and belonging that are immune to traditional forms of correction.
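This dynamic is easy to see in a toy model. The sketch below implements a simple threshold model of social contagion, in the spirit of classic adoption models from the sociology literature; the network size, threshold, and seed count are arbitrary choices for illustration. Notice that the update rule contains no representation of whether the belief is true: adoption depends entirely on how many of your contacts have already adopted.

```python
# Minimal threshold model of belief contagion on a random social graph.
# All parameters (network size, degree, threshold, seeds) are invented
# for illustration.
import random

random.seed(7)

N = 200           # people in the network
K = 8             # random acquaintances per person
THRESHOLD = 0.25  # adopt once 25% of your contacts have adopted

# Build a random symmetric social graph.
neighbors = {i: set() for i in range(N)}
for i in range(N):
    for j in random.sample(range(N), K):
        if i != j:
            neighbors[i].add(j)
            neighbors[j].add(i)

adopted = set(random.sample(range(N), 5))  # a handful of initial believers

# Spread until no one else converts. The rule never consults the content
# of the belief, only the social pressure around each person.
changed = True
while changed:
    changed = False
    for person in range(N):
        if person in adopted or not neighbors[person]:
            continue
        if len(neighbors[person] & adopted) / len(neighbors[person]) >= THRESHOLD:
            adopted.add(person)
            changed = True

print(f"{len(adopted)} of {N} people adopted the belief from 5 seeds")
```

Depending on the threshold, a handful of seeds either fizzles out or cascades through the whole network; truth never enters the equation.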
Case Study: The QAnon Template
QAnon provides the perfect case study in how the Crank Singularity operates, because it shows how a completely fabricated narrative can achieve massive scale and influence by exploiting the structural vulnerabilities in our information environment.
QAnon began in October 2017 with anonymous posts on 4chan, a fringe message board known for pranks and hoaxes. The posts claimed to be from a high-level government insider with “Q clearance” who was revealing details about Donald Trump’s secret war against a global cabal of Satan-worshipping pedophiles. The posts were cryptic, full of questions and codes that followers could “decode” to reveal hidden meanings.
What made QAnon successful wasn’t the quality of its information—the predictions consistently failed to materialize—but the quality of its engagement. It gamified conspiracy thinking, turning followers into active participants who felt like they were solving puzzles and uncovering secrets. It provided a comprehensive explanatory framework for everything from politics to entertainment to current events. And it created a sense of community and purpose for people who felt alienated from mainstream society.
The movement spread primarily through social media platforms like Facebook, YouTube, and Twitter, where the algorithms amplified engaging content regardless of its accuracy. QAnon content generated enormous engagement—not because it was true, but because it was emotionally satisfying to people who felt confused and powerless about complex political and social changes.
By 2020, QAnon had evolved from an obscure internet phenomenon to a political movement that helped elect members of Congress. Polls showed that significant percentages of Americans believed at least some QAnon claims. The movement had become self-sustaining, generating its own media ecosystem, fundraising infrastructure, and political influence.
QAnon demonstrated how cranks could achieve unprecedented scale and influence by understanding and exploiting the attention economy. The movement’s leaders weren’t necessarily true believers—many appeared to be opportunistic grifters who recognized that conspiracy content generated audiences and revenue. But their followers were often sincere people looking for meaning, community, and understanding in a confusing world.
When QAnon finally lost momentum after January 6, 2021, its energy didn’t disappear—it migrated to other conspiracy theories and movements that followed similar patterns. The infrastructure QAnon built—the social networks, the content creation systems, the monetization strategies—was quickly repurposed for new crank theories about elections, vaccines, and other topics.
The Institutional Response Problem
Traditional institutions have struggled to respond effectively to the Crank Singularity because they’re fighting the wrong battle. Most anti-misinformation efforts focus on fact-checking and content moderation—trying to identify false claims and either correct them or remove them from platforms.
But this approach misunderstands the fundamental problem. Crank theories don’t spread because people are confused about facts—they spread because people are seeking meaning, community, and emotional satisfaction that mainstream discourse fails to provide. Fact-checking a conspiracy theory is like trying to cure loneliness with a dictionary.
Moreover, content moderation often backfires by reinforcing the persecution narratives that are built into most crank theories. When platforms remove conspiracy content, it confirms believers’ suspicions that powerful forces are trying to silence the truth. When fact-checkers debunk false claims, it validates the idea that institutional experts can’t be trusted.
The problem is structural, not informational. Social media platforms are optimized for engagement, not truth. Content creators are rewarded for generating strong emotional reactions, not for accuracy or nuance. Audiences are conditioned to expect entertainment value from their information consumption, not educational value.
Academic institutions, traditional media, and government agencies are fundamentally mismatched to this environment. They’re optimized for accuracy, credibility, and institutional legitimacy—values that don’t translate well to an attention economy where being boring is the worst sin you can commit.
Scientists publish peer-reviewed papers that take months to produce and are read by dozens of people. Cranks publish YouTube videos that take hours to produce and are watched by millions. Journalists write carefully sourced articles that present multiple perspectives. Cranks create content that tells people exactly what they want to hear. Government officials give measured statements that acknowledge uncertainty and complexity. Cranks offer simple explanations that make everything clear.
In the attention economy, the cranks win by default.
The Moral Dimension
The Crank Singularity isn’t just an epistemological problem—it’s a moral one. Because cranks don’t just spread false information—they exploit human psychology for personal gain, often targeting the most vulnerable people with promises they can’t keep and explanations that lead nowhere.
Consider the wellness-to-QAnon pipeline that researchers have documented: how people searching for health information get gradually drawn into increasingly extreme conspiracy theories. Vulnerable people dealing with health crises are offered simple explanations for their suffering and easy solutions for their problems. When the solutions don’t work, they’re told it’s because the conspiracy is deeper than they thought, that more extreme measures are necessary.
This isn’t just misinformation—it’s exploitation. It’s taking advantage of people’s genuine needs and legitimate frustrations to build audiences and generate revenue. The cranks who profit from this system—through book sales, speaking fees, supplement sales, and platform monetization—are often financially insulated from the consequences of the false hope they sell.
The moral dimension becomes even clearer when we look at how crank theories affect public health and democratic participation. Anti-vaccine content doesn’t just spread false information about vaccine safety—it contributes to disease outbreaks that harm children. Election denial doesn’t just spread false information about voting systems—it undermines faith in democratic institutions and leads to political violence.
The cranks who promote these theories often live in communities with high vaccination rates and respect election results when their preferred candidates win. They export risk to others while insulating themselves from consequences—a form of moral hazard that traditional market mechanisms don’t correct.
Rebuilding the Ballast
So what can be done? How do we restore some balance between signal and noise without creating new forms of censorship or institutional control that might be even more dangerous than the problem they’re trying to solve?
The answer isn’t to rebuild the old gatekeeping systems—that’s neither possible nor entirely desirable. Many of those systems were exclusionary, biased, and captured by narrow interests. The democratization of communication that social media enabled has genuine benefits that we don’t want to lose.
Instead, we need to build new forms of “epistemic ballast”—systems that can help people navigate information environments without relying on centralized control or institutional authority.
Media literacy education needs to focus less on teaching people to identify “fake news” and more on helping them understand how the attention economy works, how their psychology can be exploited, and how to recognize when content is designed to manipulate rather than inform.
Platform design needs to evolve beyond engagement maximization toward what researchers call “prosocial design”: systems that reward accuracy, nuance, and constructive dialogue rather than just emotional reaction and tribal reinforcement. A toy sketch of what such an objective might look like closes this section.
Cultural norms need to develop around information consumption that treat intellectual humility as a virtue rather than a weakness, that value the ability to say “I don’t know” and “I was wrong,” that reward people for changing their minds when presented with better evidence.
Institutional reform needs to focus on making expertise more accessible and engaging without sacrificing accuracy, finding ways to compete in the attention economy without abandoning commitment to truth.
Community building needs to provide the social belonging and meaning-making that conspiracy theories offer, but grounded in constructive rather than destructive activities—local engagement, skill-building, mutual aid, and other forms of community resilience.
Most importantly, we need to recognize that this is fundamentally a cultural problem that requires cultural solutions. You can’t regulate your way out of the Crank Singularity, and you can’t fact-check your way back to shared reality. The solution has to involve rebuilding the social infrastructure that helps communities distinguish between helpful and harmful forms of meaning-making.
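As promised above, here is a toy sketch of what a prosocial objective might look like, revisiting the feed ranker from earlier. The signals (corroborated corrections, sustained dialogue) and weights are entirely hypothetical, and measuring them honestly is the hard, unsolved part. The sketch only shows where such signals would have to live: inside the objective itself, not bolted on as after-the-fact moderation.

```python
# A hypothetical "prosocial" re-weighting of the toy feed ranker. The
# signals and weights are invented; the point is that truth-adjacent
# signals must enter the objective itself to change what rises.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    corrections: int         # e.g. community-note-style flags from diverse raters
    sustained_dialogue: int  # substantive back-and-forth replies, a rough proxy for deliberation

def prosocial_score(post: Post) -> float:
    raw_engagement = post.likes + 5 * post.shares
    # Reward evidence of deliberation and penalize corroborated inaccuracy,
    # rather than amplifying raw reaction volume alone.
    return raw_engagement + 10 * post.sustained_dialogue - 100 * post.corrections

posts = [
    Post("SHOCKING cover-up THEY don't want you to see",
         likes=900, shares=400, corrections=30, sustained_dialogue=2),
    Post("Careful thread on vaccine safety data",
         likes=40, shares=2, corrections=0, sustained_dialogue=25),
]

for post in sorted(posts, key=prosocial_score, reverse=True):
    print(f"{prosocial_score(post):>7.0f}  {post.text}")
```

Under these weights the careful thread outranks the outrage post; whether any platform could measure such signals without them being gamed is exactly the open design problem.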
The Stakes
The Crank Singularity represents more than just an information problem or a media problem—it’s a challenge to the fundamental assumptions that make democratic society possible. Democracy requires that citizens be capable of reasoning together about complex problems, evaluating evidence, and making collective decisions based on shared reality.
When cranks become the dominant voices in public discourse, when conspiracy theories spread faster than accurate information, when engagement metrics reward emotional manipulation over rational argument—the basic conditions for democratic deliberation break down.
The result isn’t just political polarization or policy disagreement—it’s the collapse of shared epistemic frameworks that make productive disagreement possible. When people can’t agree on basic facts about reality, they can’t engage in the kind of debate and compromise that democracy requires.
This is what makes the Crank Singularity so dangerous. It’s not just that people believe false things—it’s that the systems that once helped communities distinguish between true and false things have been overwhelmed by systems that reward falseness for being more entertaining than truth.
The cranks aren’t going away. They’re a permanent feature of human psychology and social organization. But their influence doesn’t have to be unlimited. The challenge is building information environments that can accommodate human nature—including its cranky aspects—without being dominated by its worst impulses.
The alternative is a future where truth doesn’t just lose the occasional battle—it loses the war entirely, not because falseness is stronger, but because falseness learned to be more entertaining. In an attention economy, that might be enough.
But it doesn’t have to be. The spotlight doesn’t always have to win. Truth can learn to put on a better show without abandoning its commitment to accuracy. The center can hold, but only if we rebuild the ballast that keeps it stable when the winds start blowing.
The wire still holds. The choice is ours. But the clock is ticking, and the cranks aren’t waiting for us to figure it out.
The revolution is recognizing that in an attention economy, being right isn’t enough—you also have to be interesting. The rebellion is rebuilding information systems that reward truth over engagement. The resistance is choosing signal over noise, even when noise is more entertaining.