- What dark traits actually are
- The erosion of truth: how dark traits fuel global misinformation
- Why misinformation is rewarding to the people who spread it
- Why the rest of us help it spread
- The personality-tech feedback loop
- How to spot dark-trait dynamics in public discourse
- What protects us better than fact-checking alone
- Truth is not self-sustaining
A false claim goes viral, millions see it, and the usual explanation arrives right on cue: people are confused, underinformed, or simply bad at critical thinking. That story is comforting, but incomplete. The erosion of truth, and the way dark traits fuel global misinformation, is not just a story about ignorance. It is also about personality, motive, and the uncomfortable fact that some people do not spread misleading content by accident.
Psychology gives us a sharper lens here. Misinformation does not move through the world on logic alone. It rides on status games, tribal signaling, emotional reward, and in some cases, traits linked to manipulation, callousness, and grandiosity. If we want to cut through the myths and pseudo-science about why falsehoods spread, we need to stop treating every bad actor as a confused bystander.
What dark traits actually are
In psychology, the phrase “dark traits” usually points to tendencies associated with the so-called Dark Triad: narcissism, Machiavellianism, and psychopathy. Some researchers also include sadism, creating a “Dark Tetrad.” These are not casual insults or internet labels. They describe personality patterns linked to manipulation, low empathy, entitlement, strategic deceit, impulsivity, and sometimes enjoyment of others’ distress.
That does not mean everyone high in these traits becomes a propagandist. Personality is not destiny. But these tendencies can make misinformation especially attractive, because it often rewards the exact things dark traits chase: attention, influence, dominance, and disruption.
A narcissistic style may be drawn to certainty, superiority, and the public performance of being the one person who “sees through the lies.” A Machiavellian style may treat information as a tool, not a truth claim – useful if it shapes opinion, wins power, or confuses opponents. A psychopathic style may simply show less concern for the human cost of spreading falsehood if the act is rewarding or entertaining. Add everyday sadism, and the chaos itself can become part of the appeal.
The erosion of truth: how dark traits fuel global misinformation
The key mistake is assuming misinformation spreads because people value truth and just happen to get it wrong. Often, truth is not the goal. Social reward is.
Dark traits align with several features of modern media environments. First, platforms reward emotional intensity. Outrage, fear, and moral contempt travel faster than careful nuance. Second, visibility is cheap. A manipulative person no longer needs institutional power to shape attention. Third, ambiguity helps the skilled deceiver. If facts are muddy and trust is already low, strategic misinformation becomes easier to plant and harder to dislodge.
This matters globally because the same psychological machinery scales across borders. A rumor in one country, a conspiracy in another, and a disinformation campaign elsewhere may look culturally different, but they often exploit the same human vulnerabilities: identity, fear, grievance, and the craving for certainty. Dark traits do not create those vulnerabilities. They weaponize them.
Why misinformation is rewarding to the people who spread it
For some people, sharing false or misleading content is not mainly about belief. It is about payoff.
Attention is one payoff. Posting extreme claims can generate engagement, followers, and social relevance. For personalities that hunger for admiration or dominance, that reward can be powerful. Status is another. In polarized spaces, being the loudest voice can matter more than being the most accurate one.
Control is a third payoff. Machiavellian personalities, in particular, tend to think strategically about people. If information can be used to steer behavior, destabilize trust, or split groups into opposing camps, misinformation becomes a social weapon. Even when the claim itself is flimsy, the confusion it creates can be useful.
Then there is the pleasure of disruption. This is the motive people often underestimate. Some individuals are not trying to persuade everyone. They are trying to provoke, unsettle, or watch others react. In that case, the spread of misinformation can feel less like an argument and more like entertainment.
Why the rest of us help it spread
If dark traits help generate misinformation, ordinary psychology helps distribute it.
Most people do not share false content because it is cruel or manipulative. They share it because it feels emotionally true, socially validating, or urgent. Confirmation bias plays a role, but so do identity and belonging. A claim that supports your group, flatters your worldview, or explains a complex problem in one simple sentence is easier to accept than a messy, qualified reality.
This is where bad actors gain leverage. They do not need everyone to be deceptive. They only need enough people to be reactive. Once emotionally charged misinformation enters a network, motivated reasoning can do the rest.
There is also a trust problem. When institutions feel distant, elitist, inconsistent, or self-protective, people become more open to alternative explanations. Some of those explanations are healthy skepticism. Some are manipulative fiction dressed up as a hidden truth. The difference is not always obvious in fast-moving digital spaces.
The personality-tech feedback loop
Technology did not invent deception, but it changed the economics of it.
Platforms tend to reward content that triggers quick reactions. That creates a favorable environment for people with darker interpersonal strategies. If your style is impulsive, provocative, shameless, or highly tactical, online spaces can amplify your strengths. You can test narratives quickly, adapt to backlash, exploit outrage cycles, and keep moving before accountability catches up.
Algorithms are not secretly evil, but they are not built to prize psychological integrity either. They optimize for engagement. That means they can end up rewarding the very traits that distort public reality.
This does not mean every viral falsehood comes from a dark personality. Sometimes misinformation spreads through sincere misunderstanding, political panic, poor journalism, or ordinary cognitive bias. But when dark traits are present, the process becomes more deliberate. The falsehood is not just mistaken. It is used.
How to spot dark-trait dynamics in public discourse
You cannot diagnose strangers from a feed, and you should not try. Still, certain patterns are worth noticing.
One is strategic inconsistency. The person does not care whether claims fit together; they care whether each claim is useful in the moment. Another is contempt without accountability. They provoke, distort, and smear, then frame any challenge as censorship or weakness. A third is performative certainty. Complex issues are flattened into absolute claims because certainty is socially powerful, even when it is intellectually empty.
Look, too, for asymmetry. These communicators often demand proof from others while offering none themselves. They accuse critics of bias while openly manipulating emotion. They present themselves as brave truth-tellers, but their real skill is exploiting distrust.
That combination – confidence, grievance, and tactical deceit – is often more psychologically informative than the factual claim itself.
What protects us better than fact-checking alone
Fact-checking matters, but it is not enough. If misinformation is partly driven by motive and personality, then correction has to address the social and emotional machinery around the lie.
First, people need friction. Slowing down sharing behavior reduces impulsive amplification. Second, we need stronger norms around intellectual humility. Not the fake humility of “who can really know anything,” but the mature kind that tolerates complexity and uncertainty. Third, media literacy should include motive literacy. Ask not only “Is this true?” but also “What reward does this person get from me believing or sharing it?”
On a personal level, one of the most useful habits is noticing your own emotional temperature. If a post makes you feel instantly righteous, furious, or thrilled to expose someone, that is exactly when your judgment is easiest to steer.
For readers who want a more psychologically grounded way to make sense of modern behavior, that is the broader project behind platforms like The Psychology of Everything: not just checking claims, but understanding the human motives that make those claims spread.
Truth is not self-sustaining
One of the hardest lessons in all of this is that truth does not automatically win because it is true. It needs cultural support, psychological maturity, and systems that do not reward manipulation quite so generously.
The erosion of truth is not only a failure of knowledge. It is a failure of incentives, character, and social design. Some people are pulled toward misinformation because they are scared or misled. Others are drawn to it because it gives them status, leverage, or pleasure.
The more clearly we see that difference, the harder we become to manipulate. And that may be the real starting point – not blind trust, not cynical distrust, but a more disciplined understanding of human nature.