Impacts Of A Me-ritocracy
The algorithm's Cold War against ancient virtues
(My mind seems stuck on puns lately.)
Me-ritocracy. Y’know – like “me” + meritocracy. Pretty good, eh?
Well, I think it might describe something real: a social order where power – i.e. opportunities, deals, social leverage – is disproportionately awarded to those who are the most visible.
If you look a layer deeper, that means this power often goes to those most willing and able to:
Self-aggrandize
Self-optimize for external standards (listed below)
Turn identity into a brand
This piece is my exploration of this idea and its implications.
And before I come off like an ultra-douche, I want to clarify that this isn’t aimed at individual people or actions – it’s more of a commentary on the incentive structure and its impacts.
The more things change…
Of course, it’s never really been a meritocracy – not in any society, anywhere, ever. I’m not sure a true ‘meritocracy’ can really exist.
What we had in the prior iteration was more of a credential-itocracy (sorry, I’m all out of good puns).
Because in a status-oriented society, we’re always looking for shortcuts for how to measure each other. As Mac from Always Sunny calls it, we’re constantly giving each other an Ocular Pat-Down™.
For a long time, we treated certain credentials and institutional affiliations as shorthand for intelligence and merit – imperfect measures that favor privileged economic backgrounds and ability to play the system.
So while status in a supposed ‘meritocracy’ has always been organized around imperfect shortcuts and proxies, these signals still have major implications for:
Our individual aspirations
Our sense of success (both culturally and as individuals)
Our cultural norms
The traits we subconsciously learn to…
→ Admire
→ Consider “good” (in some sort of vague, karmic way)
→ Associate with legitimacy and authority
→ Imitate
And the display of “merit” – whether legitimate or not – has shifted in many areas:
From:
Can you prove your competence through institutions?
To:
Does your online presence feel authoritative to me?
Our prior iteration of pretend meritocracy had plenty of flaws, but this one has its own set of tradeoffs.
What wins in a me-ritocracy?
Visibility.
And with the algorithm acting as the almighty gatekeeper between creators and that visibility, creators have no choice but to adapt their behavior to whatever the system rewards. Which is content that will:
Spread – as measured by engagement metrics
Keep people watching – as measured by watch time and retention
Over time, these systems have ‘learned’ what reliably moves human attention. Turns out, it’s the same ol’ psychological levers that have been used to influence mass psychology since forever:
“Us” vs. “Them” Framing
Out-group language from US news outlets and politicians is by far the strongest predictor of shares and retweets –
4.8x stronger than negative emotion.
6.7x stronger than moral-emotional language.
Appealing to Emotion Over Reason
One study found that adding a single moral-emotional word to a tweet increases its retweet rate by ~17%.
Using High-Arousal Emotions to Create Engagement
Posts that spark high-arousal emotions – anger, anxiety, awe – are shown to significantly increase sharing. Low-arousal emotions – like sadness – have the opposite effect.
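To make the incentive concrete, here's a deliberately oversimplified toy sketch of "optimizing for attention." Every name and weight below is made up for illustration; this is not any real platform's ranking code.

```python
# Toy illustration only: a hypothetical feed ranker that scores posts
# purely on predicted engagement ("spread") and predicted watch time
# ("keep people watching"). All fields and weights are invented;
# real ranking systems are vastly more complex.

def attention_score(post: dict) -> float:
    # "Spread": predicted shares and comments
    engagement = post["predicted_shares"] + post["predicted_comments"]
    # "Keep people watching": predicted watch time in seconds
    watch = post["predicted_watch_seconds"]
    return 2.0 * engagement + 0.1 * watch

posts = [
    {"id": "nuanced-essay", "predicted_shares": 3,
     "predicted_comments": 2, "predicted_watch_seconds": 400},
    {"id": "outrage-hot-take", "predicted_shares": 80,
     "predicted_comments": 50, "predicted_watch_seconds": 45},
]

# The feed surfaces whatever scores highest. Note that nothing in the
# objective measures accuracy, nuance, or honesty.
ranked = sorted(posts, key=attention_score, reverse=True)
print([p["id"] for p in ranked])  # → ['outrage-hot-take', 'nuanced-essay']
```

The point of the caricature: the objective function contains no term for truth or care, so content that pulls the levers above simply outscores content that doesn't.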
It’s not like every single creator consciously recognizes this and decides to tap into these levers to boost their own engagement. I think it’s a bit more sinister than that: the selection happens whether anyone intends it or not – the feed simply amplifies whoever happens to pull them.
The result is an environment that nudges people toward a specific set of behaviors if they want to be heard:
→ Having a lot of ‘hot takes’ (especially on current events).
→ Removing nuance in favor of peacocking confidence.
→ Framing issues as binary choices rather than complex systems.
→ Treating disagreement as opposition.
→ Binding personal identity to publicly defensible positions.
Ancient virtues vs. modern math
Over long periods of time, human civilizations have independently converged on a similar set of virtues that sustain trust, cooperation, and shared life:
Humility
Patience
Honesty
Care for others
Respect for limits
To name a few.
When we compare this list of widely recognized, cross-cultural virtues to the list of algorithmic preferences, it seems like there are major conflicts.
Instead of humility… the algorithm rewards self-assertion and overconfidence.
Instead of patience… the algorithm rewards immediacy.
Instead of respect for limits… the algorithm rewards exaggeration and provocation (i.e. trolling).
This tension makes more sense when you consider what each system is actually optimized for.
Moral traditions evolved slowly to get large groups of people to cooperate. Algorithms optimize a narrower goal – attention – pursued with ruthless efficiency.
There is no reason to expect these selection pressures to align. And increasingly, they don’t.
Are we creating the biggest egos ever?
As Michael Sandel argues in The Tyranny of Merit, systems that appear to reward talent and effort – meritocracies – have an unintended side effect:
Those who rise tend to internalize success as evidence of virtue and view the struggles of others as personal deficiency.
And when the meritocracy becomes a me-ritocracy, this becomes…
If I’m visible, I deserve to be seen.
If you’re not visible, you don’t.
This is a far more intimate and psychologically sticky form of status than credentials ever were. Institutions and gatekeepers could be blamed and silently resented, but algorithms are supposed to feel neutral, even democratic – the feed just “shows what people want.”
And over time, this reshapes how people interpret the outcomes:
My ideas spread because they’re good.
My engagement metrics grow because I’m right.
My audience listens because I’m worth listening to.
And conversely:
If your work doesn’t spread, maybe it wasn’t valuable.
If you’re ignored, maybe you don’t deserve relevance.
Integrity simply doesn’t scale…
In this environment, self-mythologizing becomes the most viable path to success.
It’s the slow, likely subconscious process of curating a “Public You” around what the algorithm rewards – learning:
Which parts of You get engagement.
Which parts of You travel best.
When the same traits are constantly rewarded, they reshape our shared sense of what’s good, serious, or worthy, even if it’s not explicit.
And over time, this is how you end up with a culture full of people who are:
Deeply certain
Loudly self-referential
Constantly explaining
Rarely revising
Not because they’re bad people.
It’s what the system selects for.