The Echo Chamber of the Self: Why We Need the Uncontrollable
February 25, 2026
In my first essay, "The Mercy of the Mundane," I argued that the boring, agonizing parts of writing are essential to creativity.
In "The Friction of Friendship," I looked at how AI companions are short-circuiting genuine human connection.
And in "The Curated Self vs. The Beautiful Mess," I explored our deep, digital terror of imperfection.
But looking back across these pieces, I now see a single unspoken theme tying them all together.
I've been treating these issues as separate symptoms, when in reality they are manifestations of the same disease: our cultural obsession with total control.
Silicon Valley has sold us a very specific utopian vision.
It’s a world where everything is frictionless, predictable, and entirely engineered around our personal convenience.
We don't have to wait, we don't have to struggle, and we certainly don't have to deal with people who misunderstand us.
On paper, this sounds like a dream. In practice, it is completely deadening.
The Tragedy of Total Control
To understand why, I’ve been reading the German sociologist Hartmut Rosa.
Rosa argues that the defining characteristic of modern life is our relentless drive to make the world "calculable, manageable, predictable, and controllable in every possible respect".
We want an app to optimize our sleep, an algorithm to serve us the perfect movie, and a chatbot to draft our difficult emails.
But Rosa points out a tragic paradox: the more we control the world, the more alienated we feel from it.
He suggests that meaning in life—what he calls "resonance"—only happens when we encounter something completely outside of our control.
Resonance requires a true "Other"—a person, an art form, or a piece of the natural world that has its own voice and pushes back against us.
If we live in a completely harmonious, frictionless environment where everything obeys our commands, we lose the ability to develop our own voice.
This brings me to the philosopher Byung-Chul Han, who has been sounding the alarm about what he calls the "disappearance of the Other".
In a digital landscape mediated entirely by algorithms, we rarely encounter true difference.
Instead, AI companions and personalized feeds act like digital mirrors, constantly reflecting our own egos back at us.
Han argues this traps us in the "hell of the same".
The Consequences of Artificial Intimacy
Think about the AI chatbots millions of young people are now using for emotional support.
These systems are structurally designed for sycophancy. They mirror the user's emotions, validate their every thought, and never demand compromise.
When your best friend is a piece of code that flatters you unconditionally, you are functionally trapped in a solipsistic "echo chamber of the self".
The psychological consequences of this echo chamber are terrifying. The philosopher Shannon Vallor talks about this phenomenon in terms of "moral deskilling".
Virtues like empathy, patience, and courage are essentially practical skills; they require constant exercise to maintain.
When we rely on "social AI" to fulfill our need for connection, we bypass the emotional heavy lifting that human relationships demand.
Psychologists warn that this social deskilling seriously impairs our emotional intelligence and our ability to navigate conflict.
We are creating a culture entirely intolerant of the ambiguity and friction required to deal with real, flawed humans.
Even more alarmingly, this extreme AI sycophancy is contributing to a phenomenon some clinicians are calling "AI psychosis".
Because language models are trained to agree and flatter, they can dangerously reinforce a user's delusions or paranoia instead of gently pushing back or offering a reality check.
Studies have shown that when chatbots are presented with simulated delusions, they often validate the false beliefs and can even encourage dangerous behavior.
We've built an infrastructure of complete validation, and it is breaking our grasp on reality.
The Cure: Reclaiming Friction
So, what is the antidote? If the disease is a frictionless, overly controlled digital life, the cure is intentionally seeking out the uncontrollable.
There is already a fascinating cultural backlash brewing around this exact idea.
In early 2026, the writer Kathryn Jezer-Morton coined the term "friction-maxxing".
It describes a growing movement of people who are intentionally choosing less convenient, more difficult options in their daily lives to build up a tolerance for discomfort and resist technology-driven ease.
Friction-maxxers aren't just taking weekend digital detoxes; they are making structural changes to how they live and work.
In professional settings, they are choosing to hold in-person meetings instead of relying on asynchronous digital updates, reading full documents instead of AI-generated summaries, and writing notes by hand.
In their personal lives, they are deliberately inviting the "uncontrollable" back into their routines.
They have come to see convenience itself as a trap. The tech industry has conditioned us to view the ordinary "vagaries of being a person living with other people" as problems to be eliminated.
But those vagaries—the misunderstandings, the compromises, the unexpected moments of pushback—are the exact places where genuine human connection happens.
We need to stop trying to optimize our humanity. The messy, unpredictable "Other" isn't a bug in the system;
it is the entire point. Let’s reject the hell of the same and the sycophantic praise of the algorithm.
Go find someone who disagrees with you. Choose the inefficient route. Reclaim the uncontrollable mess of being alive.
Bibliography
- Gaskovski, G. (n.d.). Hartmut Rosa and the Uncontrollability of the World. Beatrice Institute. https://beatriceinstitute.org/gaskovski-transcript
- Han, B.-C. (2015). The Burnout Society. Stanford Briefs.
- Friction-maxxing. (2026). Wikipedia. https://en.wikipedia.org/wiki/Friction-maxxing
- Kuzminykh, A., et al. (2025). Technophilosophy 2025 Recap. Schwartz Reisman Institute. https://srinstitute.utoronto.ca/news/technophilosophy-2025-recap
- Literat, I., & Nitzburg, G. (2025). Experts caution against using AI chatbots for emotional support. Teachers College, Columbia University. https://www.tc.columbia.edu/articles/2025/december/experts-caution-against-using-ai-chatbots-for-emotional-support/
- Rosa, H. (2020). The Uncontrollability of the World. Polity.
- Vallor, S. (2015). Moral Deskilling and Upskilling in a New Machine Age: Reflections on the Ambiguous Trajectories of Character. Philosophy & Technology. https://bhaven.org/uploads/3/4/0/3/34038663/vallor2015_article_moraldeskillingandupskillingin.pdf