When Algorithms Shape Our Reality
Anxiety leaves its imprint not only in the body but also in how we experience the digital world. For many, offline triggers push them online in search of belonging. But instead of connection, they meet a system designed for engagement, not care.
At first, algorithms know nothing. In the “cold start,” content is random and irrelevant. Soon, patterns emerge—dopamine-rich posts, flawless selfies, manufactured lifestyles without disclaimers. The algorithm learns quickly: novelty, outrage, comparison. Each interaction shapes what comes next.
What it does not understand is pain. Every dopamine high is followed by depletion. Each comparison deepens loneliness. Over time, the same algorithm that promised connection leaves users more anxious, more isolated, and more fragile. This is not malfunction—it is design.
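The learning loop described above can be hedged as a minimal sketch: an epsilon-greedy bandit that starts out knowing nothing (the cold start), explores at random, and gradually locks onto whatever category earns the most engagement. The category names and engagement rates here are invented for illustration and do not describe any real platform's system.

```python
import random

random.seed(42)

# Hypothetical content categories, each with the average chance a user engages.
ENGAGEMENT_RATES = {"outrage": 0.6, "comparison": 0.5, "news": 0.3, "art": 0.2}

def pick(estimates, epsilon):
    """Cold start: explore at random; later, exploit the learned favorite."""
    if random.random() < epsilon:
        return random.choice(list(estimates))   # explore: random, irrelevant
    return max(estimates, key=estimates.get)    # exploit: the learned pattern

estimates = {c: 0.0 for c in ENGAGEMENT_RATES}  # the algorithm knows nothing
counts = {c: 0 for c in ENGAGEMENT_RATES}

for step in range(5000):
    epsilon = max(0.05, 1.0 - step / 1000)      # exploration decays over time
    choice = pick(estimates, epsilon)
    engaged = random.random() < ENGAGEMENT_RATES[choice]
    counts[choice] += 1
    # Incremental average: each interaction shapes what comes next.
    estimates[choice] += ((1.0 if engaged else 0.0) - estimates[choice]) / counts[choice]

print(max(estimates, key=estimates.get))
```

Note what the reward signal is: engagement, nothing else. Wellbeing never appears in the loop, so the system converges on whatever keeps people tapping, exactly the design choice the essay describes.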
Why Algorithms Outpace Us
At their best, algorithms make life easier. They help us find the fastest route home, recommend music we might love, and filter vast oceans of information into something we can navigate. In healthcare, they can even detect illness earlier than the human eye. When designed responsibly, algorithms extend our abilities, giving us back time, energy, and opportunities for connection.
The human brain also runs on algorithms. Mental shortcuts evolved to save energy, predicting outcomes with speed. These heuristics once served survival. Today, they are easily manipulated by synthetic systems more sophisticated than our biology.
As Yuval Noah Harari observed, cyberspace decisions—about sovereignty, privacy, and security—were never made democratically. Instead, choices were made by designers and companies outside public oversight. Algorithms now shape economies, politics, and daily life faster than our ability to regulate them.
The roots of illusive design lie in capitalism’s side effects. Leaders promise growth, and designers are tasked with delivering it, even when the promises are impossible. Decisions are won by money, not usability. The result is a race to the bottom: companies copy one another’s features while users, outnumbered, play a game they cannot win, and no equilibrium that serves everyone is ever reached.
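The copying race has the structure of a familiar game-theory trap, sketched here with purely illustrative payoffs: whatever the rival does, copying the engagement feature is each platform’s best response, so both end up copying even though both (and their users) would fare better under mutual restraint.

```python
from itertools import product

ACTIONS = ("restrain", "copy")

# PAYOFF[(a, b)] = (platform A's payoff, platform B's payoff); numbers invented.
PAYOFF = {
    ("restrain", "restrain"): (3, 3),  # both favor wellbeing, both keep users
    ("restrain", "copy"):     (0, 4),  # the restrainer loses users to the copier
    ("copy", "restrain"):     (4, 0),
    ("copy", "copy"):         (1, 1),  # engagement race: everyone is worse off
}

def best_response(opponent_action, player):
    """The action maximizing this player's payoff given the rival's move."""
    if player == "A":
        return max(ACTIONS, key=lambda a: PAYOFF[(a, opponent_action)][0])
    return max(ACTIONS, key=lambda b: PAYOFF[(opponent_action, b)][1])

# A profile is a Nash equilibrium when each action is a best response to the other.
equilibria = [
    (a, b) for a, b in product(ACTIONS, ACTIONS)
    if a == best_response(b, "A") and b == best_response(a, "B")
]
print(equilibria)  # only mutual copying survives, despite its low payoffs
```

Under these assumed payoffs, mutual restraint is better for everyone yet unstable, while mutual copying is stable and bad, which is why individual goodwill alone rarely breaks the cycle.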
Instagram chases TikTok. TikTok chases retention. In that race, creators such as artists and photographers lose. Their posts are buried beneath brand ads, their reach throttled by algorithms that prioritize engagement over expression. What was once a space for creativity and discovery has become a cycle of imitation, where adoption and retention are valued above wellbeing.
The Consequences of Illusion
Algorithms are powerful tools. But when left unregulated, they promote impulsivity, conformity, and comparison. They amplify outrage, reinforce bias, and fragment societies into filter bubbles. The like button, the curated reel, the beauty filter—none were designed with long-term wellbeing in mind. Yet they now define how people—especially the younger generations—experience reality.
These are not neutral choices. Gambling experts help optimize addiction. Highlight reels intensify comparison. Platforms, more powerful than governments in their influence, conduct mass-scale social experiments without accountability. As designers, many of us entered this field to make life more usable and inclusive. But without ethical boundaries, algorithms erode connection instead of nurturing it.
When technology operates in secrecy, it asks for faith without accountability, and that is never a fair exchange. The choices of tech leaders themselves reveal this tension. Steve Jobs famously limited his own children’s exposure to the very devices he helped create. That decision wasn’t hypocrisy, but a signal. Even he recognized that something within these interactions was more powerful, and perhaps more harmful, than it appeared on the surface.
Choosing a Cure
Change begins with awareness. Just as algorithms learn from us, we can learn from them, pausing to notice how each interaction shapes mood, thought, and behavior. Recognizing hijacked attention and questioning distorted beliefs is itself an act of resistance.
For many, that means stepping back. If offline avoidance led to online escape, awareness can lead to a healthier withdrawal. Our natural algorithms, formed through evolution, cannot compete with synthetic ones optimized for profit. But we can redesign the systems around them.
Toward Ethical Algorithms
Compassion must be central. When people are treated as data points, dehumanization follows. Nowhere is this clearer than in hiring systems. Applicants spend hours crafting résumés and cover letters only to be ghosted. Roles are posted, closed, or repurposed without explanation. For those with expiring visas, silence means life-altering uncertainty.
Ghosting is not just poor communication. It is a form of illusive design, keeping people in limbo without closure, elevating anxiety, and disregarding dignity. Treating users as expendable statistics reflects the same mindset that drives exploitative algorithms. Both are forms of psychopathy, normalized in the name of efficiency.
But efficiency does not have to mean cruelty. Transparency, feedback, and closure are simple design choices that honor human worth. They remind us that every interaction is with a person, not a number.
Designing for Dignity
Algorithms need not be illusions. With ethical guardrails, they can help us connect, learn, and heal. Instead of optimizing for addiction, we can optimize for trust. Instead of amplifying comparison, we can amplify resilience. Instead of leaving people in uncertainty, we can design for closure and care.
Algorithms already influence more of our lives than governments. But their power can be redirected. If design can drive anxiety, it can also restore balance. If it can isolate, it can also reconnect.
The future is not about abandoning algorithms. It is about reclaiming them, transforming them from artificial psychopaths into tools for human flourishing. That is the responsibility of ethical design. And that is the opportunity before us.
Further Reading
- The Worst Argument That Social-Media Companies Use to Defend Themselves — After Babel (Jonathan Haidt)
- Homo Deus: A Brief History of Tomorrow — Yuval Noah Harari
- Don’t Make Me Think, Revisited — Steve Krug
- The Design of Everyday Things — Don Norman
- Opportunities and Challenges in Entrepreneurship — UC Berkeley (YouTube)
- Sama: I feel so bad about advice I gave while running YC I might delete my blog — Hacker News
- Don’t Make the Investor Your Customer — YouTube
- Your Undivided Attention: Down the Rabbit Hole by Design — Tristan Harris & Guillaume Chaslot (Apple Podcasts)
- Live: Facebook Whistleblower Testifies at Senate Hearing — NBC News (YouTube)
- FTC should stop OpenAI from launching new GPT models, says AI policy group — The Verge
- The Cold Start Problem — Andrew Chen
- Unicorn social app IRL to shut down after admitting 95% of its users were fake — TechCrunch
- The Righteous Mind: Why Good People Are Divided by Politics and Religion — Jonathan Haidt
- Deep Work: Rules for Focused Success in a Distracted World — Cal Newport
- Man and His Symbols — Carl Jung
- The Jordan B. Peterson Podcast: Sex and Dating Apps | Rob Henderson — Apple Podcasts
- Dr. David Buss: How Humans Select & Keep Romantic Partners — Huberman Lab (YouTube)