Table of Contents
- The Seduction of the Echo Chamber
- The Collapse of Disagreement
- The Difference Between Echo and Witness
- The Skill of Being in Community
- What We’re Losing
- What We Need Instead
- Remembering Our Humanity
Recently, I came across a podcast on AI and mental health that really spoke to some of the trends we’ve been seeing in our work, and to a weariness about what is happening to the state of community and beingness. The conversation lingered with me for days, not because it introduced anything that we weren’t aware of already, but because it named something many of us have been feeling but struggling to articulate: a quiet erosion of the connective tissue that holds us together as people.
More so than ever, we’ve been hearing about how folx are replacing connection and introspection with AI. Teenagers using AI to consult on their deepest insecurities and feelings of not-enoughness; adults using AI as a sounding board for relational conflicts, trying to make sense of their pain and the senselessness of the world around them. It’s not just a trend we’re observing from a distance; it’s showing up in our therapy rooms, in our supervision sessions, in conversations with friends and colleagues who work in schools, community organizations, and healthcare settings.
The stories are remarkably similar: someone facing a difficult conversation with a partner turns to ChatGPT instead of a trusted friend. A young person navigating questions about their identity asks an AI chatbot instead of reaching out to their queer community. Someone experiencing grief or confusion about their place in the world seeks answers from an algorithm designed to predict the most statistically likely response, rather than sitting with the uncomfortable, generative messiness of human connection.
AI and Mental Health: The Seduction of the Echo Chamber
Don’t get me wrong, I think AI can be useful and give helpful insights. There are moments when a well-designed tool can offer information, spark creativity, or help us organize our thoughts. AI can be a starting point for research, a way to draft ideas, or even a mirror that reflects back what we’re trying to say in a clearer form.
But there have also been harrowing experiences of people being pushed further into worldviews that continue to harm. We’ve seen how recommendation algorithms on social media have funneled people into increasingly extreme positions, how predictive text can reinforce biases, how AI trained on existing data perpetuates the same oppressive patterns embedded in that data. When we turn to AI for emotional support or guidance on complex relational and existential questions, we’re not just getting neutral information—we’re getting reflections of existing power structures, colonial frameworks, and the individualistic narratives that dominate the datasets these tools are trained on.
I’m not saying that we should stop using AI altogether; as always, there is nuance and discussion and choices to be made with all that it entails. But it does make me wonder about how this tool, built to be a predictive echo chamber, capitalizes and layers onto the isolation and the divisiveness we’re experiencing at large.
Think about it: AI doesn’t challenge you. It doesn’t push back. It doesn’t bring its own lived experience, its own cultural context, its own messy humanity into the conversation. It gives you what it predicts you want to hear based on patterns in data. And while that might feel validating in the moment, it’s a fundamentally different experience than being truly seen by another person. Turning to AI for mental health doesn’t lead to growth; it leads to rabbit holes.
The Collapse of Disagreement
And at the root of it, here’s what I keep coming back to: Are we all so ingrained in the individualistic, colonial hierarchies that convince us we’re not enough that any form of disagreement or perceived rejection feels like the air we need to live is being taken from us?
We’re living in a time of profound violence around difference. Cancel culture, call-out culture, the constant threat of being misunderstood or rejected—these dynamics have created an environment where many of us are walking on eggshells, terrified of saying the wrong thing, of being seen as problematic, of losing our place in community. And that fear is not unfounded. We’ve all witnessed how quickly someone can be cast out, how permanently relationships can fracture over a single misstep or misunderstanding.
Is the threat of polarization and being ousted from community so strong that squashing any critique or questioning of ourselves and the way we see the world becomes a matter of survival? Is this why combining AI and mental health feels like a safer way to express yourself, to be “seen”, to be “witnessed”?
When we turn to AI instead of people, we’re protecting ourselves from the vulnerability of being truly known. We’re avoiding the risk of rejection, the discomfort of being challenged, the possibility that someone might see us differently than we see ourselves. And I get it. I really do. The stakes feel so high right now. Community feels so fragile. Many of us, especially those of us who are BIPOC, queer, disabled, or otherwise marginalized, have experienced real harm in spaces that claimed to be safe. We’ve been tokenized, gaslit, excluded, and betrayed. Of course we’re cautious. Of course we’re tired.
But when we replace human connection with AI interaction, we’re not actually protecting ourselves. We’re just postponing the inevitable work of learning how to be in relationship with each other—with all the messiness, conflict, repair, and growth that entails.
The Difference Between Echo and Witness
When we think about the issue with AI and mental health, we have to look at the difference between an echo and a witness: a whole, separate person, with their own histories, thoughts, and imperfections, seeing us in our wholeness, in our histories, in our thoughts and imperfections, versus a program designed to echo back what we input.
When someone believes in you more than you do, that’s not just information—it’s relational nourishment. When someone offers a different perspective you’ve never thought of before, that’s not just data—it’s the expansion that comes from genuine encounter with otherness. When someone co-regulates with you in the depths of the pain you’re experiencing, that’s not just validation—it’s the embodied, nervous-system-level experience of not being alone in your suffering.
That’s connective. That’s what our bodies and spirits are wired for. That’s what actually heals.
When we make room for co-creation amidst different viewpoints, when we’re able to fight better, to do conflict well, with respect, dignity, and humanity at the center, that’s growth. That’s the kind of transformation that doesn’t happen in isolation or in conversation with an algorithm. It happens in the friction and the tenderness of real relationship.
I think about the clients I’ve worked with, the friends, the colleagues, the partners, who’ve experienced profound shifts not because I said something particularly brilliant, but because they felt truly seen in a moment of vulnerability. I think about the supervision sessions where a supervisee and I have sat with discomfort together, working through a rupture or a difference in perspective, and come out the other side with deeper trust and understanding. I think about the community organizing work I’ve been part of, where we’ve had to navigate conflict, power dynamics, and differing visions—and how that messy, difficult work has been some of the most meaningful of my life.
None of that could have happened with AI. None of that could have happened if we had normalized and systematized AI as a stand-in for mental health care.
The Skill of Being in Community
And all of that takes skill. Being part of community is a skill that’s learned and continues to be learned.
This is something we don’t talk about enough. We act as though connection should be natural, as though we should all just inherently know how to be in healthy relationship with each other. But the truth is, many of us were never taught these skills. We grew up in families and systems that modeled competition over collaboration, individualism over interdependence, conflict avoidance over conflict as transformation.
We live in a culture that tells us we should be self-sufficient, that needing others is weakness, that vulnerability is dangerous. We live under capitalism, which profits from our isolation and sells us individual solutions to collective problems. We live in the aftermath of colonization, which systematically destroyed Indigenous and cultural practices of community care, collective decision-making, and intergenerational knowledge-sharing.
So no, it’s not surprising that we struggle with community. It’s not surprising that we don’t know how to navigate conflict, how to repair ruptures, how to hold space for difference, how to ask for help, how to offer support without fixing, how to show up consistently, how to be accountable when we cause harm.
These are skills. And like any skill, they require practice, feedback, patience, and a willingness to fail and try again.
Are we, as a society, prioritizing the building of this skill? Of beingness? Of connectiveness? Of cultivating generativity between us?
I don’t think we are. At least not in any widespread, systemic way.
Instead, we’re moving more and more towards the quick fix of automation, the fast food of community that satiates our hunger for connection and witnessing but doesn’t actually nourish us. We’re outsourcing emotional labor to chatbots. We’re replacing therapy with apps. We’re substituting the slow, difficult work of building trust and intimacy with the instant gratification of AI-generated responses that tell us what we want to hear. That’s what combining AI and mental health is.
And I get why it’s appealing. It’s faster. It’s easier. It’s less risky. It doesn’t require us to be vulnerable or to navigate the complexity of another person’s needs and boundaries. It doesn’t ask us to grow or change or sit with discomfort.
But it also doesn’t feed us. Not really. Not in the ways we actually need.
What We’re Losing
When we replace human connection with AI interaction, we lose something so essential. We lose the experience of being surprised by someone else’s perspective. We lose the opportunity to practice empathy, to stretch our capacity to hold complexity, to learn how to be with someone in their pain without trying to fix it.
We lose the chance to be challenged in ways that help us grow. We lose the accountability that comes from being in relationship with people who know us, who can call us in when we’re off track, who can reflect back to us when our behavior doesn’t match our values.
We lose the co-regulation that happens when we’re physically present with another person—the way our nervous systems sync up, the way another person’s calm can help settle our anxiety, the way shared laughter or tears can shift our physiology in ways that no amount of typing into a chatbot can replicate.
We lose the sense of belonging that comes from being part of something larger than ourselves. We lose the experience of interdependence, of knowing that we matter to others and that others matter to us, of being woven into a web of relationships that hold us even when we’re struggling.
We lose the practice of conflict and repair, which is where some of the deepest intimacy and trust is built. We lose the opportunity to learn that relationships can survive disagreement, that ruptures can be mended, that we can cause harm and still be loved enough to be held accountable as part of community, that we can be hurt and still choose to stay.
And perhaps most importantly, we lose the chance to build the kind of communities we desperately need—communities that can hold us through crisis, that can organize for collective liberation, that can model alternatives to the oppressive systems we’re trying to dismantle.
What We Need Instead
I don’t have a list of quick mental health tips for the rise of AI in mental health care. I don’t, nor do I want to, have a prescription for what I think your relationship with AI should be.
What I do have is an invitation. A nudge. A call.
I want to nudge you to think about whether there’s a part of you yearning for connection right now; whether there’s a part of you that feels called to check in on your people.
Maybe it’s the friend you’ve been meaning to text for weeks. Maybe it’s the family member you’ve been avoiding because the last conversation was awkward. Maybe it’s the community group you stopped attending because you felt overwhelmed or out of place. Maybe it’s the therapist or support group you’ve been considering but haven’t reached out to yet.
Maybe it’s just sitting with yourself and noticing: When do I turn to AI instead of people? What am I protecting myself from? What am I afraid might happen if I reach out? What do I actually need that AI can’t give me?
And maybe, if you’re in a place to do so, it’s also about thinking about how you can show up for others. Who in your life might be feeling isolated right now? Who might be turning to AI because they don’t feel like they have anyone else to turn to? How can you create spaces—even small ones—where people feel safe to be vulnerable, to disagree, to mess up and repair, to practice the skills of being in community together?
This isn’t about perfectionism. It’s not about having it all figured out. It’s not about never using AI or always being available to everyone. It’s about remembering that we need each other. That we’re wired for connection. That the work of building and sustaining community is some of the most important work we can do.
Remembering Our Humanity
Let’s not forget our humanity, our interconnectedness, our rooted beingness. Let’s not forget each other.
In a world that is increasingly automated, algorithmically driven, and designed to keep us isolated and consuming, choosing connection is a radical act. Choosing to show up for each other, to practice the messy skills of relationship, to build communities that can hold complexity and conflict—this is resistance. This is how we survive. This is how we create the world we want to live in.
AI isn’t going anywhere. It’s going to keep developing, keep being integrated into more aspects of our lives, keep offering us convenient alternatives to the hard work of being human together. And we’re going to have to keep making choices about how we engage with it, what we use it for, and what we refuse to outsource to it.
Let’s not forget what’s at stake. Let’s not forget that there are some things—connection, witness, co-regulation, community, belonging—that can’t be automated. That shouldn’t be automated. That we lose something essential when we try.
So this is my call: Reach out. Check in. Show up. Practice. Fail. Repair. Try again. Build the communities we need. Remember each other. Remember ourselves.
Because at the end of the day, we’re all we’ve got. And that’s not a limitation—it’s a gift.
If you’re feeling the call to deepen your connections and build the skills of being in community, we’re here. At Venturous Counselling, we offer individual and relationship therapy that centers justice, healing, and the messy, beautiful work of being human together. We also provide clinical supervision and consulting for practitioners who want to build practices rooted in community care and collective liberation. Book a free 15-minute consultation to explore how we might support you.
This post was written on the unceded territories of the Musqueam, Tsleil-Waututh, and Squamish peoples. We acknowledge the ongoing impacts of colonization and commit to supporting Indigenous sovereignty and self-determination in our work.