Using AI for emotional and interpersonal matters: Not just for now — from now on
By: Robert Avsec, Battalion Chief (Ret.) and FSPA Operations Chief
This piece was posted on LinkedIn by Elie Losleben, MPH. It's a very thought-provoking take on AI and mental health services. Read it below.
We now have clear evidence that ChatGPT has encouraged people's psychotic breaks, and that some of those people have died by suicide (Futurism and The New York Times stories are linked in the comments). What we know is likely just a small fraction of what's actually happening when people turn to AI for support.
I need to warn you against using AI for emotional and interpersonal matters. Not just for now — from now on.
“But these models will only improve,” you might say.
“Yes,” I say. “That is the problem.”
This isn’t just my cope against a tide of AI companies seeking to automate mental health care.
We can’t understand what these systems are or what’s behind them. We certainly can’t trust the companies that run them for profit to prioritize our wellbeing.
They’re already busy designing for a “post-human” future.
The So-Called “Inevitability” of Technological Ascension
It might be helpful to bring you up to speed on Accelerationism: the belief, held by many AI leaders, that we should accelerate technological progress toward a post-human future driven by AI and machine systems.
This is real life, not science fiction.
These industry leaders fully expect AI to replace humans as Earth’s dominant species. And soon (Forbes reporting linked in the comments).
You can dive deeper into their worldview if you want. My point is that we shouldn't trust these people with what makes us human: our inner worlds or our relationships with each other.
Many AI companies pitch mental health care as a supply-demand problem. “There's not enough affordable, accessible mental health care to meet our needs,” they say, “and our machines can fix it.”
They fail to mention that the escalating epidemic of anxiety and depression is not an individual problem to fix, but a societal one. And it is driven, in large part, by the increasing amount of time we spend online instead of with each other.
More time online, with our machines, is not going to create a different outcome.
We don’t meet the growing need for emotional support by turning to AI. We meet it by creating more fulfilling relationships – everywhere from our families and friends to our workplace and communities.
We heal our anxiety and depression the old-fashioned way (one backed by a solid body of research): through human-to-human connection.
I’m not just saying this because AI is coming for my work, attempting to replace the power of human-to-human connections in how our brains and bodies heal.
There’s something in what I offer that AI cannot — a resonance that happens when I sit with someone that goes beyond healing modalities, content, and even language.
I’m arguing that giving AI access to your innermost thoughts and personal life is dangerous — for you, for your relationships, and for your agency in life.
The full article with outbound links is here: Three Reasons to Avoid AI for Emotional Support (and Mental Health Care)