With the advent of affordable, widespread AI tech, we’re entering a different world. And the most common question I see coming up is “can AI replace _______?” Fill in the blank with whatever profession the person is concerned about. Here’s an actual example from a recent conversation I had:
“Can AI replace therapists?”
It’s easy to answer – “no.” It’s not even a question in my mind. Going to a therapist involves establishing a relationship with a person, some degree of trust/rapport being developed, and a dozen other intangibles that AI isn’t currently capable of. If what you need is a fully-qualified therapist, AI isn’t going to solve your problem.
But let’s ask a couple of more useful questions:
- Can AI solve some of the problems for which ‘going to a therapist’ is the conventional solution?
- Can AI make some of the benefits of ‘going to a therapist’ more accessible to people?
The answer to each of those questions is “absolutely.”
Consider that currently in the United States, getting in to see a therapist – especially if you’re juggling insurance and have to find one “in network” – can mean being placed on a six-month waiting list. That’s a substantial delay in a category of care where promptness can sometimes literally mean the difference between life and death. Now add the fact that seeing a qualified therapist – even with insurance – can cost hundreds of dollars per month, which puts it out of reach for many of the people who need it.
And those wait lists and costs create a potential opportunity. While an AI can’t develop meaningful trust/rapport, or act as a fungible substitute for an in-person relationship, it can definitely handle one of the basic components of therapy: listening and asking questions.
So if a patient comes in complaining about an overbearing mother, the AI won’t be able to convincingly empathize. It can’t believably say “I understand; I would feel that way too.” But it can say, “you seem angry about your mother’s behavior. What about your mother doing that makes you angry?” It can then prompt the patient to thoroughly describe how they feel, ask follow-up questions based on what the patient says, and encourage further introspection. It can also follow up by recommending strategies for conflict avoidance, de-escalation, etc. And that may be everything a given patient needs at the moment.
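To make the shape of that interaction concrete, here is a toy sketch of the “listen and ask questions” loop. It is deliberately rule-based – a real system would use a language model rather than a keyword table, and the feeling words and follow-up questions below are purely illustrative placeholders – but the conversational pattern (reflect the emotion, then prompt introspection) is the same.

```python
import re

# Hypothetical mapping of feeling words to reflective follow-up questions.
REFLECTIONS = {
    "angry": "You seem angry about that. What about it makes you angry?",
    "sad": "That sounds painful. Can you describe what the sadness feels like?",
    "anxious": "It sounds like this worries you. When do you notice it most?",
}

def reflect(statement: str) -> str:
    """Return a follow-up question that prompts more introspection."""
    lowered = statement.lower()
    for feeling, question in REFLECTIONS.items():
        # Match the feeling word on word boundaries, not as a substring.
        if re.search(rf"\b{feeling}\b", lowered):
            return question
    # Default: an open-ended prompt rather than advice or judgment.
    return "Tell me more about that. How did it make you feel?"

print(reflect("I'm so angry at my mother for interfering again."))
```

The key design choice is that the system never asserts empathy it can’t have (“I would feel that way too”); every branch ends in a question that hands the conversation back to the patient.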
This means that with modern AI transformer models there’s a very real possibility that an AI – with appropriate safeguards – could talk people through many basic issues. This would be especially true if the language model were trained on a corpus of psychology texts, writings by therapists, anonymized transcripts of therapy sessions, etc.
Rather than being about AI replacing therapists, then, the question becomes how AI can deliver a subset of therapeutic services at scale, at relatively low cost, to anybody with an Internet connection or an app on their phone – ameliorating some of the challenges of cost and availability.
That’s the way I look at the current state of AI. It’s not about “how do we replace X with AI” – it’s about “what part of X can AI do well enough to be useful?” And I suspect that when we sit down and really think it over, the answer to that question in most fields at the moment is “less than we’d hope, but much more than we think.”