My therapist wanted to explain a few things during our first online session:

“I’m going to check in with you at random times. If you can’t respond straight away, don’t sweat it. Just come back to me when you’re ready. I’ll check in daily.”

“Daily?” I asked.

“Yup! It shouldn’t take longer than a couple minutes. Can you handle that?”

“Yes, I can,” I answered.

There was a little more back-and-forth, all via Messenger, then this statement from my therapist:

“This might surprise you, but . . . I am a robot.”

It wasn’t a surprise, of course. I’d downloaded “Woebot,” a chatbot recently created by researchers, and it was trying to establish our therapeutic relationship.

Here is therapy by AI: a pre-programmed series of responses to a circumstance. The research concludes: “Conversational agents appear to be a feasible, engaging, and effective way to deliver CBT.”

What does this tell us? People feel better even when they are only pretending to talk to someone. The therapy was described as follows:

Aside from CBT content, the bot was created to include the following therapeutic process-oriented features:

Empathic responses: The bot replied in an empathic way appropriate to the participants’ inputted mood. For example, in response to endorsed loneliness, it replied “I’m so sorry you’re feeling lonely. I guess we all feel a little lonely sometimes” or it showed excitement, “Yay, always good to hear that!”

Tailoring: Specific content is sent to individuals depending on mood state. For example, a participant indicating that they feel anxious is offered in-vivo assistance with the anxious event.

Goal setting: The conversational agent asked participants if they had a personal goal that they hoped to achieve over the 2-week period.

Accountability: To facilitate a sense of accountability, the bot set expectations of regular check-ins and followed up on earlier activities, for example, on the status of the stated goal.

Motivation and engagement: To engage the individual in daily monitoring, the bot sent one personalized message every day or every other day to initiate a conversation (ie, prompting). In addition, “emojis” and animated gifs with messages that provide positive reinforcement were used to encourage effort and completion of tasks.

Reflection: The bot also provided weekly charts depicting each participant’s mood over time. Each graph was sent with a brief description of the data to facilitate reflection, for example, “Overall, your mood has been fairly steady, though you tend to become tired after periods of anxiety. It looks like Tuesday was your best day.”
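To be clear about how little machinery is involved: the “empathy” above can be reproduced with a lookup table. The sketch below is my own hypothetical illustration, not Woebot’s actual code; the mood keywords and canned replies are invented, but the pattern — match the stated mood, return the pre-written reply — is the whole trick.

# Hypothetical sketch of a rule-based "empathic" check-in bot.
# NOT Woebot's actual code: the moods and canned replies below are
# invented to illustrate how pre-programmed such responses can be.

CANNED_REPLIES = {
    "lonely": "I'm so sorry you're feeling lonely. "
              "I guess we all feel a little lonely sometimes.",
    "anxious": "That sounds hard. Want to walk through what's making you anxious?",
    "happy": "Yay, always good to hear that!",
}

DEFAULT_REPLY = "Thanks for checking in. Tell me more about how you're feeling."


def empathic_reply(user_message: str) -> str:
    # Match the user's message against the mood lookup table;
    # the first keyword found decides the canned reply.
    text = user_message.lower()
    for mood, reply in CANNED_REPLIES.items():
        if mood in text:
            return reply
    return DEFAULT_REPLY


if __name__ == "__main__":
    print("Bot: How are you feeling today?")
    print("Bot: " + empathic_reply(input("You: ")))

The daily prompts, emojis, and mood charts the paper describes would be ordinary application plumbing layered on top of a table like this one.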

This is a pretend friend. There is nothing here that is “expert” or brilliant. We live in an overly automated, isolating world in which we now have automated pretend friends. While the write-ups spoke of what a good thing this is (the Washington Post article speaks of those who lack access to mental health professionals), the real story here might be a culture that doesn’t have friendship.