#TherapyTok, currently at 1.6 billion views, has made therapeutic techniques and mental health support more accessible than ever before. Could AI take social media’s therapeutic circle to the next level?

[Image: A woman with dark hair and a digitized man in a green shirt sit on a bench with their backs to the viewer, looking at each other. Credit: Replika]

Front-line therAIpists

Several AI therapy platforms are already available. In May, artificial intelligence startup Inflection AI launched Pi: a friendly bot for emotional support, validation, and simple, conversational companionship. Inflection AI's CEO Mustafa Suleyman told The New York Times that Pi is designed to “know what it does not know,” and that “it shouldn’t try to pretend that it’s human or pretend that it is anything that it isn’t.”

Athena Robinson, chief clinical officer for the AI-driven Woebot Health service, calls AI therapy a “guided self-help ally.” The bot asks questions grounded in scientifically backed practices, and can suggest healthy steps to find peace in the moment or set reminders to meditate or listen to calming music. "It's not a person, but it makes you feel like it's a person, because it's asking you all the right questions," Woebot user Chukurah Ali told NPR in an interview.

Another bot, called Replika, is a self-proclaimed "AI companion who cares, always here to listen and talk, always on your side.” Replika currently has over 2 million users who turn to the app for advice, interview prep, marriage counseling, and even practice before a stressful social interaction. BetterHelp, which grew in popularity during the pandemic for its text-based therapy, now also offers services through a chatbot.

Wysa, another app, offers a free chatbot-only service for customers who can’t afford its teletherapy. The bot is described as “friendly” and “empathetic,” and responds with supportive phrases and suggestions drawn from prompts prewritten by a psychologist trained in cognitive behavioral therapy.

Hidden drawbacks

Many therapists worry that speaking to a bot, while helpful for direct questions and general guidance, will not offer the human level of support that someone dealing with a mental illness or in acute distress may require. Licensed marriage and family therapist Emma McAdam addressed this limitation on her YouTube channel, "Therapy in a Nutshell." “ChatGPT is really good at answering topic-based questions. It’s good at providing information about typical treatment options,” she said in her “Therapist vs. Artificial Intelligence” episode. “But it can never provide a supportive relationship and the motivational supportive structure of actual therapy with a real person.”

Some mental health apps, such as Koko, have tested how users react to AI-generated responses within the platform. As part of an experiment with GPT-3, Koko users were given responses drafted by the bot but approved by a human. The trial found that answers composed with help from GPT-3 were generally rated higher than those written by humans alone. However, echoing McAdam’s point, some people felt the answers were helpful yet lacked the “support” one might get from a real person. Koko co-founder Robert Morris spoke to Fast Company about the trial: “When I get [these responses], it’s like, ‘Yeah, that feels right.’ But I don’t feel supported.”

Bias is also a concern for many experts in the tech and psychological fields. In March, the Distributed AI Research Institute (DAIR) warned that using AI for intimate, interpersonal needs such as therapy “reproduces systems of oppression and endangers our information ecosystem” because of its tendency to generate answers based on biased or discriminatory data. The impact of that bias could be amplified if certain populations turn to chatbots for therapy more often than others. “I think marginalized communities, including rural populations, are more likely to be the ones with barriers to access, so might also be more likely to turn to ChatGPT for their needs, if they have access to technology in the first place,” Washington University in St. Louis psychiatrist Jessica Gold told Vice's tech platform, Motherboard.

[Image: Two hands hold a smartphone showing an avatar with long dark hair and light skin; the right hand touches the screen. Credit: Replika]

The Intelligence take

Venture capital firm Andreessen Horowitz released its "a16z Marketplace 100" report for 2023, which identified mental health as the fastest-growing marketplace category, growing at well over 200%, with health and wellness trailing at around 150%.

If social media democratized therapy, AI could revolutionize it, provided it is developed responsibly and keeps basic services and support available to the masses. As the technology matures, applying artificial intelligence to a field as fast-growing, popular, and lucrative as mental health may change the way consumers seek therapy in the future.

