Alice Hendy MBE on using AI chatbots as a therapeutic tool

As chatbots are being programmed to talk, laugh, sing, and become more lifelike overall, Gen Zers are increasingly turning to them for advice and comfort.

SCREENSHOT media asked Alice Hendy MBE about the risks of relying on chatbots for therapeutic purposes.

“One of the primary risks of relying on AI as a therapeutic tool is the potential for the dehumanisation of the therapeutic process. In traditional therapy, the therapeutic relationship between the therapist and the client is central to the healing process. This human connection allows for empathy, understanding, and the ability to tailor treatment to the unique needs of the individual. By relying on AI as a replacement for human therapists, there is a risk of losing this essential element of the therapeutic process, potentially leading to a reduction in the quality of care and the depth of healing experienced by clients,” Hendy explained.

“Another risk of relying on AI as a therapeutic tool is the potential for over-reliance on technology to solve complex human problems. While AI can undoubtedly offer valuable insights and support in therapy, it is essential to recognise that human emotions, experiences, and relationships are inherently complex and multifaceted. The limitations of AI in understanding and responding to the nuances of human experience must be acknowledged, and human therapists should continue to play a central role in the therapeutic process.”

Another reason critics are sceptical of AI is the question of whether it can detect complex mental health conditions, provide culturally sensitive support, and read between the lines of what is being said rather than taking statements at face value.

Contemplating these questions, Hendy replied: “AI has the potential to detect signs of risky behaviour and alert individuals or relevant authorities. However, the effectiveness of AI in this regard depends on the quality and accuracy of the data it is trained on, as well as the ethical considerations in determining what constitutes risky behaviour.”

“It is important to approach this with sensitivity and awareness of potential biases in AI algorithms,” the mental health advocate warned.

Considering the motivations that drive young people towards artificial intelligence, however, we also wanted to shine a light on some of the advantages users can gain from turning to chatbots for (minor) emotional concerns.

“One of the key advantages of using chatbots as a therapy tool is their accessibility. Many individuals may not have access to traditional therapy due to financial or logistical barriers. By utilising chatbots, individuals can seek support and guidance at any time and from any location, breaking down the barriers that may prevent them from seeking help,” Hendy emphasised.

“Furthermore, chatbots have the potential to provide continuous support. Traditional therapy sessions are often limited to scheduled appointments, leaving individuals to navigate their mental health challenges on their own in between sessions. Chatbots, on the other hand, can provide continuous support and guidance, offering a consistent source of assistance to those in need.”

Nevertheless, these advantages bring us back to the experts’ initial warnings.

“It’s important to note that while chatbots have the potential to offer valuable support, they should not be seen as a replacement for traditional therapy. Human connection and empathy play a crucial role in therapy, and chatbots cannot fully replicate these aspects,” she said.

Hendy’s point about the importance of authentic human connection becomes the anchor to hold on to in the debate around AI as therapists, considering that the nuances of human emotions and the essential empathy and understanding provided by human therapists cannot be fully replicated by chatbots.