- Peter Chan Kin-yan, founder of psychology organisation Treehole HK, and psychiatry scholar Dr Tim Li Man-ho explain the benefits – and downsides – of this new technology
- Every week, Talking Points gives you a worksheet to practise your reading comprehension with exercises about the story we’ve written
“It seems like something is bothering you. Would you mind sharing what happened?”
As you type your response into the MindForest app, the chatbot generates answers tailored to your input and provides relevant psychological information to help untangle your thoughts.
The app, which combines psychology and artificial intelligence (AI), is the brainchild of Peter Chan Kin-yan. He is the founder of Treehole HK, a psychology organisation dedicated to offering large-scale digital interventions to address the city’s alarming mental health situation.
“Our traditional psychology model emphasises one-on-one therapy sessions. It’s like having a mountain on fire and sending firefighters with extinguishers to put out the flames,” Chan said.
“But that alone is not enough; what we truly need is a far-reaching mental health intervention ... This is what inspired us to integrate technology into psychology to generate widespread benefits,” said Chan, who launched MindForest last November.
His product offers chatbot functions covering four key areas: the workplace, relationships, personal growth, and health. It can interact and respond in both Chinese and English.
“I believe that AI and practitioners share certain similarities, such as possessing rich psychological knowledge and skills in questioning and answering, which can help individuals achieve personal growth,” said the psychology graduate from the University of Hong Kong.
Chan stressed that the app is not intended to replace therapists but to serve as a useful tool for self-reflection.
“In the end, what facilitates your growth is not the response provided by the AI but how deeply you think during the process of giving instructions to the chatbot. This dynamic is rarely observed in real-life interactions with a therapist, as we often expect the therapist to take the lead.”
He highlighted a function of the app called Insight Journal. It is based on journaling as a psychological intervention and emphasises the therapeutic effect of documenting and sharing one’s thoughts.
“For teens, starting a journal may come with self-doubt, such as concerns about their writing skills or not knowing where to begin. However, this AI-powered app will guide you in expressing yourself and help you clear your mind,” Chan said, adding that the app also offers resources such as interactive self-development courses.
So far, the app has 2,000 monthly active users, and the feedback has been positive. Apart from working to provide more diverse psychological resources, his team emphasises responsible technology to give users a sense of security.
“We underwent intensive training and trials to ensure the app doesn’t provide harmful content or messages to our users. We are also implementing end-to-end encryption to ensure the privacy and protection of all conversations on the app,” he explained.
Useful tool for the basics
Since the debut of ChatGPT in 2022, many digital platforms have emerged, leveraging generative AI to offer psychological assistance.
“I believe AI is indispensable in our everyday lives ... and we need to embrace the technology,” said Dr Tim Li Man-ho, assistant professor at the Chinese University of Hong Kong’s Department of Psychiatry. He explained that AI apps can offer basic help, saying, “[they] can fill the service gap [and can be] useful in psychology education and low-intensity treatment.”
Commenting on the emergence of AI-powered mental health platforms, Li said: “It offers convenient and speedy assistance, and if users need further help, we can offer face-to-face or professional support accordingly. This digital way can fill the gap for those who refuse to seek help or are waiting for advice.”
“It can help record our everyday health, mood, and activity, which allows professionals to obtain a deeper understanding of their clients and provide more comprehensive therapy,” explained the scholar, who is interested in AI and digital mental health.
Protect your privacy
While Li sees a promising future for AI in psychology, he acknowledged that strong evidence of its effectiveness is still lacking and that the technology comes with safety concerns.
“It could be risky to apply AI in practical usage as it learns [by collecting massive amounts of] data, which may lead to inappropriate responses that mislead users,” he said. “This sets the technology apart from practitioners who have been trained to address people’s problems ... It requires time to modify the AI and enhance its safety.”
He suggested opting for AI apps introduced by authorities or recognised psychological organisations. He also emphasised the importance of privacy and responsibility when using these apps, cautioning users not to disclose excessive personal information, as most chats are uploaded to the cloud, where safety and privacy cannot be guaranteed.
Li also warned youngsters not to become overly attached to these apps. “While AI chatbots can help us sort out our thoughts and provide a convenient way to express our emotions, it is crucial to understand that these are tools to assist us and not a panacea.”
He added: “After all, AI has its limitations, and it cannot replace the people around you. Teens should maintain their social circles and know when to turn to professionals for further assistance.”
To test your understanding of this story, download our printable worksheet or answer the questions in the quiz below.