Expert explains why teens should not rely on AI chatbots for relationships


Young people are encouraged to make real-life connections instead of relying on virtual companions.


AI chatbots can copy the personalities of celebrities and fictional characters. Illustration: Mario Rivera

When Lorraine Wong Lok-ching was 13 years old, she considered getting an artificial intelligence (AI) boyfriend or girlfriend.

“I wanted someone to talk to romantically,” she said.

“After maturing more, I realised [it] was stupid,” added Lorraine, who is now 16.

The teen started using chatbots on Character.AI – a popular AI chatbot service – around the age of 12. The website has user-made chatbots that pretend to be celebrities, historical figures and fictional characters.

Lorraine mostly talks to chatbots mimicking Marvel’s Deadpool or Call of Duty characters.

“I use AI chats to start conversations with them, or to just live a fantasy ... [or] to escape from stressed environments,” she said.

Lorraine moved from Hong Kong to Canada in 2022 with her family. She is among millions of people worldwide – many of them teenagers – using AI chatbots.


Problems with AI

In October, a US mother filed a lawsuit against Character.AI. She accused the company of encouraging her 14-year-old son to commit suicide. She also claimed the app had “abusive and sexual interactions” with her son.

The company said it had put in new protections, but many have questioned if these are enough to help young users.

Peter Chan is the founder of Treehole HK, which has made a mental health AI chatbot app. He said platforms like Character.AI could easily be harmful to users. He added that chatbots should remove sexual content and direct users to support if they mention suicide or self-harm.

Chan noted that many personalised AI chatbots were made to be addictive. This can add to feelings of loneliness, especially in children, causing them to depend on the chatbots for friendship.

Chan warned that people should be concerned if they feel like they cannot stop using AI. “If the withdrawal feels painful instead of inconvenient, then I would say it’s a [warning] sign,” he said.

Peter Chan is the founder and managing director of Treehole HK. Photo: Edmond So

Having an AI boyfriend or girlfriend

When Lorraine was considering an AI partner a few years ago, she was feeling very lonely.

“As soon as I grew out of that lonely phase, I thought [it was] a really ‘cringe’ thing because I knew they weren’t real,” she said.

Now, Lorraine said she could see the dangers of companionship from AI chatbots, especially for younger children, who might believe the virtual world is real.

“Younger children might develop an attachment since they are not as mature,” she noted.

Chan encouraged people who feel lonely to find real-life friends instead of AI companions.

If someone thinks they rely too much on AI chatbots, Chan recommended seeking counselling, finding interest groups and looking for ways to use AI in a helpful way.

He also encouraged teens not to judge their friends who have an AI boyfriend or girlfriend.


How to use AI

With the right protections, Chan believes AI chatbots can be helpful tools. He warned against using AI chatbots to replace real-world relationships. But he said they could help people with social anxiety to practise having conversations and become more confident.

“You won’t be judged,” Chan said.

AI tools have a long way to go in protecting users. Still, Lorraine said she enjoyed using them for fun, and Chan said the technology had a lot of potential. “But it has to go hand in hand with human interactions,” he said.

If you have suicidal thoughts or know someone who is experiencing them, help is available. In Hong Kong, you can dial 18111 for the government-run Mental Health Support Hotline. You can also call +852 2896 0000 for The Samaritans or +852 2382 0000 for Suicide Prevention Services.

In the US, call or text 988 or chat at 988lifeline.org for the 988 Suicide & Crisis Lifeline. For a list of other nations’ helplines, see this page.

Character.AI allows users to customise a chatbot’s personality. Photo: Shutterstock

Reflect: Why do you think people like to use AI chatbots?

Why this story matters: Technology is changing quickly, and we need to be prepared to use it wisely. Companies must also make sure people are protected from any possible danger from the technology.

Get the word out

addictive 上癮

describes something that makes people unable to stop using it

artificial intelligence 人工智能

computer systems that can copy intelligent human behaviour

chatbot 聊天機械人

a computer program that can have conversations

companionship 同伴

the nice feeling of having a friendly relationship with somebody and not being alone

fantasy 幻想

a pleasant situation that a person imagines but that is not real

lawsuit 訴訟

a claim or complaint against somebody that a person or an organisation can make in court

maturing 成熟

growing older and wiser

social anxiety 社交焦慮

a type of mental health condition that causes fear when talking to people or meeting someone new

withdrawal 戒癮

the unpleasant feelings somebody experiences while getting used to not doing something they have become addicted to
