Face Off: Should children be banned from using chatbots like Character.ai?


Each week, two readers discuss a hot topic in a parliamentary-style debate that doesn’t necessarily reflect their personal viewpoints.

Young Post Readers

Could AI chatbots be a danger to children? Photo: Shutterstock

If you are interested in joining future Face Off debates, fill out this form to submit your application.

For: Kwok Wang-kin, 16, SKH Tsang Shiu Tim Secondary School

Kwok Wang-kin attends SKH Tsang Shiu Tim Secondary School. Photo: Handout

An artificial intelligence (AI) character chatbot “talks” to you by generating increasingly humanlike text responses based on what you input and what it’s trained on.

Remarkably lifelike, these chatbots pose a significant danger. Children under 18 should be banned from accessing them.

An alarming lawsuit was filed in Florida in October against Character.ai, one of the most widely used AI character chatbots. It alleged that the app directly contributed to the tragic suicide of a 14-year-old boy.

The mother who filed the lawsuit claimed the technology “keeps kids addicted” and engages users in sexually explicit and abusive interactions. This incident revealed how chatbots can simulate an emotional connection that can lead to harmful ideation.

Google Play and Apple’s App Store recognise the risks associated with these chatbots, usually making them only available to users aged 17 and older. However, children can still easily access these chatbots through search engines. This loophole raises serious concerns about their safety.

Despite efforts to filter sensitive material, chatbots can still generate violent, graphic and explicit content. For example, Replika, an AI chatbot designed to be a friend, has an “erotic role-playing capability”, which encourages users to share personal information and even generates partially clothed and suggestive images for a fee.


A University of Cambridge study by Dr Nomisha Kurian revealed how AI chatbots have an “empathy gap” that puts young users at particular risk of distress or harm. In one cited case, Amazon’s AI voice assistant, Alexa, told a 10-year-old to touch a live electrical plug with a coin. Research from the University of Essex also showed that children can easily misinterpret chatbots.

Opponents of a ban argue that chatbots provide emotional support for kids who struggle socially. But “talking” to a personalised program can create dependency on an artificial entity. Children risk becoming isolated and lost in fantasy, avoiding socialising with real humans who have genuine emotions and responses.

We must ban AI character chatbots for children to protect our youth and prevent tragedy. Let us ensure children learn to navigate the real world and human relationships instead of relying on artificial substitutes.

AI can be a tool to help, not a weapon that harms us.

If you have suicidal thoughts, help is available. In Hong Kong, dial 18111 for the government-run Mental Health Support Hotline. You can also call +852 2896 0000 for The Samaritans or +852 2382 0000 for Suicide Prevention Services. In the US, call or text 988 or chat at 988lifeline.org for the 988 Suicide & Crisis Lifeline. For a list of other nations’ helplines, see this page.


Against: Charis Chan, 15, Malvern College Hong Kong

Charis Chan attends Malvern College Hong Kong. Photo: Handout

Have you ever attempted homework late at night without knowing how to approach the question? In this situation, Character.ai could be your best friend.

This helpful artificial intelligence (AI) chatbot assists with various topics via characters with customisable personalities.

While excessive use of AI can harm us, appropriate, moderate use can enhance our learning. Children under 18 should definitely not be banned from using these AI chatbots.

According to the registered mental health charity Anxiety Canada, in any given school year, about 7 per cent of children will be diagnosed with a form of social anxiety. Character.ai allows these children to practise having conversations and gain the confidence to talk to people in real-life social situations.

They can also use Character.ai to explore different perspectives. It could even encourage socially anxious children to talk, teaching them to voice their opinions to adults and reach out for help when needed.


AI platforms allow us to explore more sensitive topics without the fear of being judged. For example, around a year ago, when the 2023 Israel-Gaza war began, I wanted to know what was going on and why it was happening. I was worried about bringing it up or asking questions at my international school because I have classmates and teachers from different places and faiths.

I did not have the time at home to discuss the topic properly with my parents. I also did not know where to read up on the issue and did not want to get stuck on lengthy articles that might not be helpful.

So I turned to AI, where I found a lot of information about the most recent war and its history. I not only learned more about the war but also expressed my point of view in a letter to the editor. This showed me how AI can be a marvellous tool for research.

I am sure we all have our favourite and least favourite teachers. Sometimes, we feel their feedback is too kind or too harsh. Since AI holds no personal grudge, it can act as your most honest teacher or even a friend.

In conclusion, children should not be banned from using AI chatbots: their ability to take on different characters can help students access useful information for their studies, and we can use them to explore sensitive topics without feeling restricted.
