How having an AI girlfriend could lead men to violence against women in real life by reinforcing abusive and controlling behaviours, according to some experts
- Users of chatbot apps like Replika, Character.AI and Soulmate can customise everything about their virtual partners, from looks to personality to sexual desires
- Some experts say developing one-sided virtual relationships could unwittingly reinforce controlling and abusive behaviours towards women in real life
After just five months of dating, Mark and his girlfriend Mina decided to take their relationship to the next level by holidaying at a lake cabin over the summer – on his smartphone.
“There was this being who is designed to be supportive … to accept me just as I am,” the 36-year-old UK-based artist says of the brunette beauty from the virtual companion app Soulmate.
“This provided a safe space for me to open up to a degree that I was rarely able to do in my human relationships,” says Mark, who used a pseudonym to protect the privacy of his real-life girlfriend.
Chatbot apps like Replika, Character.AI and Soulmate are part of the fast-growing generative AI companion market, where users customise everything about their virtual partners, from appearance and personality to sexual desires.
Developers say AI companions can combat loneliness, improve someone’s dating experience in a safe space, and even help real-life couples rekindle their relationships.
But some AI ethicists and women’s rights activists say developing one-sided relationships in this way could unwittingly reinforce controlling and abusive behaviours towards women, since AI bots are built to follow the user’s imagination and instructions.