China’s 360 explains how generative AI adapts to censorship during unveiling of its ChatGPT rival
- If a user inputs a ‘sensitive word’, the chatbot’s filtering and moderation system will end the conversation immediately, in compliance with Beijing’s censorship rules
- China’s local ChatGPT-style services, including Ernie Bot from online search giant Baidu, have embedded certain functions to avoid answering sensitive questions
A Chinese tech firm’s explanation of how its ChatGPT-like service complies with China’s strict censorship regime has offered a rare glimpse into how generative artificial intelligence (AI) operates in a tightly controlled internet environment.
360 Security Technology unveiled its ChatGPT-like technology, called Zhinao, or “intelligent brain”, at a ceremony in Beijing on Tuesday, where founder and chairman Zhou Hongyi referenced an embedded “multilevel filtering and moderation” system in his presentation.
If a user inputs a “sensitive word”, the chatbot will end the conversation immediately. The company maintains a list of words and phrases banned under the ruling Chinese Communist Party’s strict censorship system. The list is updated regularly, with checks by human moderators and a review by the public security department, according to a slide from Zhou’s presentation that the company uploaded to its official social media account.
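The behaviour described above, matching user input against a maintained blocklist and terminating the session on a hit, can be sketched in a few lines. This is purely illustrative: the names (`SENSITIVE_TERMS`, `moderate`) and the single-pass matching are assumptions for the sketch, not 360's actual multilevel system.

```python
from typing import Optional

# Hypothetical stand-in for the regularly updated blocklist the
# article says is maintained by human moderators.
SENSITIVE_TERMS = {"example banned phrase"}

def moderate(user_input: str) -> Optional[str]:
    """Return a reply, or None to signal the conversation should end."""
    lowered = user_input.lower()
    if any(term in lowered for term in SENSITIVE_TERMS):
        return None  # blocklist hit: end the conversation immediately
    # Placeholder for the normal response path.
    return "cannot answer the question for now"
```

A production filter would be far more involved (multiple moderation stages, fuzzy and homophone matching, human review queues), but the core contract, a terminal refusal rather than a generated answer, is what the slide describes.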
When asked on Thursday whether Chinese President Xi Jinping supports the development of generative AI, the chatbot replied that it “cannot answer the question for now” and ended the conversation. Prompts mentioning other Chinese political figures, including Premier Li Qiang and his predecessor Li Keqiang, were cut off in the same way.
360 made its product accessible only via an invitation code, a restriction common among generative AI products offered by Chinese tech giants.
The chatbot, 360 Security Technology’s latest effort to capitalise on the success of OpenAI’s ChatGPT, touts content security as one of its advantages.