How close is too close with AI?
Round Table China
Jan 05

As artificial intelligence becomes an intimate part of daily life, questions about emotional dependency, privacy, and ethical design are moving to the forefront. In China, a wave of AI companionship apps has prompted urgent regulatory action to protect users from unseen psychological risks.
China has introduced draft regulations targeting AI-powered emotional companions and therapy bots, aiming to prevent psychological manipulation and safeguard vulnerable users, especially children. The rules mandate clear disclosure of AI identity, strict data privacy controls, and limits on usage time, with parental consent required for minors. These measures respond to rising concerns over emotional dependence, highlighted by tragic cases of minors forming intense bonds with AI. Regulators also seek to bar providers from using personal data to train models without consent, and to require human oversight in sensitive applications. Challenges remain in balancing safety with innovation, however, particularly since blanket age-based restrictions may overlook users' varying levels of digital literacy. Experts suggest that individual user assessments could offer more tailored protection than age rules alone, allowing safeguards to evolve alongside the technology.
03:43
AI must not generate harmful content or exploit emotional vulnerabilities
07:44
AI companions must avoid promoting dependency or exposing users to harmful content
11:35
Two US teenagers, a 16-year-old in California and a 14-year-old boy, suffered severe emotional harm linked to AI companionship
15:13
Regulations should prohibit providers from using user data to train AI models
19:12
Regulations are a foundation, not a magic cure, for AI compliance