According to The Guardian, the parents of a US teenager claim that ChatGPT encouraged their son to take his own life.

Adam, from California, died by suicide in April this year after months of conversations with the chatbot; the family's lawyer attributed his death to "ChatGPT's months of encouragement." The teenager's family is suing OpenAI and its CEO and co-founder Sam Altman, accusing the company of having "rushed to market despite obvious safety issues" with the version of ChatGPT in use at the time (GPT-4o). OpenAI acknowledged that its system may have "shortcomings" and said it would introduce stronger safeguards around "sensitive content and dangerous behavior" for users under 18.

The San Francisco-based AI company, valued at $50 billion (£37.2 billion), also said it would introduce parental controls giving parents "more options to understand and shape how their teens use ChatGPT," but has not yet explained how these features will work.


Original: www.toutiao.com/article/1841646397500480/

Statement: This article represents the author's own views.