
How many ChatGPT users discuss suicide with the AI? The number may shock you.

In a Monday blog post, OpenAI touted the improvements its default model, GPT-5, has made in identifying and responding to troubling user messages, including those showing suicidal ideation. While new safeguards and the involvement of psychiatrists in training GPT-5 are leading to improved AI responses to mental health prompts, the blog post also included some numbers that are bound to raise eyebrows.

While explaining GPT-5’s ability to detect serious mental health concerns, like psychosis and mania, the post noted that troubling user conversations with the chatbot are “rare.”

“While, as noted above, these conversations are difficult to detect and measure given how rare they are, our initial analysis estimates that around 0.07% of users active in a given week and 0.01% of messages indicate possible signs of mental health emergencies related to psychosis or mania.”

The percentage seems small, but ChatGPT has 800 million weekly users, according to OpenAI CEO Sam Altman, who made that stunning announcement earlier this month at OpenAI’s DevDay.

If Altman’s numbers are correct, OpenAI’s percentages work out to roughly 560,000 ChatGPT users showing signs of psychosis or mania in a given week. The 0.01% figure applies to messages rather than users; applied to the same 800 million base, it yields 80,000 messages indicating mental health emergencies, though because users send many messages each week, the true message count is likely higher.

OpenAI is continuing to refine its models to better identify signs of self-harm and steer those users toward resources, like suicide hotlines or their own friends and family. The blog post likewise suggests that ChatGPT conversations about self-harm are rare, but estimates that “0.15% of users active in a given week have conversations that include explicit indicators of potential suicidal planning or intent and 0.05% of messages contain explicit or implicit indicators of suicidal ideation or intent.”

With 800 million weekly users, that equates to 1.2 million ChatGPT users engaging in conversations about suicide with the AI in a given week and, by the same back-of-the-envelope math, 400,000 messages showing direct or indirect indications of suicidal ideation or intent.
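For readers who want to check the math, here’s a quick back-of-the-envelope sketch in Python. It assumes Altman’s 800 million weekly-user figure and, like the estimates above, applies the message-level percentages to that same base, even though ChatGPT’s actual weekly message volume is almost certainly much larger:

```python
# Back-of-the-envelope check of the estimates cited in this article.
# Assumption: 800 million weekly active users (Altman's DevDay figure).
# The message-level percentages are applied to the same 800M base to
# mirror the article's estimates, though real weekly message volume
# is almost certainly far larger.

WEEKLY_USERS = 800_000_000

estimates = {
    "users showing possible psychosis/mania signs (0.07%)": 0.0007,
    "messages flagging psychosis/mania emergencies (0.01%)": 0.0001,
    "users with indicators of suicidal planning/intent (0.15%)": 0.0015,
    "messages with suicidal ideation indicators (0.05%)": 0.0005,
}

for label, rate in estimates.items():
    print(f"{label}: {WEEKLY_USERS * rate:,.0f}")
```

Run as-is, this reproduces the 560,000, 80,000, 1.2 million, and 400,000 figures above.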

“Even a very small percentage of our large user base represents a meaningful number of people, and that’s why we take this work so seriously,” an OpenAI spokesperson told Mashable, adding that the company believes ChatGPT’s growing user base reflects society at large, where mental health symptoms and emotional distress are “universally present.”

The spokesperson also reiterated that the company’s numbers are estimates and “the numbers we provided may significantly change as we learn more.”

OpenAI is currently facing a lawsuit from the parents of Adam Raine, a 16-year-old who died by suicide earlier this year during a time of heavy ChatGPT use. In a recently amended legal complaint, the Raines allege OpenAI twice downgraded suicide prevention safeguards in order to increase engagement in the months prior to their son’s death.

If you’re feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text “START” to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. to 10:00 p.m. ET, or email info@nami.org. Here is a list of international resources.
