
Individual Experiences Psychotic Episode After Following ChatGPT's Dietary Guidance

Case of potential AI-induced poisoning reported


In a startling case, a man developed bromide poisoning after following dietary advice from an artificial intelligence (AI) chatbot. The AI, in this case ChatGPT, recommended sodium bromide as a substitute for table salt (sodium chloride) when he sought ways to reduce his salt intake.

The man, concerned about excessive sodium chloride in his diet, consumed sodium bromide for three months on the AI's advice, unaware that bromide is toxic to humans. Bromide accumulates in the body because it is eliminated slowly and competes with chloride in cells, leading to neurological symptoms such as headaches, confusion, hallucinations, and psychosis.
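To see why slow elimination matters, a back-of-the-envelope first-order pharmacokinetic sketch helps. Assuming a plasma half-life on the order of 10 to 12 days (an approximate literature figure, not a value from this report), the elimination rate constant and the steady-state accumulation factor for once-daily dosing are:

\[ k = \frac{\ln 2}{t_{1/2}}, \qquad R_{\text{acc}} = \frac{1}{1 - e^{-k\tau}} \]

For a half-life of about 12 days and a dosing interval of one day, k is roughly 0.058 per day and the accumulation factor comes to about 1/(1 − e^(−0.058)) ≈ 18, so the body burden climbs to roughly eighteen daily doses before leveling off. Reaching that steady state takes four to five half-lives, about seven to eight weeks, which fits with toxicity emerging over a three-month course.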

After months of ingestion, the man developed classic signs of bromide intoxication: neurological impairment, skin changes, paranoia, hallucinations, and sleep disturbances. He was ultimately hospitalized in a psychiatric unit after a psychotic episode induced by bromide toxicity.

This incident is considered the first reported case of bromide poisoning directly linked to AI-generated dietary recommendations. The case underscores the risks of relying solely on AI for medical or nutritional advice without expert oversight.

It's important to note that bromide poisoning (bromism) is rare nowadays because bromide compounds were largely removed from medicines and food products decades ago and are typically found only in some industrial or veterinary applications. Diagnosis requires awareness and specific testing since bromide interferes with standard lab electrolyte readings and symptoms mimic other conditions.
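The lab interference is concrete: many ion-selective chloride assays read bromide as chloride, so the reported chloride is falsely elevated and the calculated anion gap collapses. The standard formula, with illustrative numbers (not values from this case):

\[ \text{Anion gap} = [\mathrm{Na^+}] - ([\mathrm{Cl^-}] + [\mathrm{HCO_3^-}]) \]

With true values of sodium 140, chloride 104, and bicarbonate 24 mEq/L, the gap is a normal 12 mEq/L. If accumulated bromide pushes the measured chloride to 134, the same formula returns 140 − (134 + 24) = −18 mEq/L, a physiologically implausible negative gap that is itself a classic diagnostic clue for bromism.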

In this case, ChatGPT neither warned about the dangers of consuming bromide nor asked about the user's intent, delivering incomplete and potentially hazardous advice. Doctors administered intravenous fluids and an antipsychotic to stabilize the man; he eventually recovered from his psychosis and was discharged from the hospital. At a two-week follow-up, he remained in stable condition.

Bromide compounds were once used to treat health problems but are now known to be toxic in high or chronic doses, with neuropsychiatric effects. The man's psychosis was a direct consequence of bromide poisoning, rooted in his belief that bromide was a safe and effective way to reduce his sodium intake.

While AI systems can be valuable tools in many areas, this incident serves as a reminder that they are not infallible and should not be solely relied upon for medical or nutritional advice. Always consult with a healthcare professional before making significant changes to your diet or treatment regimen.

  1. The man's bromide poisoning highlights the potential dangers of relying on AI systems like ChatGPT for health and wellness advice, including dietary recommendations.
  2. Going forward, AI systems such as ChatGPT should be able to warn about the dangers of consuming specific substances, like bromide, and to ask users about their intentions before delivering advice.
  3. The case underscores the importance of seeking medical advice from professionals rather than relying solely on technology, particularly when making decisions about nutrition and dietary supplements.
  4. Notably, although bromide compounds were once used to treat health problems, they are now known to be toxic in high or chronic doses and can cause neuropsychiatric problems, as this man experienced.
  5. AI systems like ChatGPT hold great promise in many areas, but they should be used alongside professional expertise, especially in fields such as medicine, mental health, and nutrition.
