A 60-Year-Old Man Who Turned To ChatGPT For Diet Advice Ended Up Poisoning Himself And Landed In The Hospital

zeeforce


As ChatGPT continues to evolve as a tool, many people have come to rely on it heavily, not just to streamline work and everyday tasks but also for deeply personal matters. We have seen bizarre cases of marriages ending after the chatbot surfaced evidence of infidelity, and of people turning to it for therapy, something Sam Altman himself has warned against. Now a troubling case highlights the risk of relying on the AI assistant for health guidance: a 60-year-old man in New York reportedly developed a rare and dangerous condition after following dietary advice from the model.

The hidden dangers of AI health tips: Bromism and the salt alternative that went too far

Warnings against relying too heavily on AI models are nothing new, but the latest comes from a U.S. medical journal, which cautions against seeking medical advice from ChatGPT after a man fell seriously ill following the chatbot's dietary guidance, as highlighted by NBC News. The case report, published in the Annals of Internal Medicine, describes a 60-year-old man who developed bromism, also known as bromide poisoning, after acting on the AI assistant's suggestions.

The man developed the condition after replacing common table salt (sodium chloride) with sodium bromide, a substance he purchased online after asking ChatGPT for a potential salt substitute. He consumed sodium bromide daily for three months, believing it to be the healthier alternative. The outcome was devastating: he developed paranoia, insomnia, psychosis, and other severe physical symptoms, at one point becoming convinced that a neighbor was poisoning him.

He was eventually hospitalized, where doctors diagnosed bromism, a toxic reaction to excessive bromide exposure. Once he stopped consuming bromide and received treatment, his symptoms began to resolve. Although the condition is rarely seen today because bromide salts are no longer widely available, the case is a reminder that such substances can still be purchased online with little oversight.

The patient's story also underscores the need for caution when using AI in health contexts. The researchers tested whether ChatGPT would actually give such responses, and it did indeed suggest bromide as an alternative to chloride, with no warning about its toxicity, which makes the responses all the more concerning. Chatbots cannot match the judgment and accountability of medical professionals, and they should not be treated as experts in the field.

Meanwhile, AI models need stronger safety guardrails, especially around sensitive topics. In an age of AI tools, curiosity should always be tempered by caution and should never outweigh professional medical advice.


