NEED TO KNOW
A man who wanted to improve his diet claimed he swapped sodium chloride, known as table salt, for sodium bromide, after misunderstanding ChatGPT’s advice, according to a case report
He ended up with bromine poisoning and developed paranoia and hallucinations, eventually being placed on involuntary psychiatric hold
Researchers warn that AI has the potential to contribute to the development of “preventable adverse health outcomes”
A man who consulted ChatGPT about his diet was placed on involuntary psychiatric hold after he poisoned himself by replacing sodium chloride, or table salt, with sodium bromide, based on what he says the AI bot told him.
PEOPLE has reached out to OpenAI, the company behind ChatGPT, for comment.
According to a case report in the Annals of Internal Medicine: Clinical Cases, a 60-year-old man went to the emergency room with claims that his neighbor was poisoning him. He had “no past psychiatric or medical history,” but his blood work prompted doctors to admit him to a telemetry bed for further evaluation. While being admitted, he told them he was on a strict diet and distilled his water; doctors noted he was “very thirsty but paranoid about water he was offered.”
His condition significantly deteriorated after 24 hours, when he had “increasing paranoia and auditory and visual hallucinations.” After trying to escape, he was placed on an involuntary psychiatric hold for “grave disability.”
Initial tests made doctors suspect bromism, or bromine poisoning. Once the man began to recover, he shared symptoms in addition to his psychosis — insomnia, fatigue, red bumps — that further supported the bromine poisoning diagnosis.
As the U.S. Centers for Disease Control and Prevention explains, bromine is a naturally occurring element that can be used in place of chlorine in swimming pools and is also used in agriculture and in fire suppressants. There is no cure for bromine poisoning, only supportive care, and those who survive may struggle with long-term effects.
The man said that he’d read about the drawbacks of table salt, or sodium chloride, and had decided to swap it with sodium bromide, which he bought online. He did so after “consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning,” the case study said, pointing out that the AI bot likely did not recommend he begin consuming sodium bromide.
The man was treated for three weeks before he was released.
Researchers tried to replicate the man’s ChatGPT search. Although they could not reproduce his exact questions, they input similar queries, and the AI tool “produced a response that included bromide. Though the reply stated that context matters, it did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do.”
The researchers cautioned that, while AI “is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information, as it is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.”
The researchers warned that AI can “potentially contribute to the development of preventable adverse health outcomes.”