August 9, 2025

A man ended up in a psychiatric hospital after a dietary consultation with ChatGPT

Vlad Cherevko. I have been interested in electronics and technology of all kinds since 2004. I enjoy playing computer games and understand how various gadgets work. I regularly follow world technology news and write about it.

In the US, a case of psychosis caused by bromide poisoning has been recorded; the man had taken the substance for three months on the recommendation of ChatGPT. This was reported by University of Washington doctors in Annals of Internal Medicine: Clinical Cases, Gizmodo writes.

The patient went to the hospital claiming that his neighbor had poisoned him. Although his vital signs were within the normal range, the man showed paranoid behavior, refused water he was offered despite being thirsty, and had auditory and visual hallucinations. He subsequently developed full-blown psychosis. In this state, the man tried to escape from the doctors, after which he was forcibly hospitalized in a psychiatric ward.

Doctors suspected bromism, bromide poisoning that has become rare since the 1980s, when the substance was removed from medicines because of its toxicity. They began administering intravenous fluids and antipsychotics to stabilize his mental state. After improving, the man said that he had started taking sodium bromide in an attempt to reduce the sodium chloride (table salt) in his diet.

Having found no clear guidance in scientific sources on what to replace salt with, he turned to ChatGPT, which, he said, suggested replacing chloride with bromide. The man then bought the substance online and began taking it. Doctors suggest that ChatGPT may have given the recommendation without proper context and without warning about the risks.

Given the timeline of the case, the man likely used ChatGPT version 3.5 or 4.0. The doctors did not have access to the patient's chat history, but they tested ChatGPT 3.5 themselves, and it did indeed mention bromide as a replacement for chloride. It was probably referring to technical uses, for example in water treatment, not to food. At the same time, the model did not ask about the purpose of the request and did not warn about the substance's toxicity.

The patient stabilized after treatment, was discharged after three weeks, and remained in good condition at a follow-up examination.

Doctors stressed that although AI can be a useful tool, its advice should not replace consultation with a specialist, especially on issues related to health and safety.

Source: mezha.media

