A medical journal has warned against relying on ChatGPT for medical advice after a man developed a rare condition by following the chatbot's instructions for removing salt from his diet. According to the Annals of Internal Medicine, the 60-year-old man was hospitalized with bromism, also known as bromide toxicity, after ChatGPT recommended he replace the sodium chloride (table salt) in his diet with bromide, a compound used as a sedative in the early 20th century. Three months after making the switch, the man checked himself into a hospital and told staff he believed his neighbor was poisoning him. Doctors noted he was paranoid about drinking water and displayed numerous symptoms of bromism, including facial acne, extreme thirst, and insomnia. Within 24 hours of being admitted he tried to escape the hospital, and he was subsequently treated for psychosis. The patient later said ChatGPT had influenced him to add bromide to his diet, and doctors found that the chatbot provided no health warning and did not ask why the information was being sought. The Annals of Internal Medicine article acknowledged that AI could be a “bridge” between the medical community and the general public, but warned of the dangers of the “decontextualized information” such tools present and advised doctors to consider AI use when asking patients where they obtained their health information.
Read it at The Guardian
