AI Can Be Your Food Coach… Well, Perhaps Not

September 5, 2025

Is this better or worse than putting glue on pizza? TechSpot reveals yet another severe consequence of trusting AI: “Man Develops Rare 19th-Century Psychiatric Disorder After Following ChatGPT’s Diet Advice.” Writer Rob Thubron tells us:

“The case involved a 60-year-old man who, after reading reports on the negative impact excessive amounts of sodium chloride (common table salt) can have on the body, decided to remove it from his diet. There were plenty of articles on reducing salt intake, but he wanted it removed completely. So, he asked ChatGPT for advice, which he followed. After being on his new diet for three months, the man admitted himself to hospital over claims that his neighbor was poisoning him. His symptoms included new-onset facial acne and cherry angiomas, fatigue, insomnia, excessive thirst, poor coordination, and a rash. He also expressed increasing paranoia and auditory and visual hallucinations, which, after he attempted to escape, ‘resulted in an involuntary psychiatric hold for grave disability.’”

Yikes! It was later learned that ChatGPT had suggested he replace table salt with sodium bromide. That resulted, unsurprisingly, in this severe case of bromism, a malady that has not been common since the 1930s. Maybe ChatGPT confused the man's diet with hot-tub maintenance or oil-and-gas drilling, contexts where sodium bromide is actually used. Or perhaps its medical knowledge is just a bit out of date. Either way, this sad incident illustrates what a mistake it is to rely on generative AI for important answers. The patient was not the only one here with hallucinations.

Cynthia Murrell, September 5, 2025
