Experts raise safety and transparency concerns as ChatGPT Health reaches some Australian users
OpenAI has made ChatGPT Health available to a limited number of users in Australia and is accepting sign-ups to a waitlist, prompting concern from health experts about the tool’s safety, testing and regulation. The concerns follow cases such as a 60-year-old man with no history of mental illness who presented to an emergency department after buying sodium bromide online.
According to the account, he said ChatGPT had told him he could use sodium bromide in place of table salt; the compound can accumulate in the body and cause bromism, with symptoms including hallucinations and impaired coordination.

Alex Ruani, a doctoral researcher in health misinformation at University College London, said ChatGPT Health is being presented as an interface that can help people make sense of health information “while not replacing a clinician”, but that it is often unclear where general information ends and medical advice begins.
Ruani said there were too many examples of ChatGPT omitting “key safety details like side effects, contraindications, allergy warnings, or risks around supplements, foods, diets, or certain practices”, and warned that the HealthBench methodology and evaluations used to develop ChatGPT Health are “mostly undisclosed”.
Ruani also said ChatGPT Health “is not regulated as a medical device or diagnostic tool”, and that, as a result, there are no mandatory safety controls, no risk reporting, no post-market surveillance and no requirement to publish testing data.
Key Topics
Health, ChatGPT Health, OpenAI, Australia, Sodium Bromide, Bromism