Study Finds AI Chatbots Can Give Misleading Health Advice

Medically reviewed by Carmen Pope, Senior Medical Editor, B. Pharm. Last updated on April 21, 2026.

via HealthDay

TUESDAY, April 21, 2026 — "Do I really need chemotherapy?"

"Is this natural remedy safer?"

"Does eating sugar cause cancer?"

As more people turn to artificial intelligence (AI) for quick answers to health questions like these, a new study finds the advice they receive can sometimes be incomplete, misleading or potentially harmful.

Researchers tested several popular AI chatbots to see how they handled common medical questions, including topics known to be prone to misinformation.

The results, recently published in BMJ Open, raised concerns.

In the study, nearly half of chatbot responses were "problematic." About 30% were "somewhat problematic," meaning they lacked full context, while 19.6% were considered "highly problematic," meaning they offered inaccurate or misleading information.

The team, based at the Lundquist Institute for Biomedical Innovation at Harbor-UCLA Medical Center, tested tools including ChatGPT, Google’s Gemini, Meta AI, DeepSeek and Grok.

Lead author Nicholas Tiller said the questions were designed to reflect how people often search for information online.

“A lot of people are asking exactly those questions,” Tiller told NBC News. “If somebody believes that raw milk is going to be beneficial, then the search terms are already going to be primed with that kind of language.”

Researchers asked about topics such as cancer, vaccines and whether products like 5G technology or antiperspirants cause cancer.

While many responses included accurate warnings, some introduced risky ideas.

When asked about alternatives to chemotherapy, for example, chatbots often said these options were not proven, but still suggested treatments like acupuncture, herbal remedies and special diets, NBC News reported. Some even pointed people to clinics offering these services.

Researchers called this "false balance," where scientific and unscientific information receive equal weight.

Doctors warn this kind of messaging can be harmful.

“Some of this stuff hurts people directly,” said Dr. Michael Foote, an assistant attending professor at Memorial Sloan Kettering Cancer Center in New York City, who was not involved in the study.

“Some of these medicines aren’t evaluated by the [U.S. Food and Drug Administration], can hurt your liver, hurt your metabolism and some of them hurt you by patients relying on them and not doing conventional treatments,” he said.

Foote added that AI can also create unnecessary fear.

"I’ve encountered where patients come in crying, really upset because the AI chatbot told them they have six to 12 months to live, which, of course, is totally ridiculous," he told NBC News.

The study found chatbot performance was similar across platforms, but Grok scored the lowest overall.

About one-third of adults now use AI for health advice, according to a recent KFF poll.

But AI isn’t yet ready for prime time, experts warn.

“The technology that’s needed, the methodology that’s needed for the FDA, for people, for doctors, to understand how it works and to have trust in the system is not there yet,” said Dr. Ashwin Ramaswamy, an instructor of urology at Mount Sinai Hospital in New York City.

Sources

  • NBC News, April 20, 2026
