ChatGPT Performs Well as 'Partner' in Diagnosing Patients

Medically reviewed by Drugs.com.

By Ernie Mundell HealthDay Reporter

TUESDAY, Dec. 12, 2023 -- Doctors' brains are great decision-makers, but even the smartest physicians might be well-served with a little diagnostic help from ChatGPT, a new study suggests.

The main benefit comes from a thinking process known as "probabilistic reasoning" -- knowing the odds that something will (or won't) happen.

“Humans struggle with probabilistic reasoning, the practice of making decisions based on calculating odds,” explained study lead author Dr. Adam Rodman, of Beth Israel Deaconess Medical Center in Boston.

“Probabilistic reasoning is one of several components of making a diagnosis, which is an incredibly complex process that uses a variety of different cognitive strategies," he explained in a Beth Israel news release. "We chose to evaluate probabilistic reasoning in isolation, because it is a well-known area where humans could use support.”

The Beth Israel team used data from a previously published survey of 550 health care practitioners. All had been asked to perform probabilistic reasoning to diagnose five separate medical cases.

In the new study, however, Rodman's team gave the same five cases to ChatGPT-4, the large language model (LLM) that powers the chatbot.

The cases included information from common medical tests, such as a chest scan for pneumonia, a mammogram for breast cancer, a stress test for coronary artery disease and a urine culture for a urinary tract infection.

Based on that info, the chatbot used its own probabilistic reasoning to reassess the likelihood of various patient diagnoses.
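This kind of reassessment is classic Bayesian updating: combine the pre-test probability of disease with the test's sensitivity and specificity to get a post-test probability. The sketch below shows the general calculation; the numbers in it (pre-test probability, sensitivity, specificity) are illustrative assumptions, not figures from the study.

```python
def post_test_probability(pretest: float, sensitivity: float,
                          specificity: float, positive: bool) -> float:
    """Update a disease probability after a test result using Bayes' rule."""
    if positive:
        # P(result | disease) = sensitivity; P(result | no disease) = 1 - specificity
        p_result_given_disease = sensitivity
        p_result_given_healthy = 1 - specificity
    else:
        # P(result | disease) = 1 - sensitivity; P(result | no disease) = specificity
        p_result_given_disease = 1 - sensitivity
        p_result_given_healthy = specificity
    numerator = p_result_given_disease * pretest
    denominator = numerator + p_result_given_healthy * (1 - pretest)
    return numerator / denominator

# Example: suspected pneumonia with an assumed 30% pre-test probability
# and a chest scan assumed to be 85% sensitive and 90% specific.
print(round(post_test_probability(0.30, 0.85, 0.90, positive=True), 3))   # → 0.785
print(round(post_test_probability(0.30, 0.85, 0.90, positive=False), 3))  # → 0.067
```

Note how sharply the negative result drops the probability (from 30% to about 7%) -- the point where, per the study, clinicians tend to overestimate residual risk.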

Of the five cases, the chatbot was more accurate than the human practitioners for two, similarly accurate for another two, and less accurate for one. The researchers considered this a "draw" when comparing humans to the chatbot for medical diagnoses.

But ChatGPT-4 excelled when a patient's tests came back negative (rather than positive), becoming more accurate at diagnosis than the doctors in all five cases.

“Humans sometimes feel the risk is higher than it is after a negative test result, which can lead to over-treatment, more tests and too many medications,” Rodman pointed out. He's an internal medicine physician and investigator in the department of medicine at Beth Israel.

The study was published Dec. 11 in JAMA Network Open.

It's possible, then, that doctors may someday work in tandem with AI to become even more accurate in patient diagnosis, the researchers said.

Rodman called that prospect "exciting."

"Even if imperfect, their [chatbots'] ease of use and ability to be integrated into clinical workflows could theoretically make humans make better decisions,” he said. “Future research into collective human and artificial intelligence is sorely needed.”

Sources

  • Beth Israel Deaconess Medical Center, news release, Dec. 11, 2023
