
Artificial intelligence and medical decisions: one Pittsburgh doctor warns, "ChatGPT can't listen to your heart."

Going to the internet when you have an ailment has become commonplace; Google it or go to WebMD to see what's really happening to you. 

Now, artificial intelligence is taking that practice to a new level, and it comes with potential dangers. 

We can all agree that your health is not something to leave to chance, and with that in mind, remember that tools like ChatGPT are artificial intelligence, not a doctor.

These days, though, doctors are not surprised to hear patients consult the internet for their ailments. 

"You know, [they're] a little bit anxious or nervous about their symptoms, and it's very quick and easy to do, so we all do it," said AHN Medical Director of Information Technology Dr. James Solava. "Should you trust what ChatGPT comes up with? No, never, you always need to verify that information." 

Dr. Solava said that with good reason, explaining that ChatGPT and other large language models often hallucinate: they generate content that is not always correct. That becomes especially dangerous if you're having symptoms such as chest pain, shortness of breath, slurred speech, or other signs of a stroke.

"It could be life or death in medicine," he said. "Time is critical, you're on the clock, you have to get to a health care provider to take care of that within a certain amount of time to have the best outcomes. You don't want to really be messing around, trying to look that kind of thing up in ChatGPT, Google, or WebMD." 

Dr. Solava said that not only might the answer be wrong, but ChatGPT also aims to please and will tell you what you want to hear.

"ChatGPT can't listen to your heart and ChatGPT can't listen to your lungs or feel your abdomen to see what's really going on with you," he said. 

Being in the room with a physician is something that can't be replicated by AI. 

"It doesn't have the ability to ask follow-up questions, and might not have the ability to ask the right follow-up questions, and that's really where the art of medicine and those years of experience really pay off," Dr. Solava said. 

The bottom line: use it for minor things, but don't trust it to guide your self-treatment, because it might be wrong.

When Dr. Solava says it's a matter of life and death, he isn't kidding, nor is he being dramatic. 
