New Delhi: People rated medical advice as “less trustworthy and empathetic” when they believed it was provided by artificial intelligence chatbots, according to a new study published in the journal Nature Medicine. The researchers found that this was also true when patients or those seeking medical advice were led to believe that a doctor had accepted help from artificial intelligence (AI) before sharing their guidance.
As a result, people were also found to be less willing to follow AI recommendations, compared to advice from human doctors based solely on their medical expertise, according to the study, led by the University of Würzburg in Germany.
The results suggested that people rely less on medical advice if they suspect AI involvement, even as people around the world increasingly turn to ChatGPT for health-related information, the researchers said.
Given AI’s recognized potential to reduce bureaucracy and the daily workload of physicians, the authors argued that the findings are important because “confidence in medical diagnoses and therapy recommendations” is essential to the success of treatment.
The study scenario was based on a digital health platform from which information on medical issues can be obtained, the researchers said.
For the study, more than 2,000 participants received identical medical advice and were asked to rate its trustworthiness, understandability, and empathy.
Participants were divided into three groups. One group was told that the advice came from a doctor, the second that it came from an AI chatbot, and the third was led to believe that a doctor had provided the medical advice with the help of an AI.
The researchers also found that advice attributed to human doctors scored higher on empathy than advice attributed to AI.
The study represented a starting point for detailed research into the conditions under which AI can be used in diagnosis and therapy without jeopardizing patient trust and cooperation, the authors said.