
ChatGPT for Advice on Common GI Endoscopic Procedures: the Promise and the Peril

iGIE (2023)

Abstract
Background and Aims: Artificial intelligence (AI) chatbots may be used by patients to obtain medical information. However, no studies have examined whether such information contains factual inaccuracies or limitations. Our study aimed to determine whether ChatGPT, an AI chatbot, can provide correct responses to patient questions on common endoscopic procedures.

Methods: Study team members posed standard questions on EGD, colonoscopy, ERCP, and EUS to ChatGPT. The responses were recorded and systematically appraised for factual correctness and potential safety issues.

Results: ChatGPT provided clear, plain-English advice on a range of common questions related to endoscopy and generally accurate information on periprocedural care. EGD and colonoscopy were accurately described, with indications, alternatives, risks, and follow-up covered correctly apart from minor errors. However, ChatGPT gave multiple factually incorrect responses on the indications, alternatives, and risks of ERCP. We postulate that these errors may stem from underlying AI biases, such as representational, learning, and historic bias.

Conclusions: ChatGPT provided generally correct and safe advice for EGD and colonoscopy but made major factual errors when providing advice on ERCP. Use of ChatGPT and other AI chatbots for patient counseling must take into account errors that can arise from AI biases.
Keywords
AI