Multi-Label Classification in Patient-Doctor Dialogues with the RoBERTa-WWM-ext + CNN (robustly Optimized Bidirectional Encoder Representations from Transformers Pretraining Approach with Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study.

JMIR MEDICAL INFORMATICS (2022)

Citations: 0 | Views: 10
Keywords
online consultation, named entity, automatic classification, ERNIE, Enhanced Representation through Knowledge Integration, BERT, Bidirectional Encoder Representations from Transformers, machine learning, neural network, model, China, Chinese, classification, patient-physician dialogue, patient doctor dialogue, semantics, natural language processing
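
The title and keywords describe a RoBERTa-WWM-ext encoder combined with a CNN head for multi-label classification of entities in Chinese patient-doctor dialogues. As a rough illustration only, not the authors' implementation, the sketch below assumes PyTorch, the HuggingFace transformers library, and the hfl/chinese-roberta-wwm-ext checkpoint; the label count, kernel sizes, and filter count are illustrative placeholders.

```python
# Minimal sketch of a RoBERTa-WWM-ext + CNN multi-label classifier.
# NOT the paper's code; model name, label count, and CNN hyperparameters
# are assumptions for illustration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class RobertaWWMCNNClassifier(nn.Module):
    def __init__(self, num_labels, kernel_sizes=(2, 3, 4), num_filters=128):
        super().__init__()
        # Pretrained Chinese whole-word-masking RoBERTa encoder.
        self.encoder = AutoModel.from_pretrained("hfl/chinese-roberta-wwm-ext")
        hidden = self.encoder.config.hidden_size
        # 1-D convolutions over the token dimension with several kernel widths.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes]
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_labels)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, hidden) contextual token embeddings.
        tokens = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        x = tokens.transpose(1, 2)  # (batch, hidden, seq_len) for Conv1d
        # Convolve, apply ReLU, then max-pool over time for each kernel size.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        # One logit per label; use sigmoid + BCEWithLogitsLoss for multi-label training.
        return self.classifier(torch.cat(pooled, dim=1))

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = RobertaWWMCNNClassifier(num_labels=10)  # placeholder label count
batch = tokenizer(["患者:最近头疼,需要做什么检查?"], return_tensors="pt", padding=True)
probs = torch.sigmoid(model(batch["input_ids"], batch["attention_mask"]))
```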