Work-in-Progress: Computing Sentence Similarity for Short Texts using Transformer models

PROCEEDINGS OF THE 2022 IEEE GLOBAL ENGINEERING EDUCATION CONFERENCE (EDUCON 2022)

Abstract
The field of natural language processing is being revolutionized by transformers. These models are built on a neural network architecture that comes pre-trained, so large task-specific training datasets are no longer required. This property makes transformers well suited to automated assessment systems (AAS), which otherwise need large amounts of labeled data: the larger the dataset, the higher the accuracy of the AAS. In this work-in-progress paper, a prototype AAS has been built using two transformer models, namely the Sentence-Transformers from Hugging Face and the OpenAI GPT-3 models. The transformer models generate a similarity index between students' answers and reference answers from the Texas dataset; this index is then used to compute students' marks. The performance of the prototype is evaluated using the quadratic weighted kappa metric.
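The pipeline described in the abstract can be sketched in a few lines: embed the student and reference answers, score their cosine similarity, scale the similarity to a mark, and evaluate predicted marks against human marks with quadratic weighted kappa. The sketch below, assuming toy embedding vectors and a hypothetical linear similarity-to-mark scheme (the paper does not specify its mapping), illustrates each step; in practice the embeddings would come from a Sentence-Transformers model (e.g. `SentenceTransformer(...).encode(sentences)`).

```python
import math


def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors.

    In the actual prototype these vectors would be sentence embeddings
    produced by a Sentence-Transformers model; toy vectors are used here.
    """
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


def similarity_to_mark(similarity, max_mark=5):
    """Map a similarity score to an integer mark.

    Hypothetical linear scheme (not from the paper): clamp negative
    similarities to 0, then scale to the maximum mark and round.
    """
    return round(max(0.0, similarity) * max_mark)


def quadratic_weighted_kappa(rater_a, rater_b, n_classes):
    """Quadratic weighted kappa between two integer rating sequences.

    Standard definition: 1 - (weighted observed disagreement) /
    (weighted disagreement expected by chance), with quadratic weights
    w_ij = (i - j)^2 / (n_classes - 1)^2.
    """
    n = len(rater_a)
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for a, b in zip(rater_a, rater_b):
        observed[a][b] += 1
    # Marginal histograms for the chance-agreement matrix.
    hist_a = [sum(row) for row in observed]
    hist_b = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = 0.0
    den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)
            expected = hist_a[i] * hist_b[j] / n
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den
```

Identical rating sequences give a kappa of 1.0, while chance-level agreement gives 0; scikit-learn's `cohen_kappa_score(..., weights="quadratic")` computes the same metric.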
Keywords
Sentence Similarity, Automated Assessment System, Transformers