VerAs: Verify then Assess STEM Lab Reports
CoRR (2024)
Abstract
With an increasing focus in STEM education on critical thinking skills,
science writing plays an ever more important role in curricula that stress
inquiry skills. A recently published dataset of two sets of college level lab
reports from an inquiry-based physics curriculum relies on analytic assessment
rubrics that utilize multiple dimensions, specifying subject matter knowledge
and general components of good explanations. Each analytic dimension is
assessed on a 6-point scale, to provide detailed feedback to students that can
help them improve their science writing skills. Manual assessment can be slow
and difficult to calibrate for consistency across all students in large
classes. While much work exists on automated assessment of open-ended questions
in STEM subjects, there has been far less work on long-form writing such as lab
reports. We present an end-to-end neural architecture that has separate
verifier and assessment modules, inspired by approaches to Open Domain Question
Answering (OpenQA). VerAs first verifies whether a report contains any content
relevant to a given rubric dimension, and if so, assesses the relevant
sentences. On the lab reports, VerAs outperforms multiple baselines based on
OpenQA systems or Automated Essay Scoring (AES). VerAs also performs well on an
analytic rubric for middle school physics essays.
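The verify-then-assess pipeline the abstract describes can be sketched as follows. This is a minimal illustrative stub, not the paper's method: VerAs uses neural verifier and assessor modules, whereas the keyword-overlap verifier, the length-based assessor, and all function names here are hypothetical stand-ins.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DimensionScore:
    dimension: str
    relevant_sentences: List[str]
    score: Optional[int]  # 1-6 on the analytic rubric; None if no relevant content

def verify(sentence: str, dimension_keywords: set) -> bool:
    """Verifier stub: flag a sentence as relevant if it shares any keyword
    with the rubric dimension (stand-in for a neural relevance model)."""
    return bool(set(sentence.lower().split()) & dimension_keywords)

def assess(sentences: List[str]) -> int:
    """Assessor stub: map the amount of relevant content to a 1-6 score
    (stand-in for a learned scoring module)."""
    return min(6, max(1, len(sentences)))

def ver_as(report: List[str], dimension: str, keywords: set) -> DimensionScore:
    # First verify: keep only sentences relevant to this rubric dimension.
    relevant = [s for s in report if verify(s, keywords)]
    # Then assess: score the relevant sentences, or skip if none were found.
    score = assess(relevant) if relevant else None
    return DimensionScore(dimension, relevant, score)
```

The key design point mirrored here is the gating step: the assessor only ever sees sentences the verifier passed through, so a report with no content for a dimension is not forced into a score.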