
Comparison of Moderated and Unmoderated Remote Usability Sessions for Web-Based Simulation Software: A Randomized Controlled Trial.

Pedram Khayyat Khoshnevis, Savanah Tillberg, Eric Latimer, Tim Aubry, Andrew Fisher, Vijay Mago

International Conference on Human-Computer Interaction (HCI International), 2022

Abstract
Usability studies are a crucial part of developing user-centered designs, and they can be conducted using a variety of methods. Unmoderated usability surveys are more efficient and cost-effective than moderated surveys and lend themselves better to larger participant pools. However, unmoderated surveys can yield more unreliable data due to participants' careless responding (CR). In this study, we compared remote moderated and remote unmoderated usability testing sessions for a web-based simulation and modeling software. The usability study was conducted with 72 participants who were randomly assigned to moderated and unmoderated groups. Our results show that moderated sessions produced more reliable data in most of the tested outcomes, and that data from unmoderated sessions required optimization to filter out unreliable responses. We discuss methods to isolate unreliable data and recommend ways of managing it.
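The abstract mentions filtering out unreliable survey data caused by careless responding. The paper's exact filtering procedure is not given here; one widely used screen in the CR literature is the "longstring" index (the longest run of identical consecutive answers, which flags straight-lining). The sketch below illustrates that idea only; the threshold and the `flag_careless` helper are illustrative assumptions, not the authors' method.

```python
# Illustrative careless-responding (CR) screen using the "longstring" index:
# the length of the longest run of identical consecutive answers.
# Threshold and helper names are assumptions, not the paper's procedure.

def longstring(responses):
    """Return the length of the longest run of identical consecutive answers."""
    longest = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def flag_careless(survey, max_run=8):
    """Flag participant IDs whose longstring exceeds an assumed threshold."""
    return [pid for pid, answers in survey.items() if longstring(answers) > max_run]

survey = {
    "p01": [3, 4, 2, 5, 3, 4, 2, 1, 3, 4],  # varied answers: retained
    "p02": [4] * 10,                        # straight-lining: flagged
}
print(flag_careless(survey))  # → ['p02']
```

In practice such a screen would be combined with other indicators (e.g., completion time or attention-check items) before discarding any participant's data.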
Keywords
Usability evaluation methods, Careless responding, Insufficient effort responding, Online survey, Survey design