CrowdAE: A Crowdsourcing System With Human Inspection Quality Enhancement For Web Accessibility Evaluation
COMPUTERS HELPING PEOPLE WITH SPECIAL NEEDS, PT I (2018)
Abstract
Crowdsourcing technology can support manual web accessibility testing by soliciting contributions from volunteer evaluators, but crowd evaluators may return inaccurate or invalid evaluation results. This paper proposes CrowdAE, an advanced crowdsourcing-based web accessibility evaluation system that enhances the crowdsourced manual-testing module of its previous version. Through three main processes, namely a learning system, task assignment, and task review, the system improves the quality of the evaluation results obtained from the crowd. A comparison across two years of evaluations of Chinese government websites shows that CrowdAE outperforms the previous version and improves the accuracy of the evaluation results.
Keywords
Web accessibility evaluation, Crowdsourcing