
Towards a Data-Driven Method for RGB Video-Based Hand Action Quality Assessment in Real Time

Tianyu Wang, Minhao Jin, Jingying Wang, Yijie Wang, Mian Li

SAC '20: The 35th ACM/SIGAPP Symposium on Applied Computing, Brno, Czech Republic, March 2020

Abstract
In recent years, the research community has begun to explore Video-Based Action Quality Assessment on Human Body (VB-AQA), while little work has focused on Video-Based Action Quality Assessment on Human Hand (VH-AQA). Current work on VB-AQA fails to deal with the inconsistency between captured features and reality caused by changing camera angles, leaving a large gap between VB-AQA and VH-AQA, and computational efficiency is another critical problem. In this paper, a novel data-driven method for real-time VH-AQA is proposed. Features are formulated as spatio-temporal hand poses and extracted via four steps: hand segmentation, 2D hand pose estimation, 3D hand pose estimation and hand pose organization. Based on the extracted features, an assessment model is applied to evaluate the performance of actions and indicate the most promising adjustment as feedback. We demonstrate the evaluation accuracy and computational efficiency of our method using our own Origami Video Dataset. For the latter, two new metrics are designed. The results show that our method provides opportunities for real-time digital reconstruction of physical-world activities and timely assessment.
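The abstract only names the pipeline stages, so the following is a minimal, hypothetical sketch of how such a four-step feature-extraction pipeline could feed an assessment model. All class names, function names, the joint count, and the scoring rule are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch: hand segmentation -> 2D pose -> 3D pose -> pose
# organization -> assessment, as outlined in the abstract. Step bodies are
# placeholders only.
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

NUM_JOINTS = 21  # a common hand-keypoint count; assumed, not taken from the paper


@dataclass
class HandPoseSequence:
    """Spatio-temporal features: one (NUM_JOINTS, 3) joint array per frame."""
    joints_3d: List[np.ndarray]


def segment_hand(frame: np.ndarray) -> np.ndarray:
    # Step 1: hand segmentation. Placeholder: return the frame unchanged.
    return frame


def estimate_2d_pose(hand_region: np.ndarray) -> np.ndarray:
    # Step 2: 2D hand pose estimation. Placeholder: zero keypoints.
    return np.zeros((NUM_JOINTS, 2))


def lift_to_3d(joints_2d: np.ndarray) -> np.ndarray:
    # Step 3: 3D hand pose estimation. Placeholder: append a zero depth column.
    return np.concatenate([joints_2d, np.zeros((NUM_JOINTS, 1))], axis=1)


def organize_poses(poses: List[np.ndarray]) -> HandPoseSequence:
    # Step 4: hand pose organization into a spatio-temporal feature sequence.
    return HandPoseSequence(joints_3d=poses)


def extract_features(frames: List[np.ndarray]) -> HandPoseSequence:
    return organize_poses(
        [lift_to_3d(estimate_2d_pose(segment_hand(f))) for f in frames]
    )


def assess(performed: HandPoseSequence, reference: HandPoseSequence) -> Tuple[float, int]:
    """Score the performed action against a reference sequence and point to the
    frame with the largest joint deviation as the suggested adjustment."""
    errors = [np.linalg.norm(p - r, axis=-1).mean()
              for p, r in zip(performed.joints_3d, reference.joints_3d)]
    score = float(np.exp(-np.mean(errors)))  # maps mean joint error to (0, 1]
    return score, int(np.argmax(errors))


if __name__ == "__main__":
    frames = [np.zeros((128, 128, 3), dtype=np.uint8) for _ in range(10)]
    feats = extract_features(frames)
    print(assess(feats, feats))  # identical sequences -> score 1.0, frame 0
```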
Keywords
Video-Based Action Quality Assessment on Human Hand, hand pose organization, real time, Origami dataset, data-driven