Visual-Tactile Robot Grasping Based on Human Skill Learning From Demonstrations Using a Wearable Parallel Hand Exoskeleton

IEEE Robotics and Automation Letters (2023)

Abstract
Soft fingers and strategic grasping skills enable human hands to grasp objects stably. This letter models human grasping skills and transfers the learned skills to robots to improve grasping quality and success rate. First, we designed a wearable tool-like parallel hand exoskeleton equipped with optical tactile sensors to acquire multimodal information, including hand positions and postures, the relative distance of the exoskeleton claws, and tactile images. From the demonstration data, we summarized three characteristics of human demonstrations: varying-speed actions, grasping effect read from tactile images, and grasping strategies for different positions. These characteristics were then utilized in the robot skill modeling to achieve a more human-like grasp. Since no force sensors are fixed to the claws, we introduced a new variable, called "grasp depth", to represent the grasping effect on the object. The robot grasping strategy is constructed as follows. First, grasp quality is predicted using a linear array network (LAN) with global visual images as inputs; conditions such as grasp width, depth, position, and angle are also predicted. Second, with the grasp width and depth of the object determined, dynamic movement primitives (DMPs) are employed to mimic human grasp actions with varying velocities. To further enhance grasp quality, a final action adjustment based on tactile detection is performed near grasp time. The proposed strategy was validated through experiments conducted with a Franka robot fitted with a self-designed gripper. The results demonstrate that the robot grasping tests increased the grasping success rate from 82% to 96%, compared with the results obtained by the pure-LAN and constant-grasp-depth baselines.
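As background for the DMP component mentioned in the abstract, the sketch below shows a minimal single-DOF discrete dynamic movement primitive. It is an illustrative implementation of the standard DMP formulation, not the authors' code; all gains (`alpha`, `beta`, `alpha_x`) and the basis-function layout are conventional choices assumed here. Scaling the temporal parameter `tau` replays the same motion shape at different speeds, which is how DMPs can reproduce the varying-velocity grasp actions learned from demonstrations.

```python
import numpy as np

def dmp_rollout(y0, goal, weights, tau=1.0, alpha=25.0, beta=6.25,
                alpha_x=3.0, dt=0.01, T=1.0):
    """Roll out a single-DOF discrete DMP (illustrative sketch).

    Transformation system: tau*dv = alpha*(beta*(goal - y) - v) + f(x)
    Canonical system:      tau*dx = -alpha_x * x
    The forcing term f(x) is a weighted mix of Gaussian basis functions
    of the phase x; with zero weights the system simply converges to goal.
    """
    n_basis = len(weights)
    c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))  # basis centers in phase space
    h = n_basis / c                                    # basis widths (heuristic)
    x, y, v = 1.0, float(y0), 0.0
    traj = [y]
    for _ in range(int(T / dt)):
        psi = np.exp(-h * (x - c) ** 2)                # basis activations
        f = (psi @ weights) / (psi.sum() + 1e-10) * x * (goal - y0)
        v += dt / tau * (alpha * (beta * (goal - y) - v) + f)
        y += dt / tau * v
        x += dt / tau * (-alpha_x * x)                 # phase decays toward 0
        traj.append(y)
    return np.array(traj)
```

For example, `dmp_rollout(0.0, 1.0, np.zeros(10))` converges to the goal within one time unit, while the same call with `tau=2.0` traces the same path at half the speed.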
Keywords
Index Terms: Force and tactile sensing, learning from demonstration, exoskeleton, data-driven human modeling, robot grasping