NeRF2Real: Sim2real Transfer of Vision-guided Bipedal Motion Skills using Neural Radiance Fields

CoRR (2023)

Abstract
We present a system for applying sim2real approaches to "in the wild" scenes with realistic visuals, and to policies which rely on active perception using RGB cameras. Given a short video of a static scene collected using a generic phone, we learn the scene's contact geometry and a function for novel view synthesis using a Neural Radiance Field (NeRF). We augment the NeRF rendering of the static scene by overlaying the rendering of other dynamic objects (e.g. the robot's own body, a ball). A simulation is then created using the rendering engine in a physics simulator which computes contact dynamics from the static scene geometry (estimated from the NeRF volume density) and the dynamic objects' geometry and physical properties (assumed known). We demonstrate that we can use this simulation to learn vision-based whole body navigation and ball pushing policies for a 20 degree-of-freedom humanoid robot with an actuated head-mounted RGB camera, and we successfully transfer these policies to a real robot.
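The abstract describes a two-part pipeline: contact geometry for the physics simulator is estimated from the NeRF volume density, and camera observations are produced by overlaying simulator-rendered dynamic objects (the robot's body, a ball) on the NeRF rendering of the static scene. The sketch below illustrates one plausible way to implement those two steps; it is not the authors' code, and all names (density_grid, sigma_threshold, nerf_rgb, sim_depth, etc.) and the threshold value are assumptions for illustration.

```python
# Minimal sketch (assumed implementation, not the paper's code):
# (1) extract a static collision mesh from a NeRF density grid,
# (2) composite NeRF renders with simulator-rendered dynamic objects by depth.
import numpy as np
from skimage import measure  # marching cubes for isosurface extraction


def extract_collision_mesh(density_grid: np.ndarray,
                           voxel_size: float,
                           sigma_threshold: float = 25.0):
    """Turn NeRF volume density, sampled on a regular grid, into a triangle mesh.

    density_grid: (X, Y, Z) array of sigma values queried from the trained NeRF.
    sigma_threshold: assumed density level treated as the solid/free-space boundary.
    Returns vertices (in metres) and faces, usable as static collision geometry
    in a physics simulator.
    """
    verts, faces, _, _ = measure.marching_cubes(density_grid, level=sigma_threshold)
    return verts * voxel_size, faces


def composite(nerf_rgb, nerf_depth, sim_rgb, sim_depth, sim_mask):
    """Overlay simulator-rendered dynamic objects on the NeRF static-scene render.

    All inputs are (H, W[, 3]) arrays rendered from the same camera pose.
    sim_mask: boolean mask of pixels covered by dynamic objects.
    Per-pixel depth decides whether a dynamic object occludes the static scene.
    """
    dynamic_in_front = sim_mask & (sim_depth < nerf_depth)
    out = nerf_rgb.copy()
    out[dynamic_in_front] = sim_rgb[dynamic_in_front]
    return out
```

In training, the composited image would serve as the RGB observation fed to the vision-based policy, while the extracted mesh supplies the contact dynamics for the static scene.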
Keywords
active perception, actuated head-mounted RGB camera, contact dynamics, dynamic objects, generic phone, NeRF rendering, NeRF volume density, NeRF2real, Neural Radiance Field, Neural Radiance fields, physics simulator, realistic visuals, rendering engine, RGB cameras, short video, sim2real approaches, sim2real transfer, static scene geometry, view synthesis, vision-based whole body navigation, vision-guided bipedal motion skills, wild scenes