Physical Priors Augmented Event-Based 3D Reconstruction
CoRR (2024)
Abstract
3D neural implicit representations play a significant role in many
robotic applications. However, reconstructing neural radiance fields (NeRF)
from realistic event data remains a challenge due to the sparsity and
lack of information when only event streams are available. In this paper, we
utilize the motion, geometry, and density priors behind event data to impose
strong physical constraints that augment NeRF training. The proposed novel
pipeline can directly benefit from those priors to reconstruct 3D scenes
without additional inputs. Moreover, we present a novel density-guided
patch-based sampling strategy for robust and efficient learning, which not
only accelerates the training procedure but also improves the expression of
local geometry. More importantly, we establish the first large dataset for
event-based 3D reconstruction, which contains 101 objects with various
materials and geometries, along with ground-truth images and depth maps for
all camera viewpoints, which significantly facilitates other research in
related fields. The code and dataset will be publicly available at
https://github.com/Mercerai/PAEv3d.
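The density-guided patch-based sampling strategy is only described at a high level in the abstract. As an illustrative sketch (the function name, patch size, and NumPy-based formulation below are my assumptions, not the paper's implementation), one way to realize such a scheme is to draw patch centers with probability proportional to a per-pixel event-density prior, then train on the resulting contiguous pixel patches so that local geometry is supervised jointly:

```python
import numpy as np

def sample_patches(density_map, num_patches=4, patch_size=8, rng=None):
    """Sample pixel patches with probability proportional to a density prior.

    density_map: (H, W) non-negative weights, e.g. accumulated event counts
                 per pixel (a hypothetical stand-in for the paper's prior).
    Returns (num_patches, patch_size, patch_size, 2) integer pixel coords.
    """
    rng = np.random.default_rng() if rng is None else rng
    H, W = density_map.shape
    half = patch_size // 2
    # Restrict candidate centers so every patch lies fully inside the image.
    valid = density_map[half:H - half, half:W - half]
    probs = valid.flatten() + 1e-8          # avoid all-zero distributions
    probs = probs / probs.sum()
    idx = rng.choice(probs.size, size=num_patches, p=probs)
    cy, cx = np.unravel_index(idx, valid.shape)
    cy, cx = cy + half, cx + half           # shift back to image coords
    # Build a (patch_size, patch_size) offset grid around each center.
    dy, dx = np.meshgrid(np.arange(-half, half), np.arange(-half, half),
                         indexing="ij")
    ys = cy[:, None, None] + dy             # (N, k, k)
    xs = cx[:, None, None] + dx
    return np.stack([ys, xs], axis=-1)      # (N, k, k, 2)
```

In a NeRF training loop, the returned coordinates would index the rays to render for one step; sampling whole patches rather than scattered pixels is what allows patch-level losses on local geometry.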