360-GS: Layout-guided Panoramic Gaussian Splatting For Indoor Roaming
CoRR (2024)
Abstract
3D Gaussian Splatting (3D-GS) has recently attracted great attention for its
real-time, photo-realistic rendering. This technique typically takes
perspective images as input and optimizes a set of 3D elliptical Gaussians by
splatting them onto the image planes, resulting in 2D Gaussians. However,
applying 3D-GS to panoramic inputs presents challenges in effectively modeling
the projection onto the spherical surface of 360° images using 2D
Gaussians. In practical applications, input panoramas are often sparse, leading
to unreliable initialization of 3D Gaussians and subsequent degradation of
3D-GS quality. In addition, due to the under-constrained geometry of
texture-less planes (e.g., walls and floors), 3D-GS struggles to model these
flat regions with elliptical Gaussians, resulting in significant floaters in
novel views. To address these issues, we propose 360-GS, a novel 360°
Gaussian splatting method for a limited set of panoramic inputs. Instead of
splatting 3D Gaussians directly onto the spherical surface, 360-GS projects
them onto the tangent plane of the unit sphere and then maps them to the
spherical projections. This adaptation enables the representation of the
projection using Gaussians. We guide the optimization of 360-GS by exploiting
layout priors within panoramas, which are simple to obtain and contain strong
structural information about the indoor scene. Our experimental results
demonstrate that 360-GS enables panoramic rendering and outperforms
state-of-the-art methods with fewer artifacts in novel view synthesis, thus
providing immersive roaming in indoor scenarios.
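The tangent-plane projection described in the abstract corresponds to the classical gnomonic projection: a 3D point is first normalized to a direction on the unit sphere, then mapped to the plane tangent to the sphere at a chosen viewing direction. The sketch below is illustrative only (it is not the paper's implementation and projects points rather than full Gaussians); the function name and basis construction are our own assumptions.

```python
import numpy as np

def gnomonic_project(points, n):
    """Illustrative gnomonic projection: map 3D points (N, 3) to 2D
    coordinates on the plane tangent to the unit sphere at direction n.
    Note: a hypothetical helper, not 360-GS's actual projection of Gaussians.
    """
    n = n / np.linalg.norm(n)
    # Build an orthonormal basis (e1, e2) spanning the tangent plane at n.
    helper = np.array([0.0, 0.0, 1.0]) if abs(n[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    e1 = np.cross(n, helper)
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(n, e1)
    # Normalize points to directions on the unit sphere.
    d = points / np.linalg.norm(points, axis=-1, keepdims=True)
    # Intersect each ray with the tangent plane {x : n . x = 1}.
    t = d @ n                      # cosine between each direction and n
    q = d / t[:, None]             # intersection points on the plane
    # Express intersections in the local 2D tangent-plane frame.
    return np.stack([q @ e1, q @ e2], axis=-1)
```

A point lying along the tangent direction maps to the plane's origin, and a point 45° off-axis lands at radius tan(45°) = 1; mapping these planar Gaussians back to equirectangular coordinates then yields the spherical projection the abstract refers to.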