3D Scene Creation and Rendering Via Rough Meshes: A Lighting Transfer Avenue

IEEE Transactions on Pattern Analysis and Machine Intelligence (2024)

Abstract
This paper studies how to flexibly integrate reconstructed 3D models into practical 3D modeling pipelines such as 3D scene creation and rendering. Due to the technical difficulty, one can only obtain rough 3D models (R3DMs) for most real objects using existing 3D reconstruction techniques. As a result, physically-based rendering (PBR) would render low-quality images or videos for scenes that are constructed by R3DMs. One promising solution would be representing real-world objects as Neural Fields such as NeRFs, which are able to generate photo-realistic renderings of an object under desired viewpoints. However, a drawback is that the synthesized views through Neural Fields Rendering (NFR) cannot reflect the simulated lighting details on R3DMs in PBR pipelines, especially when object interactions in the 3D scene creation cause local shadows. To solve this dilemma, we propose a lighting transfer network (LighTNet) to bridge NFR and PBR, such that they can benefit from each other. LighTNet reasons about a simplified image composition model, remedies the uneven surface issue caused by R3DMs, and is empowered by several perceptual-motivated constraints and a new Lab angle loss which enhances the contrast between lighting strength and colors. Comparisons demonstrate that LighTNet is superior in synthesizing impressive lighting, and is promising in pushing NFR further in practical 3D modeling workflows.
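The abstract does not define the Lab angle loss beyond naming it, so the following is purely an illustrative sketch of one plausible formulation, assuming the loss measures the per-pixel angle between prediction and target treated as 3-vectors in CIELAB space (so that lighting strength, carried by vector magnitude, is decoupled from color direction). The function name and interface are hypothetical, not from the paper.

```python
import numpy as np

def lab_angle_loss(pred_lab, target_lab, eps=1e-8):
    # Hypothetical sketch: mean angle (radians) between per-pixel
    # (L, a, b) vectors of the predicted and target images.
    # Inputs: arrays of shape (..., 3) already in CIELAB space.
    pred = pred_lab.reshape(-1, 3)
    tgt = target_lab.reshape(-1, 3)
    dot = np.sum(pred * tgt, axis=1)
    norms = np.linalg.norm(pred, axis=1) * np.linalg.norm(tgt, axis=1)
    # Clip to the valid arccos domain to guard against rounding error.
    cos_angle = np.clip(dot / (norms + eps), -1.0, 1.0)
    return float(np.mean(np.arccos(cos_angle)))
```

Because the angle ignores vector length, scaling a pixel's Lab vector (a pure lighting-strength change, under this simplification) leaves the loss at zero, while rotating it in Lab space (a color change) is penalized.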
Keywords
3D scene creation, lighting transfer, neural rendering, physically-based rendering, scene synthesis