Reflective Gaussian Splatting


1. School of Data Science, Fudan University
2. University of Surrey




Abstract


Novel view synthesis has advanced significantly owing to increasingly capable NeRF- and 3DGS-based methods. However, reconstructing reflective objects remains challenging: no existing solution achieves real-time, high-quality rendering while also accommodating inter-reflection. To fill this gap, we introduce the Reflective Gaussian Splatting (Ref-Gaussian) framework, characterized by two components: (I) physically based deferred rendering, which empowers the rendering equation with pixel-level material properties via the split-sum approximation; (II) Gaussian-grounded inter-reflection, which, for the first time, realizes inter-reflection within a Gaussian splatting paradigm. To enhance geometry modeling, we further introduce material-aware normal propagation and an initial per-Gaussian shading stage, along with 2D Gaussian primitives. Extensive experiments on standard datasets demonstrate that Ref-Gaussian surpasses existing approaches in quantitative metrics, visual quality, and compute efficiency. Moreover, our method serves as a unified solution for both reflective and non-reflective scenes, going beyond previous alternatives that focus only on reflective scenes. We also illustrate that Ref-Gaussian supports further applications such as relighting and editing.
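To give a feel for the inter-reflection idea described above, the sketch below blends direct environment lighting with inter-reflected radiance using a per-pixel visibility term. This is a simplified illustration under our own assumptions (the function name and the linear blend are hypothetical), not the paper's exact formulation:

```python
import numpy as np

def specular_with_interreflection(env_specular, indirect_specular, visibility):
    # visibility in [0, 1] is obtained by ray-tracing the extracted mesh:
    # 1 where the reflected ray reaches the environment unoccluded,
    # 0 where it hits other scene geometry (so indirect light dominates).
    # All inputs are per-pixel maps of matching (broadcastable) shape.
    return visibility * env_specular + (1.0 - visibility) * indirect_specular
```

For a half-occluded pixel (visibility 0.5), the result is simply the average of the direct and indirect specular contributions.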




Method Overview




Overview of the Ref-Gaussian framework:

Starting from a set of 2D Gaussians equipped with material properties, we first apply the splatting process to produce pixel-level feature maps, and perform ray tracing on the extracted mesh to compute visibility for the specular term of the rendering equation. We then evaluate the rendering equation with the split-sum approximation on these feature maps, yielding the final physically based rendering result.
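The pixel-level deferred shading step can be sketched as follows. This is a minimal stand-in, not the paper's implementation: the BRDF integration terms `a` and `b` use a simple polynomial approximation rather than the usual precomputed lookup table, and all function and variable names are our own:

```python
import numpy as np

def split_sum_specular(f0, roughness, n_dot_v, prefiltered_env):
    # Split-sum approximation: specular = prefiltered environment radiance
    # times an analytic BRDF integration term (F0 * A + B).
    # A and B here are a crude polynomial stand-in for a BRDF LUT (assumption).
    a = 1.0 - roughness * (1.0 - n_dot_v) * 0.5
    b = 0.25 * roughness * n_dot_v
    return prefiltered_env * (f0 * a + b)

def deferred_shade(albedo, metallic, roughness, normal, view_dir, env_radiance):
    # Deferred rendering: every input is an H x W x C map produced by
    # splatting the per-Gaussian attributes, so shading runs once per pixel.
    n_dot_v = np.clip(np.sum(normal * -view_dir, axis=-1, keepdims=True), 0.0, 1.0)
    f0 = 0.04 * (1.0 - metallic) + albedo * metallic   # base reflectance
    diffuse = albedo * (1.0 - metallic)                 # Lambertian term
    specular = split_sum_specular(f0, roughness, n_dot_v, env_radiance)
    return diffuse + specular
```

Shading in image space (rather than per Gaussian) is what lets the material properties act at pixel level, which is the point of the deferred design described above.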




Qualitative comparison



Comparison of novel view synthesis




Qualitative comparison of NVS results on the Shiny Blender and Glossy Synthetic datasets.



Comparison of geometry reconstruction


Qualitative comparison of geometry reconstruction results on the Shiny Blender dataset.



Comparison of environment map estimation


Qualitative comparisons of the estimated environment maps on the Glossy Synthetic dataset.





Inverse rendering results



Inverse rendering results on the Glossy Synthetic dataset.






Inverse rendering results on the Ref-Real dataset


Inverse rendering results on the Ref-Real dataset. Indirect lighting: only indirect light is considered as the specular component when rendering the focused view.





More qualitative results







Acknowledgements: The website template was borrowed from Lior Yariv. Image sliders are based on dics.