
RadSplat: Radiance Field-Informed Gaussian Splatting for Robust Real-Time Rendering with 900+ FPS

Google

RadSplat enables high-quality real-time rendering of complex large-scale scenes at 900+ FPS.

Abstract

Recent advances in view synthesis and real-time rendering have achieved photorealistic quality at impressive rendering speeds. While Radiance Field-based methods achieve state-of-the-art quality in challenging scenarios such as in-the-wild captures and large-scale scenes, they often suffer from the excessively high compute requirements of volumetric rendering. Gaussian Splatting-based methods, on the other hand, rely on rasterization and naturally achieve real-time rendering, but suffer from brittle optimization heuristics that underperform on more challenging scenes. In this work, we present RadSplat, a lightweight method for robust real-time rendering of complex scenes. Our main contributions are threefold. First, we use radiance fields as a prior and supervision signal for optimizing point-based scene representations, leading to improved quality and more robust optimization. Next, we develop a novel pruning technique that reduces the overall point count while maintaining high quality, yielding smaller and more compact scene representations with faster inference speeds. Finally, we propose a novel test-time filtering approach that further accelerates rendering and allows scaling to larger, house-sized scenes. We find that our method enables state-of-the-art synthesis of complex captures at 900+ FPS.
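To make the first contribution concrete, below is a minimal PyTorch-style sketch of one training step in which a frozen, pre-trained NeRF supplies the supervision target for the point-based representation. The function names (render_gaussians, render_nerf) and the plain L1 loss are illustrative assumptions under our reading of the abstract, not the released implementation.

import torch

def nerf_supervised_step(render_gaussians, render_nerf, optimizer, camera):
    # One hypothetical optimization step: a frozen, pre-trained NeRF
    # provides the target image for the point-based representation.
    # `render_gaussians(camera)` is assumed to be a differentiable
    # rasterizer returning an (H, W, 3) tensor whose parameters are
    # managed by `optimizer`; `render_nerf(camera)` renders the same
    # camera from the frozen radiance field.
    with torch.no_grad():
        target = render_nerf(camera)        # stable NeRF supervision signal
    pred = render_gaussians(camera)         # differentiable rasterization
    loss = torch.abs(pred - target).mean()  # simple L1 photometric loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

Because the NeRF target is multi-view consistent, the rasterizer sees a more stable signal than raw in-the-wild photos, which matches the robustness claim above.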

Method Overview


Our method consists of three main steps:

  1. We first optimize a neural radiance field that acts as a robust prior and stable source of supervision.
  2. We use the trained NeRF model to initialize a point-based Gaussian Splatting representation and optimize it with stable NeRF supervision. A novel pruning technique yields more compact scene representations that can be rendered at high FPS.
  3. As a post-processing step, we perform visibility filtering, which further accelerates rendering to 900+ FPS (see the sketch after this list).
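Below is a hedged Python sketch of steps 2 and 3: importance-based pruning and test-time visibility filtering. The score definition (the maximum alpha-blending weight each Gaussian contributes across training views), the threshold value, and the cluster-based visibility masks are illustrative assumptions, not the released implementation.

import torch

def prune_by_importance(params, blend_weights, threshold=0.01):
    # `params` maps attribute names (means, scales, colors, ...) to tensors
    # whose first dimension indexes Gaussians. `blend_weights` is assumed
    # to be a (num_views, num_gaussians) tensor holding the largest
    # alpha-blending weight each Gaussian contributed to any pixel of a view.
    importance = blend_weights.max(dim=0).values  # peak contribution per Gaussian
    keep = importance > threshold                 # drop low-impact points
    return {name: p[keep] for name, p in params.items()}

def visibility_filter(params, visible_masks, cluster_id):
    # Offline, cameras are grouped into clusters and `visible_masks[cluster_id]`
    # stores a boolean mask of the Gaussians visible from that cluster;
    # at test time, only this subset is handed to the rasterizer.
    mask = visible_masks[cluster_id]
    return {name: p[mask] for name, p in params.items()}

Pruning shrinks the stored model once during optimization, while visibility filtering is a cheap per-cluster lookup at render time, which is how the method scales to house-sized scenes.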

Trajectory Comparison on Large-Scale Scenes

Comparison to 3DGS

Our method is more robust on challenging captures than 3D Gaussian Splatting (3DGS) while rendering at higher FPS.

Comparison to ZipNeRF

Our method achieves view synthesis quality similar to ZipNeRF while rendering 3000x faster.

View Comparison on Unbounded Scenes

Comparison to 3DGS

Our NeRF-based initialization and supervision lead to more stable and higher-quality view synthesis compared to 3DGS.


Comparison to ZipNeRF

We achieve better SSIM and LPIPS scores than ZipNeRF on the Mip-NeRF360 benchmark while rendering 3000x faster.


Trajectory Comparison on Unbounded Scenes

BibTeX

@article{niemeyer2024radsplat,
  author    = {Niemeyer, Michael and Manhardt, Fabian and Rakotosaona, Marie-Julie and Oechsle, Michael and Duckworth, Daniel and Gosula, Rama and Tateno, Keisuke and Bates, John and Kaeser, Dominik and Tombari, Federico},
  title     = {RadSplat: Radiance Field-Informed Gaussian Splatting for Robust Real-Time Rendering with 900+ FPS},
  journal   = {arXiv.org},
  year      = {2024},
}

Acknowledgements

We would like to thank Georgios Kopanas, Peter Zhizhin, Peter Hedman, and Jon Barron for fruitful discussions and advice, Cengiz Oztireli for reviewing the draft, and Zhiwen Fan and Kevin Wang for sharing additional baseline results. The results we show above are from the Mip-NeRF360 and ZipNeRF datasets. The website is built on top of the Nerfies template and uses the image slider; the zoom-in video comparison is inspired by Binary Opacity Grids.