NeuRodin: A Two-stage Framework for
High-Fidelity Neural Surface Reconstruction


NeurIPS 2024

Yifan Wang1, Di Huang1, Weicai Ye†1,2, Guofeng Zhang2, Wanli Ouyang1, Tong He†1

† denotes corresponding author

1Shanghai AI Laboratory    2State Key Lab of CAD&CG, Zhejiang University   

Abstract


Signed Distance Function (SDF)-based volume rendering has demonstrated significant capabilities in surface reconstruction. Although promising, SDF-based methods often fail to capture detailed geometric structures, resulting in visible defects. By comparing SDF-based volume rendering to density-based volume rendering, we identify two main factors within the SDF-based approach that degrade surface quality: the SDF-to-density representation and geometric regularization. These factors introduce challenges that hinder the optimization of the SDF field. To address these issues, we introduce NeuRodin, a novel two-stage neural surface reconstruction framework that not only achieves high-fidelity surface reconstruction but also retains the flexible optimization characteristics of density-based methods. NeuRodin incorporates innovative strategies that facilitate the transformation of arbitrary topologies and reduce the artifacts associated with density bias. Extensive evaluations on the Tanks and Temples and ScanNet++ datasets demonstrate the superiority of NeuRodin, showing strong reconstruction capabilities for both indoor and outdoor environments from posed RGB captures alone. All code and models will be made public upon acceptance.
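
To make the two factors above concrete, the sketch below illustrates one common SDF-to-density mapping (the Laplace-CDF parameterization popularized by VolSDF) together with the standard eikonal regularizer. These are illustrative stand-ins under stated assumptions, not NeuRodin's exact formulation; the function names, the sdf_net interface, and the beta parameter are hypothetical.

import torch

def sdf_to_density(sdf: torch.Tensor, beta: float = 0.1) -> torch.Tensor:
    # VolSDF-style mapping sigma(s) = (1/beta) * Psi_beta(-s), where Psi_beta
    # is the CDF of a zero-mean Laplace distribution with scale beta.
    # Smaller beta concentrates density near the zero level set (the surface)
    # but makes the field stiffer to optimize -- the rigidity, relative to a
    # free density field, that the abstract points to.
    decay = torch.exp(-sdf.abs() / beta)  # numerically safe, in (0, 1]
    return (1.0 / beta) * torch.where(sdf >= 0, 0.5 * decay, 1.0 - 0.5 * decay)

def eikonal_loss(sdf_net, points: torch.Tensor) -> torch.Tensor:
    # Geometric regularization: penalize deviations of ||grad f(x)|| from 1
    # so that f remains an approximate signed distance field.
    points = points.detach().requires_grad_(True)
    sdf = sdf_net(points)
    (grad,) = torch.autograd.grad(
        sdf, points, grad_outputs=torch.ones_like(sdf), create_graph=True
    )
    return ((grad.norm(dim=-1) - 1.0) ** 2).mean()

In pipelines of this family, the resulting density is fed to standard volume rendering while the eikonal term is added to the photometric loss with a small weight.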


Results on the Tanks and Temples Training Set





Barn and Meetingroom scenes: side-by-side reconstruction comparisons, Ours vs. Neuralangelo.


Results on the Tanks and Temples Advanced Set


Ballroom and Museum scenes: side-by-side reconstruction comparisons, Ours vs. Neuralangelo.


Results on the ScanNet++ Dataset


Scenes 0e75f3c4d9 and 355e5e32db: side-by-side reconstruction comparisons, Ours vs. Neuralangelo.




Citation


@article{wang2024neurodin,
  title={NeuRodin: A Two-stage Framework for High-Fidelity Neural Surface Reconstruction},
  author={Yifan Wang and Di Huang and Weicai Ye and Guofeng Zhang and Wanli Ouyang and Tong He},
  journal={arXiv preprint},
  year={2024}
}