1 Human-Centered Computing and Extended Reality, Friedrich-Alexander University (FAU) Erlangen-Nürnberg, Erlangen, Germany
2 Institute for Distributed Intelligent Systems, University of the Bundeswehr Munich, Munich, Germany
3 Lehrstuhl für Graphische Datenverarbeitung (LGDV), Friedrich-Alexander-Universität (FAU) Erlangen-Nürnberg, Erlangen, Germany
4 Technical University of Munich, School of Medicine and Health, Klinikum rechts der Isar, Orthopaedics and Sports Orthopaedics, Munich, Germany
Abstract
Novel view synthesis using neural radiance fields (NeRF) is the state-of-the-art technique for generating
high-quality images from novel viewpoints. Existing methods require a priori knowledge of extrinsic and
intrinsic camera parameters. This limits their applicability to synthetic scenes or makes a preprocessing
step necessary for real-world scenes. Current research on the joint optimization of camera parameters and
NeRF focuses on refining noisy extrinsic camera parameters and often relies on preprocessed intrinsic
camera parameters. Other approaches cover only a single set of camera intrinsics. To address these
limitations, we propose a novel end-to-end trainable approach called NeRFtrinsic Four. We use Gaussian
Fourier features to estimate extrinsic camera parameters and dynamically predict varying intrinsic camera
parameters through supervision of the projection error. Our approach outperforms existing joint optimization
methods on LLFF and BLEFF. In addition to these existing datasets, we introduce a new dataset called iFF
with varying intrinsic camera parameters. NeRFtrinsic Four is a step forward in jointly optimizing
NeRF-based view synthesis and enables more realistic and flexible rendering of real-world scenes with
varying camera parameters.
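The Gaussian Fourier feature mapping mentioned in the abstract follows the general technique of Tancik et al. (2020): inputs are projected through a fixed random Gaussian matrix before sine/cosine encoding, which helps an MLP fit high-frequency signals. Below is a minimal sketch of that mapping in PyTorch; the class name, dimensions, and sigma value are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
import math
import torch

class GaussianFourierFeatures(torch.nn.Module):
    """Sketch of a Gaussian Fourier feature mapping (Tancik et al., 2020).

    Hyperparameters here (in_dim, mapping_size, sigma) are illustrative,
    not taken from the NeRFtrinsic Four paper.
    """

    def __init__(self, in_dim: int = 3, mapping_size: int = 256, sigma: float = 10.0):
        super().__init__()
        # Fixed random projection matrix B with entries ~ N(0, sigma^2);
        # registered as a buffer so it is saved with the model but not trained.
        self.register_buffer("B", torch.randn(in_dim, mapping_size) * sigma)

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        # v: (..., in_dim) -> (..., 2 * mapping_size)
        proj = 2.0 * math.pi * v @ self.B
        return torch.cat([torch.sin(proj), torch.cos(proj)], dim=-1)

if __name__ == "__main__":
    ff = GaussianFourierFeatures()
    x = torch.rand(1024, 3)   # batch of 3D inputs
    print(ff(x).shape)        # torch.Size([1024, 512])
```

The encoded features would then be fed to an MLP; how the paper wires this into its extrinsic parameter estimation is not reproduced here.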
@misc{schieber2023nerftrinsic,
  title={NeRFtrinsic Four: An End-To-End Trainable NeRF Jointly Optimizing Diverse Intrinsic and Extrinsic Camera Parameters},
  author={Hannah Schieber and Fabian Deuser and Bernhard Egger and Norbert Oswald and Daniel Roth},
  year={2023},
  eprint={2303.09412},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}