ThermoNeRF: A multimodal Neural Radiance Field for joint RGB-thermal novel view synthesis of building facades
Thermal scene reconstruction holds great potential for various applications, such as building energy analysis and non-destructive infrastructure testing. However, existing methods rely on dense scene measurements and use RGB images for 3D reconstruction, incorporating thermal data only through a post-hoc projection. Due to the lower resolution of thermal cameras and the challenges of RGB/thermal camera calibration, this post-hoc projection often results in spatial discrepancies between the temperatures projected onto the 3D model and the real temperatures at the surface. We propose ThermoNeRF, a novel multimodal Neural Radiance Field (NeRF) that renders new RGB and thermal views of a scene by jointly optimizing the geometry and thermal information while preventing cross-modal interference. To compensate for the lack of texture in thermal images, ThermoNeRF leverages paired RGB and thermal images to learn scene geometry while maintaining separate networks for reconstructing RGB color and temperature values, ensuring accurate and modality-specific representations. We also introduce ThermoScenes, a dataset of paired RGB+thermal images comprising 8 scenes of building facades and 8 scenes of everyday objects, enabling evaluation in diverse scenarios. On ThermoScenes, ThermoNeRF achieves an average mean absolute error of 1.13 °C for buildings and 0.41 °C for other scenes when predicting temperatures of previously unobserved views. This improves accuracy by over 50% compared to feeding concatenated RGB+thermal input to a standard NeRF. While ThermoNeRF performs well on aligned RGB-thermal images, future work could address misaligned or unpaired data for better generalization. Code and dataset are available online.
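The architecture described above (a shared geometry backbone with separate, non-interfering RGB and temperature branches) can be sketched as follows. This is a minimal illustration, not the authors' released code: the module name `ThermoNeRFSketch`, the layer sizes, and the omission of positional encoding and view directions are all simplifying assumptions.

```python
import torch
import torch.nn as nn

class ThermoNeRFSketch(nn.Module):
    """Minimal sketch (hypothetical, not the authors' implementation):
    a shared trunk MLP predicts density and a latent feature from a 3D
    point, and two separate heads decode RGB colour and scalar
    temperature. The modalities thus share geometry but keep
    modality-specific appearance networks, avoiding cross-modal
    interference."""

    def __init__(self, pos_dim: int = 3, hidden: int = 64):
        super().__init__()
        # Shared geometry trunk (real NeRFs add positional encoding here).
        self.trunk = nn.Sequential(
            nn.Linear(pos_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.density = nn.Linear(hidden, 1)  # shared volume density (sigma)
        # Modality-specific heads: colour in [0, 1], unbounded temperature.
        self.rgb_head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),
        )
        self.thermal_head = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor):
        h = self.trunk(x)
        sigma = torch.relu(self.density(h))  # non-negative density
        rgb = self.rgb_head(h)               # per-point colour
        temp = self.thermal_head(h)          # per-point temperature
        return sigma, rgb, temp
```

In a full pipeline, `sigma` would drive the standard NeRF volume-rendering weights along each ray, and those same weights would composite both `rgb` and `temp`, so a single geometry is supervised by both modalities while each head is trained only on its own photometric or thermal loss.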
2-s2.0-105003540039
École Polytechnique Fédérale de Lausanne
Schindler EPFL Lab
2025-05-01
65
103345
REVIEWED
EPFL