Depth and thermal sensor fusion to enhance 3D thermographic reconstruction
Three-dimensional (3D) geometrical models with incorporated surface temperature data provide important information for various applications such as medical imaging, energy auditing, and intelligent robots. In this paper we present a robust method for mobile and real-time 3D thermographic reconstruction through depth and thermal sensor fusion. A multimodal imaging device consisting of a thermal camera and an RGB-D sensor is calibrated geometrically and used for data capturing. Based on the underlying principle that temperature information remains robust against illumination and viewpoint changes, we present a thermal-guided iterative closest point (T-ICP) methodology to facilitate reliable 3D thermal scanning applications. The pose of the sensing device is initially estimated using correspondences found through maximizing the thermal consistency between consecutive infrared images. The coarse pose estimate is further refined by finding the motion parameters that minimize a combined geometric and thermographic loss function. Experimental results demonstrate that complementary information captured by multimodal sensors can be utilized to improve the performance of 3D thermographic reconstruction. Through effective fusion of thermal and depth data, the proposed approach generates more accurate 3D thermal models using significantly less scanning data.
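The abstract describes a refinement step that minimizes a combined geometric and thermographic loss over matched point pairs. As a rough illustration of that idea (not the paper's actual formulation), the sketch below combines a standard point-to-plane ICP residual with a temperature residual; the function name, the weighting parameter `lam`, and the specific residual forms are assumptions for illustration only.

```python
import numpy as np

def combined_icp_loss(src_pts, tgt_pts, tgt_normals, src_temps, tgt_temps, lam=0.1):
    """Illustrative combined geometric + thermographic loss for matched point pairs.

    src_pts, tgt_pts : (N, 3) matched 3D points from consecutive frames
    tgt_normals      : (N, 3) unit surface normals at the target points
    src_temps, tgt_temps : (N,) surface temperatures at the matched points
    lam              : assumed weight balancing the thermal term against the geometric term
    """
    # Geometric term: squared point-to-plane distances
    geo = np.einsum('ij,ij->i', src_pts - tgt_pts, tgt_normals)
    # Thermographic term: squared temperature residuals between matched points
    thermal = src_temps - tgt_temps
    return float(np.sum(geo ** 2) + lam * np.sum(thermal ** 2))
```

In a full T-ICP-style pipeline, a pose optimizer would search over rigid-motion parameters to minimize this loss, starting from the thermal-consistency-based coarse estimate mentioned in the abstract.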
Publication Source (Journal or Book title)
Cao, Y., Xu, B., Ye, Z., Yang, J., Cao, Y., Tisse, C., & Li, X. (2018). Depth and thermal sensor fusion to enhance 3D thermographic reconstruction. Optics Express, 26(7), 8179-8193. https://doi.org/10.1364/OE.26.008179