Document Type
Conference Proceeding
Publication Date
9-10-2018
Abstract
A reliable, real-time, multi-sensor fusion functionality is crucial for localization of actively controlled capsule endoscopy robots, which are an emerging, minimally invasive diagnostic and therapeutic technology for the gastrointestinal (GI) tract. In this study, we propose a novel multi-sensor fusion approach based on a particle filter that incorporates an online estimation of sensor reliability and a non-linear kinematic model learned by a recurrent neural network. Our method sequentially estimates the true robot pose from noisy pose observations delivered by multiple sensors. We experimentally test the method using 5 degree-of-freedom (5-DoF) absolute pose measurements from a magnetic localization system and 6-DoF relative pose measurements from visual odometry. In addition, the proposed method is capable of detecting and handling sensor failures by ignoring corrupted data, providing the robustness expected of a medical device. Detailed analyses and evaluations are presented using ex vivo experiments on a porcine stomach model, demonstrating that our system achieves high translational and rotational accuracy for different types of endoscopic capsule robot trajectories.
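The fusion scheme described in the abstract can be illustrated with a minimal sketch: a particle filter that weights each sensor's likelihood by an online reliability estimate, so that a failed sensor is automatically down-weighted. This is not the paper's implementation — the RNN-learned kinematic model is replaced by a toy constant-velocity model, the 5/6-DoF poses by 2-D positions, and the two synthetic "sensors" merely stand in for the magnetic localization and visual odometry inputs; all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy ground-truth trajectory: constant drift of 0.1 per axis per step.
T = 60
true_pos = np.cumsum(np.full((T, 2), 0.1), axis=0)

# Two noisy absolute-position "sensors"; sensor B fails (large bias)
# halfway through the run, mimicking corrupted data.
z_a = true_pos + rng.normal(0, 0.05, true_pos.shape)
z_b = true_pos + rng.normal(0, 0.05, true_pos.shape)
z_b[30:] += 5.0

N = 1000
particles = rng.normal(0.0, 0.1, (N, 2))   # start near the origin
weights = np.full(N, 1.0 / N)
rel = np.array([0.5, 0.5])                  # online reliability estimates

def gauss_like(z, particles, sigma=0.1):
    """Gaussian observation likelihood of each particle given measurement z."""
    d2 = np.sum((particles - z) ** 2, axis=1)
    return np.exp(-0.5 * d2 / sigma ** 2)

est = []
for t in range(T):
    # Predict: constant-velocity motion model plus process noise
    # (the paper learns this step with a recurrent neural network).
    particles += 0.1 + rng.normal(0, 0.03, particles.shape)

    # Update each sensor's reliability from its innovation w.r.t. the
    # predicted pose: implausible measurements push reliability toward 0.
    pred = np.average(particles, axis=0, weights=weights)
    for i, z in enumerate((z_a[t], z_b[t])):
        innov = np.linalg.norm(z - pred)
        plausible = np.exp(-0.5 * (innov / 0.3) ** 2)
        rel[i] = 0.9 * rel[i] + 0.1 * plausible   # smoothed estimate

    # Correct: reliability-weighted mixture of the sensor likelihoods.
    lik = rel[0] * gauss_like(z_a[t], particles) \
        + rel[1] * gauss_like(z_b[t], particles)
    weights *= lik + 1e-300
    weights /= weights.sum()

    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < N / 2:
        positions = (rng.random() + np.arange(N)) / N
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), N - 1)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

    est.append(np.average(particles, axis=0, weights=weights))

est = np.asarray(est)
err = np.linalg.norm(est - true_pos, axis=1)
print("reliabilities:", rel)   # sensor B's reliability should have dropped
print("final error:", err[-1])
```

After the injected failure, sensor B's innovations become implausibly large, its reliability decays toward zero, and the filter continues tracking from sensor A alone — the "ignoring corrupted data" behavior the abstract describes, here in its simplest possible form.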
Publication Source (Journal or Book title)
Proceedings - IEEE International Conference on Robotics and Automation
First Page
5393
Last Page
5400
Recommended Citation
Turan, M., Almalioglu, Y., Gilbert, H., Araujo, H., Cemgil, T., & Sitti, M. (2018). EndoSensorFusion: Particle Filtering-Based Multi-Sensory Data Fusion with Switching State-Space Model for Endoscopic Capsule Robots. Proceedings - IEEE International Conference on Robotics and Automation, 5393-5400. https://doi.org/10.1109/ICRA.2018.8460472