PERFORMANCE EVALUATION AND ANALYSIS OF A VISION-BASED MOBILE ROBOT WITH SENSOR FUSION CAPABILITIES
Keywords:
autonomous mobile robot, sensor fusion, vision-based mobile robot

Abstract
This study evaluates and analyzes the performance of a vision-based mobile robot equipped with sensor fusion capabilities for line-following tasks. A camera and an IR sensor array are combined to enhance the robot's line-following performance by fusing the data received from both sensors. The first step is the development of a base model of a four-wheel mobile robot with an ESP32 as the main controller. Sensor fusion techniques, including data-level and feature-level fusion, are employed to combine visual data from the Pixy2 camera with positioning data from the IR sensor array. This fusion process yields a comprehensive representation of the line's position and orientation, enabling precise and accurate line tracking. The performance evaluation focuses on assessing the line-tracking accuracy, stability, and responsiveness of the vision-based mobile robot with sensor fusion capabilities. Experimental results demonstrate the effectiveness of the fusion approach in improving line following compared to using the individual sensors alone. The system exhibits robustness and adaptability under varying environmental conditions and noise interference. Overall, this study presents another option for vision-based mobile robots with sensor fusion capabilities, providing valuable insights into combining visual and positioning sensing for precise line following. The evaluated system serves as a foundation for further experiments in mobile robotics, enabling autonomous robots to navigate accurately.
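The feature-level fusion of camera and IR line-position estimates described above can be sketched as a confidence-weighted combination. The following Python sketch is illustrative only and assumes both sensors report a normalized line position; the function names, confidence weights, and the fallback behavior when the camera loses the line are assumptions for illustration, not the authors' implementation (the actual firmware runs on the ESP32).

```python
def fuse_line_position(camera_pos, ir_pos, camera_conf=0.7, ir_conf=0.3):
    """Combine normalized line-position estimates from a Pixy2-style
    camera and an IR sensor array, each in [-1.0, 1.0] where 0.0 means
    the line is centered under the robot.

    The confidence weights are illustrative assumptions; they are
    renormalized so the result stays in [-1.0, 1.0].  If one sensor
    has no estimate (None), the other is used alone.
    """
    if camera_pos is None:
        return ir_pos
    if ir_pos is None:
        return camera_pos
    total = camera_conf + ir_conf
    return (camera_conf * camera_pos + ir_conf * ir_pos) / total


def steering_correction(fused_pos, kp=0.5):
    """Simple proportional steering term from the fused line position:
    a positive fused_pos (line to the right) yields a negative
    correction that turns the robot back toward the line."""
    return -kp * fused_pos
```

For example, with the camera reporting the line at 0.4 and the IR array at 0.2, the fused estimate lies between the two but closer to the higher-confidence camera reading, and the steering term pushes the robot back toward center.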