Intelligent Autonomous Robotic Car for Real-Time Disaster Area Analysis and Navigation

Authors

E.A. Wanigasekara

Department of Electronics, Wayamba University of Sri Lanka, Kuliyapitiya (Sri Lanka)

Y.A.A. Kumarayapa

Department of Electronics, Wayamba University of Sri Lanka, Kuliyapitiya (Sri Lanka)

Article Information

DOI: 10.51584/IJRIAS.2025.10120082

Subject Category: Information Technology

Volume/Issue: 10/12 | Page No: 968-975

Publication Timeline

Submitted: 2025-12-29

Accepted: 2026-01-03

Published: 2026-01-17

Abstract

Efficient victim detection and reliable navigation remain major challenges in robotic search-and-rescue operations within disaster-affected regions. This research describes the design and implementation of an AI-driven autonomous robotic car capable of making real-time decisions in complex and hazardous environments. The proposed system employs a sensor fusion approach that combines visual human detection using YOLOv5, thermal-based classification through a convolutional neural network, and audio-based human voice detection. These AI modules are supported by additional hardware, including ultrasonic distance sensors, an INMP441 microphone, an MPU6050 inertial measurement unit, and MQ2 and MQ135 gas sensors, all coordinated by a Raspberry Pi 3B+, an ESP32, and ESP32-CAM modules. Precise localization and remote communication are achieved using a NEO-6M GPS receiver and a SIM800L GSM module. A web-based monitoring platform displays real-time sensor readings, survivor locations, and environmental hazard warnings at a base station. The system is validated on a physical prototype designed for low cost, rapid deployment, and ease of use. Experimental observations indicate that the robot can autonomously navigate, identify potential survivors, and transmit critical information, highlighting its suitability for disaster-response applications.
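
To make the fusion idea concrete, the following is a minimal illustrative sketch in Python of how the visual branch (YOLOv5 person detection) might feed a simple weighted vote across the three AI modules. The fusion weights, decision threshold, and the placeholder thermal and audio scores are assumptions for illustration only; the abstract does not specify the paper's actual fusion rule.

```python
# Minimal sketch of a three-branch detection fusion, assuming YOLOv5 is
# loaded via torch.hub (requires torch and torchvision, plus an internet
# connection for the first hub download). The weights, threshold, and
# placeholder scores below are illustrative assumptions, not values
# reported in the paper.
import torch

# Pretrained YOLOv5 small model; COCO class 0 is "person".
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

def visual_person_score(image_path: str) -> float:
    """Highest YOLOv5 confidence for a 'person' detection in one frame."""
    results = model(image_path)
    detections = results.xyxy[0]  # rows of [x1, y1, x2, y2, conf, class]
    confs = [float(row[4]) for row in detections if int(row[5]) == 0]
    return max(confs, default=0.0)

def fuse(visual: float, thermal: float, audio: float,
         weights=(0.5, 0.3, 0.2), threshold=0.6) -> bool:
    """Weighted vote across the visual, thermal, and audio branch scores."""
    score = weights[0] * visual + weights[1] * thermal + weights[2] * audio
    return score > threshold

if __name__ == "__main__":
    v = visual_person_score("frame.jpg")  # RGB frame from the ESP32-CAM
    t = 0.8  # placeholder: the thermal-CNN confidence would go here
    a = 1.0  # placeholder: 1.0 if voice activity was detected, else 0.0
    if fuse(v, t, a):
        print("Potential survivor: report GPS fix over the GSM link.")
```

In the prototype described above, the thermal score would come from the convolutional classifier on the thermal stream and the audio flag from voice detection on the INMP441 microphone, with the resulting GPS fix transmitted through the SIM800L module.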

Keywords

Autonomous Robot, Disaster Response, Sensor Fusion
