An IoT-Enabled Smart Navigation Aid Integrating Ultrasonic and GPS Technologies for Visually Impaired Mobility
Eliyana Ruslan*1, Sahazati Md Rozali2, Ernie Che Mid3, Dayanasari Abdul Hadi4
1,4Fakulti Teknologi dan Kejuruteraan Elektronik dan Komputer, Universiti Teknikal Malaysia Melaka
2Fakulti Teknologi dan Kejuruteraan Elektrik, Universiti Teknikal Malaysia Melaka
3Fakulti Kejuruteraan & Teknologi Elektrik (FKTE), Universiti Malaysia Perlis
*Corresponding Author
DOI: https://dx.doi.org/10.47772/IJRISS.2025.909000779
Received: 28 September 2025; Accepted: 04 October 2025; Published: 30 October 2025
ABSTRACT
Visually impaired individuals face persistent challenges in achieving safe and independent mobility. Conventional aids such as white canes and guide dogs provide basic assistance but lack obstacle detection, navigation support, and real-time monitoring. The objective of this study was to design and evaluate a low-cost IoT-enabled navigation aid that combines obstacle detection, GPS-based location tracking, and cloud connectivity to enhance the safety and independence of visually impaired users. The system was developed using an Arduino Uno R3 microcontroller as the central unit, interfaced with HC-SR04 ultrasonic sensors for obstacle detection, a NEO-6M GPS module for real-time location tracking, and a NodeMCU ESP8266 Wi-Fi module for wireless transmission. A buzzer and vibration motor were incorporated to deliver multimodal feedback to the user, while the Arduino IoT Cloud enabled continuous caregiver monitoring. The software architecture was programmed in Arduino IDE, with control logic handling sensor acquisition, threshold-based obstacle detection, GPS updates, and IoT transmission. Experimental validation was conducted through indoor and outdoor trials. Obstacle detection accuracy reached 98% within 20–150 cm, with reliable detection up to 200 cm and an average latency of 180 ms. Outdoor GPS tests achieved an average accuracy of ±3 m in open environments and ±6 m in semi-obstructed areas, while IoT monitoring demonstrated a mean communication delay of 1.2 seconds with >98% event logging success. Power profiling confirmed a battery endurance of 6.5 hours on a 2200 mAh lithium-ion pack. A pilot usability study with five visually impaired volunteers reported positive feedback on the system’s intuitive multimodal alerts. In conclusion, the proposed system achieves a practical trade-off between affordability, accuracy, and IoT-enabled monitoring, distinguishing it from prior standalone or cost-intensive solutions. Future enhancements will focus on hybrid indoor positioning, solar-assisted power, and machine learning-based object classification to expand functionality and autonomy.
Keywords: Visually Impaired Navigation, Arduino, Ultrasonic Sensor, GPS, IoT, Assistive Technology
INTRODUCTION
Vision plays a central role in human interaction with the environment, yet millions of individuals worldwide face severe limitations due to visual impairment. According to the World Health Organization, approximately 10% of visually impaired individuals experience complete loss of functional sight, resulting in dependence on caregivers and limited mobility. These challenges significantly restrict independence, quality of life, and access to education, employment, and social participation.
Conventional mobility aids such as white canes and guide dogs remain widely used but have inherent limitations. White canes rely on physical contact and provide no anticipatory feedback on obstacles, while guide dogs require extensive training, entail high costs, and may raise cultural or logistical concerns. More recent technological solutions, including smart canes and wearable devices, have attempted to address these gaps; however, many remain expensive, lack real-time environmental awareness, or are not scalable for broad adoption.
To address these limitations, this study proposes a Smart Navigation Aid that leverages an Arduino platform integrated with ultrasonic sensors, a GPS module, and a NodeMCU ESP8266 for IoT connectivity. The device provides real-time obstacle detection and location tracking, with updates accessible through the Arduino IoT Cloud application. Unlike existing solutions, this system emphasizes affordability, scalability, and cloud-based monitoring, making it accessible to users across diverse socioeconomic contexts.
This work contributes by (i) designing and prototyping a low-cost navigation system capable of real-time obstacle detection and GPS tracking, (ii) validating its performance through indoor and outdoor testing, and (iii) positioning the device within the broader framework of assistive technologies aligned with Sustainable Development Goals (SDG 3, 9, and 10). The findings demonstrate the feasibility of deploying IoT-enabled assistive systems that promote independence, safety, and inclusivity for visually impaired individuals.
LITERATURE REVIEW
Assistive technologies for the visually impaired have evolved significantly, moving from traditional mechanical aids to electronic, sensor-driven, and IoT-enabled solutions. Early approaches largely relied on ultrasonic sensing for obstacle detection. Jawale et al. [1] developed an Arduino-based guidance system with sonar detection that allowed users to avoid obstacles in both indoor and outdoor environments. Similarly, James and Harsola [2] presented a navigation cane with ultrasonic arrays and GPS, which improved route guidance but lacked networked communication capabilities. Dhal et al. [3] advanced this model by integrating ultrasonic sensors with GPS and GSM modules, enabling obstacle detection and emergency alerts. Yasir et al. [4] refined ultrasonic-based detection by combining vibration and sound feedback, showing reliable detection within 200 cm.
To overcome the limitations of single-sensor systems, researchers began integrating multiple sensing modalities. Jawale et al. [1] incorporated both ultrasonic and water sensors into a blind aid, allowing detection of solid and liquid obstacles, while Apprey et al. [5] proposed a solar-powered smart stick with ultrasonic and water sensors, GSM, and vibration feedback, ensuring energy efficiency for users in resource-constrained environments. Bhorge et al. [6] advanced this concept by integrating GPS, GSM, and ultrasonic modules with machine learning-based object classification, enhancing situational awareness and adaptability.
The adoption of IoT frameworks has transformed assistive navigation systems. Roy and Shah [7] developed an IoT-enabled navigation aid with ultrasonic and GPS modules, linked to the cloud for remote monitoring, demonstrating scalability and cost-effectiveness. Mahmud et al. [8] designed a system that combined ultrasonic and GPS sensing with audio guidance, enhancing real-time navigation. Kumar et al. [9] integrated Arduino with Android applications for voice-assisted routing, while Shreyas et al. [10] extended IoT-enabled route assistance using RFID and GPS modules, balancing performance with affordability. Ram et al. [11] proposed a similar low-cost smart cane with Arduino and GSM modules, highlighting affordability for mass deployment.
Wearable feedback-based systems represent another important research direction. Safa et al. [12] proposed a vibrotactile feedback system with GPS and ultrasonic sensors, delivering tactile cues that improved navigation in both familiar and unfamiliar settings. Hummadi et al. [13] emphasized the utility of Arduino-based sensors to provide auditory and tactile feedback, reporting significant reductions in mobility-related accidents. Illakiya and Loganathan [14] introduced a comprehensive mobility aid with ultrasonic sensors and GSM modules, enabling both obstacle detection and emergency communication.
Recent advances increasingly focus on multi-sensor and cloud-enabled assistive systems. Mousa et al. [15] integrated ultrasonic, GPS, gyroscope, and heart rate sensors with cloud-based analytics, offering both navigation and health monitoring. Siddesh and Srinivasa [16] highlighted IoT frameworks as critical in enhancing quality of life, demonstrating improved efficiency and throughput in IoT-enabled navigation systems. Sreenivasa [17] developed “Blinds Eye,” which combined GPS, ultrasonic, infrared, and RF modules to enhance navigation while addressing the problem of misplaced devices. Kishwar et al. [18] built a walking assistance system combining ultrasonic and water sensors with Bluetooth-enabled Android applications, underscoring the role of smartphones in accessible real-time navigation aids.
Overall, the literature shows a progression from single-sensor ultrasonic systems to multi-sensor, IoT-enabled, and cloud-connected assistive devices. While significant progress has been made, persistent challenges remain in ensuring affordability, minimizing power consumption, and scaling solutions for low-income populations. Addressing these gaps, the present study proposes a low-cost Arduino-based smart navigation aid that integrates ultrasonic sensing, GPS tracking, and IoT cloud monitoring, aiming to combine affordability with functionality and scalability.
METHODOLOGY
System Architecture
The overall system was designed as an IoT-enabled smart navigation aid with three core functions: obstacle detection, location tracking, and real-time monitoring. The system architecture is illustrated in Figure 1, which presents a block diagram of the hardware and software integration.
Figure 1. Overall System Architecture Block Diagram
As shown in Figure 1, the Arduino Uno functions as the central processing unit, interfacing with ultrasonic sensors for obstacle detection, the GPS module for location tracking, and the NodeMCU ESP8266 for wireless data transmission. Output components include a buzzer and a vibration motor that provide auditory and tactile feedback. A rechargeable power supply ensures portability, while cloud connectivity via the Arduino IoT platform supports remote monitoring by caregivers. This architecture was deliberately selected to balance affordability, portability, and functionality, addressing limitations of earlier systems that either lacked IoT connectivity or were cost-prohibitive for large-scale adoption.
Hardware Components
The system incorporates low-cost but reliable modules that are widely used in assistive technology research:
- Arduino Uno R3 – Chosen for its open-source ecosystem, low cost, and sufficient I/O pins for multi-sensor integration.
- Ultrasonic Sensors (HC-SR04) – Two sensors were positioned to cover the front and lateral directions, detecting obstacles within 2–200 cm. The front sensor enabled primary path monitoring, while the lateral sensor reduced blind spots (a minimal reading sketch follows this list).
- NEO-6M GPS Module – Provided real-time geolocation data with outdoor accuracy of ±3 m, supporting mobility in urban settings.
- NodeMCU ESP8266 – Enabled Wi-Fi connectivity, allowing obstacle detection events and GPS data to be uploaded to the Arduino IoT Cloud in real time.
- Output Modules – A buzzer and vibration motor ensured multimodal alerts, supporting both auditory and tactile feedback.
- Power Supply – A 2200 mAh lithium-ion battery powered the system, providing over 6 hours of continuous operation during testing.
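As an illustration of the ultrasonic acquisition described above, the following minimal Arduino sketch shows how a single HC-SR04 distance reading can be obtained. The pin assignments are assumptions for illustration only and are not taken from the prototype's wiring.

```cpp
// Minimal HC-SR04 reading sketch (illustrative; pin numbers are assumed).
// Distance in cm = echo pulse width (us) * 0.0343 / 2 (speed of sound, round trip).
const uint8_t TRIG_PIN = 9;   // hypothetical trigger pin
const uint8_t ECHO_PIN = 10;  // hypothetical echo pin

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

long readDistanceCm() {
  // A 10 us trigger pulse starts one ranging cycle
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Echo pulse width is proportional to round-trip time; 30 ms timeout (~5 m)
  unsigned long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);
  if (duration == 0) return -1;          // no echo within timeout
  return (long)(duration * 0.0343 / 2);  // convert to one-way distance in cm
}

void loop() {
  Serial.println(readDistanceCm());
  delay(60);  // the HC-SR04 datasheet recommends >= 60 ms between measurements
}
```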
Software Implementation
The software was developed in the Arduino IDE using integrated sensor libraries. The control flow follows these steps (a minimal control-loop sketch is shown after the list):
- Ultrasonic sensors continuously acquire distance measurements.
- When an obstacle is detected below a 20 cm threshold, the Arduino triggers the buzzer and vibration motor.
- Simultaneously, the GPS module updates the current location, which is processed by the Arduino Uno R3.
- The NodeMCU ESP8266 transmits the obstacle detection event and GPS coordinates to the Arduino IoT Cloud.
- The cloud dashboard stores logs and enables real-time monitoring accessible by caregivers.
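A minimal control-loop sketch consistent with these steps is given below. It is illustrative only: the pin assignments, the 9600-baud serial link to the NodeMCU, and the use of the open-source TinyGPS++ and SoftwareSerial libraries are assumptions rather than a reproduction of the prototype's exact firmware.

```cpp
// Illustrative Uno-side control loop (not the authors' exact firmware).
#include <SoftwareSerial.h>
#include <TinyGPS++.h>

const uint8_t TRIG_PIN = 9, ECHO_PIN = 10;   // hypothetical HC-SR04 wiring
const uint8_t BUZZER_PIN = 7;                // hypothetical buzzer pin
const uint8_t VIBRATION_PIN = 8;             // hypothetical vibration-motor pin
const long    ALERT_THRESHOLD_CM = 20;       // threshold from the control logic above

TinyGPSPlus gps;
SoftwareSerial gpsSerial(4, 3);    // RX, TX to NEO-6M (assumed wiring)
SoftwareSerial nodeSerial(5, 6);   // RX, TX to NodeMCU ESP8266 (assumed wiring)

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);
  return us ? (long)(us * 0.0343 / 2) : -1;
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);   pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT); pinMode(VIBRATION_PIN, OUTPUT);
  gpsSerial.begin(9600);       // NEO-6M default baud rate
  nodeSerial.begin(9600);      // link to NodeMCU (transmit only)
  gpsSerial.listen();          // only one SoftwareSerial port can receive at a time
}

void loop() {
  // Steps 1-2: measure distance and raise multimodal alerts below the threshold
  long distance = readDistanceCm();
  bool obstacle = (distance > 0 && distance < ALERT_THRESHOLD_CM);
  digitalWrite(BUZZER_PIN, obstacle ? HIGH : LOW);
  digitalWrite(VIBRATION_PIN, obstacle ? HIGH : LOW);

  // Step 3: keep the GPS fix current by feeding NMEA bytes to TinyGPS++
  while (gpsSerial.available()) gps.encode(gpsSerial.read());

  // Step 4: forward the event and latest coordinates to the NodeMCU for cloud upload
  if (obstacle && gps.location.isValid()) {
    nodeSerial.print(F("EVT,"));
    nodeSerial.print(gps.location.lat(), 6);
    nodeSerial.print(',');
    nodeSerial.println(gps.location.lng(), 6);
  }
  delay(100);
}
```

Because the Uno's SoftwareSerial can only receive on one port at a time, the sketch keeps the GPS port listening and uses the NodeMCU link for transmission only; this is a design assumption, and a hardware UART on the NodeMCU side would serve equally well.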
Figure 2. Software Flowchart
This software-hardware synergy ensures that users receive immediate local feedback, while caregivers have continuous access to tracking information.
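On the NodeMCU side, the cloud upload step can be sketched as follows. The example assumes that float and boolean variables (latitude, longitude, obstacleAlert) were created in the Arduino IoT Cloud dashboard, which auto-generates thingProperties.h with the Wi-Fi credentials and property bindings, and that the Uno forwards events as simple "EVT,lat,lng" lines as in the sketch above; these details are assumptions, not the authors' exact implementation.

```cpp
// Illustrative NodeMCU (ESP8266) side: receive "EVT,lat,lng" lines from the Uno
// and push them to the Arduino IoT Cloud (assumed message format and property names).
#include "thingProperties.h"   // auto-generated by the Arduino IoT Cloud setup wizard

void setup() {
  Serial.begin(9600);                                 // UART link from the Uno
  initProperties();                                   // defined in thingProperties.h
  ArduinoCloud.begin(ArduinoIoTPreferredConnection);  // connect over Wi-Fi
}

void loop() {
  ArduinoCloud.update();   // keeps the cloud connection and bound properties in sync

  // Parse one "EVT,<lat>,<lng>" line per iteration, if available
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    if (line.startsWith("EVT,")) {
      int c1 = line.indexOf(',');
      int c2 = line.indexOf(',', c1 + 1);
      latitude  = line.substring(c1 + 1, c2).toFloat();  // assumed cloud float property
      longitude = line.substring(c2 + 1).toFloat();      // assumed cloud float property
      obstacleAlert = true;                              // assumed cloud bool property
    }
  }
}
```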
Testing and Validation
To evaluate system performance, four sets of experiments were conducted:
- Obstacle Detection Test – Indoor tests with obstacles placed at varying distances confirmed sensor accuracy within 20–200 cm, with an average latency of <200 ms between detection and feedback (a timing sketch follows this list).
- GPS Accuracy Test – Outdoor trials demonstrated positional accuracy of ±3 m in open areas and ±6 m in semi-obstructed locations. Indoor GPS reliability was limited due to signal attenuation.
- IoT Monitoring Test – Continuous 2-hour operation showed reliable data logging and event reporting to the Arduino IoT Cloud, with average transmission delays of <1.2 s.
- Power Consumption Test – Battery endurance was evaluated under full load, confirming approximately 6.5 hours of continuous use.
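For the latency figure in the obstacle detection test, detection-to-feedback timing could be logged on-device with a simple harness such as the one below. This is an assumed measurement approach for illustration, not the authors' documented procedure; pin numbers match the hypothetical wiring used in the earlier sketches.

```cpp
// Illustrative on-device timing harness for the detection-to-feedback latency test.
const uint8_t TRIG_PIN = 9, ECHO_PIN = 10, BUZZER_PIN = 7, VIBRATION_PIN = 8;
const long ALERT_THRESHOLD_CM = 20;

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);
  return us ? (long)(us * 0.0343 / 2) : -1;
}

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);   pinMode(ECHO_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT); pinMode(VIBRATION_PIN, OUTPUT);
}

void loop() {
  unsigned long tStart = micros();      // start of one sense-decide-alert cycle
  long distance = readDistanceCm();     // sensor acquisition (dominant cost)
  bool obstacle = (distance > 0 && distance < ALERT_THRESHOLD_CM);
  digitalWrite(BUZZER_PIN, obstacle ? HIGH : LOW);      // feedback activation
  digitalWrite(VIBRATION_PIN, obstacle ? HIGH : LOW);
  unsigned long latencyMs = (micros() - tStart) / 1000UL;

  if (obstacle) {                       // log only alert cycles for offline averaging
    Serial.print(F("latency_ms="));
    Serial.println(latencyMs);
  }
  delay(60);                            // respect the HC-SR04 measurement interval
}
```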
Figure 3. Final Design of the Smart Navigation Aid
Figure 4. Real-Time Location Tracking Interface in the Arduino IoT Cloud Application
Figures 3 and 4 illustrate the hardware prototype assembly and the Arduino IoT dashboard interface, respectively. These demonstrate the practical implementation of the design, bridging the system diagram (Figure 1) with real-world application.
RESULTS AND DISCUSSION
The ultrasonic module was tested by placing obstacles at distances between 20 cm and 200 cm in 10 cm increments. The detection accuracy remained above 98% within 20–150 cm, with a slight decrease to 94% at 200 cm due to signal attenuation and reflection effects. Figure 5 shows the relationship between measured and actual obstacle distances, highlighting the high reliability of the sensor system.
Figure 5. Obstacle Detection Accuracy vs. Distance
These findings confirm that the proposed system achieves higher precision than earlier single-sensor prototypes, such as Yasir et al. [4], while maintaining affordability.
GPS accuracy was evaluated outdoors across both open and semi-obstructed environments. The system achieved an average accuracy of ±3 m in open areas and ±6 m in partially obstructed spaces. Indoor performance was inconsistent, reflecting the known limitations of GPS technology. Figure 6 illustrates sample GPS tracking results compared to ground-truth positions.
Figure 6. GPS Tracking Accuracy in Outdoor Trials
Compared with earlier navigation sticks without IoT integration [2], [3], this system provides an additional layer of real-time location monitoring via the IoT cloud.
IoT functionality was validated by transmitting data from the NodeMCU ESP8266 to the Arduino IoT Cloud. The average delay between obstacle detection and cloud update was 1.2 seconds, which is acceptable for mobility applications. Figure 7 presents the latency distribution observed during continuous two-hour trials.
Figure 7. IoT Latency Distribution during Monitoring Test
This performance compares favorably with Roy and Shah [7], who reported longer delays in GSM-based IoT communication.
The system was powered by a 2200 mAh lithium-ion battery. Continuous operation tests indicated an average endurance of 6.5 hours before recharge, which corresponds to an average current draw of roughly 340 mA. Figure 8 depicts the resulting battery discharge curve.
Figure 8. Battery Endurance Curve of Prototype System
While adequate for daily use, integrating solar-assisted charging [5] could enhance operational autonomy.
Table 1 provides a comparison between the proposed system and selected related works. The results indicate that while multi-sensor solutions (e.g., Mousa et al. [15]) provide richer functionality, they increase cost and complexity. In contrast, the proposed design achieves a balance between accuracy, IoT integration, and affordability, making it suitable for wide adoption.
Table 1: Comparative Analysis of Assistive Navigation Systems for the Visually Impaired
CONCLUSION
This study presented the design and evaluation of an IoT-enabled smart navigation aid for visually impaired individuals, built around the Arduino Uno R3 microcontroller. By integrating ultrasonic sensors for obstacle detection, a GPS module for location tracking, and NodeMCU ESP8266 for cloud connectivity, the system successfully combined real-time feedback and remote monitoring in a cost-effective framework. Experimental validation demonstrated 98% accuracy in obstacle detection within 20–150 cm, an average ±3 m outdoor GPS accuracy, and <1.2 s IoT communication latency. Battery testing confirmed an endurance of approximately 6.5 hours, sufficient for daily use. These results illustrate that the system effectively balances affordability, reliability, and IoT functionality, setting it apart from traditional ultrasonic-only aids [2], [4], and more complex multi-sensor devices [6], [15].

The system’s primary limitations include reduced GPS reliability indoors and moderate battery life for extended outdoor usage. Future improvements will focus on hybrid positioning technologies (Bluetooth beacons, RFID, or computer vision), solar-assisted power supply, and machine learning-based object recognition. Expanding the system with health monitoring sensors (e.g., heart rate, fall detection) could transform it into a comprehensive assistive platform for visually impaired users.

Overall, this research demonstrates that IoT-enabled, low-cost assistive navigation systems can provide visually impaired individuals with safer and more independent mobility, while offering caregivers a reliable means of real-time monitoring. The proposed design contributes to the broader goal of developing inclusive assistive technologies aligned with the Sustainable Development Goals (SDG 3, 9 and 10).
ACKNOWLEDGEMENT
This work was supported in part by Fakulti Teknologi dan Kejuruteraan Elektronik dan Komputer (FTKEK) and Universiti Teknikal Malaysia Melaka (UTeM).
REFERENCES
- Jawale, R. V., Kadam, M., Gaikawad, R. S., & Kondaka, L. (2017). Ultrasonic navigation based blind aid for the visually impaired. In Proceedings of the IEEE International Conference on Power, Control, Signals and Instrumentation Engineering (ICPCSI) (pp. 923–928). IEEE.
- James, N. B., & Harsola, A. (2015). Navigation aiding stick for the visually impaired. In Proceedings of the International Conference on Green Computing and Internet of Things (ICGCIoT) (pp. 1254–1257). IEEE.
- Dhal, S., Agarwal, A., & Agarwal, K. (2016). Smart electronic travel stick for the visually challenged. American Journal of Electrical and Electronic Engineering, 4(6), 177–181.
- Yasir, M., Lestari, I. N., Setiawan, C., Ulfiah, Effendi, M. R., & Hamidi, E. A. Z. (2021). Design and implementation of the blind navigation aids using ultrasonic sensor. In Proceedings of the International Conference on Wireless Technologies (ICWT) (pp. 1–6). IEEE.
- Apprey, M. W., Agbevanu, K. T., Gasper, G. K., & Akoi, P. O. (2022). Design and implementation of a solar powered navigation technology for the visually impaired. Sensors International. Elsevier.
- Bhorge, S., Kasodekar, M., Karmalkar, A., & Kulkarni, V. (2024). Smart blind stick: Object detection & GPS integration for enhanced mobility. In Proceedings of the IEEE International Conference on Artificial Intelligence in Information and Communication (ICAIIC) (pp. 1653–1658). IEEE.
- Roy, M., & Shah, P. (2022). Internet of Things (IoT) enabled smart navigation aid for visually impaired. In Smart Computing Techniques and Applications (pp. 232–244). Springer.
- Mahmud, N. A., Hossain, T., Biswas, R. V., Joy, M. F. A., Raihan, M. A., & Banik, D. (2023). Assistive system and GPS tracker for the visually impaired. In Proceedings of the IEEE International Conference on Computer and Information Technology (ICCIT) (pp. 1–5). IEEE.
- Kumar, A., Jain, M., Saxena, R., Jain, V., Jaidka, S., & Sadana, T. (2019). IoT-based navigation for visually impaired using Arduino and Android. In Advances in Communication, Devices and Networking (pp. 561–569). Springer.
- Shreyas, S. K., D. G., & Ramaiah, N. (2019). IoT based route assistance for visually challenged. Materials Performance eJournal.
- Ram, D. C. S., Bhavana, V., & Kameshwari, M. (2023). Smart cane for visually impaired people using Arduino. International Journal of Scientific Research in Engineering and Management.
- Safa, M., Geetha, G., Elakkiya, U., & Saranya, D. (2018). Vibrotactile feedback system for assisting the physically impaired persons for easy navigation. Journal of Physics: Conference Series, 1000, 1–8. IOP Publishing.
- Hummadi, A. N., Salman, H. M., Ahmed, A. I., & Diachenko, O. (2024). Arduino-based sensor system for safe mobility of people with visual impairments. In Proceedings of the IEEE Conference of Open Innovations Association (FRUCT) (pp. 267–274). IEEE.
- Illakiya, N., & Loganathan, V. (2024). Comprehensive assistive mobility system for the visually impaired: The smart blind stick with ultrasonic sensor solution. In Proceedings of the International Conference on Innovative Computing and Technologies (ICICT) (pp. 2088–2094). IEEE.
- Mousa, A. R., et al. (2024). Enhancement of a smart assistive cane for visually impaired people based on IoT. In Proceedings of the International Conference on Advanced Engineering, Science, and Technology (AEST) (pp. 78–83). IEEE.
- Siddesh, K., & Srinivasa, S. (2021). IoT solution for enhancing the quality of life of visually impaired people. International Journal of Grid and High Performance Computing, 13(4), 1–23. IGI Global.
- Sreenivasa, S. M. (2024). BLINDS EYE: Advancing navigation for the visually impaired with Arduino Uno. International Journal of Scientific Research in Engineering and Management.
- Bh, M. K. H., Sikder, M. I. I., & Hasan, R. M. M. (2015). Walking assistance system for visually impaired people. In Proceedings of Conference (pp. 1–5). IEEE.