A Vibrotactile Bracelet for Emergency Alerts for the Deaf Community in Day-to-Day Life

Authors

Shaharil Mad Saad

Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor, Malaysia

Nor Ahsan Nor Azman

Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor, Malaysia

Article Information

DOI: 10.47772/IJRISS.2026.10100113

Subject Category: Social science

Volume/Issue: 10/1 | Page No: 1395-1411

Publication Timeline

Submitted: 2025-12-26

Accepted: 2026-01-03

Published: 2026-01-23

Abstract

Emergency alert systems that rely solely on auditory signals pose significant risks to the Deaf and Hard-of-Hearing (DHH) community, especially in public and crowded environments. In response to this accessibility gap, this project focuses on the design and implementation of a vibrotactile emergency alert bracelet that delivers non-auditory feedback using real-time environmental sound recognition. The bracelet integrates an Arduino Nano RP2040 Connect microcontroller, which features an onboard MP34DT06JTR MEMS microphone (Arduino, n.d.), and employs a mini vibration motor and OLED display to provide tactile and visual alerts. Emergency sound types, including fire alarms, sirens, and public announcements, are classified using machine learning models trained with Edge Impulse. The vibration feedback is controlled through Linear Resonant Actuators (LRAs), chosen for their efficient, low-power haptic performance in wearable devices. Feature extraction is performed using Mel-Frequency Cepstral Coefficients (MFCC), and classification models are evaluated based on accuracy, latency, and robustness to untrained samples. The system was validated through real-world testing, and results demonstrate high classification accuracy for tonal alerts and effective user recognition of vibration patterns. Limitations remain in detecting speech-based announcements. Battery drain tests and user surveys confirm the system's reliability for daily short-term use. This project presents a cost-effective, wearable solution that enhances situational awareness and safety for the DHH community in emergency scenarios.

Keywords

Vibrotactile alert system, MEMS microphone, Arduino RP2040 Connect

References

1. Basner, M., Babisch, W., Davis, A., Brink, M., Clark, C., Janssen, S., & Stansfeld, S. (2014). Auditory and non-auditory effects of noise on health. The Lancet, 383(9925), 1325–1332.

2. Berglund, B., Lindvall, T., & Schwela, D. H. (1999). Guidelines for community noise. World Health Organization.

3. Borenstein, J. T., Weinberg, E. J., & Kaazempur Mofrad, M. R. (2002). Microfabrication technologies for microfluidic devices. Annual Review of Biomedical Engineering, 4, 261–286.

4. Cogan, S. F. (2008). Neural stimulation and recording electrodes. Annual Review of Biomedical Engineering, 10, 275–309.

5. Dahiya, R. S., Metta, G., Valle, M., & Sandini, G. (2010). Tactile sensing—from humans to humanoids. IEEE Transactions on Robotics, 26(1), 1–20.

6. Arduino. (n.d.). Arduino Nano RP2040 Connect. Retrieved from https://docs.arduino.cc/hardware/nano-rp2040-connect

7. Edge Impulse. (n.d.). Edge Impulse Documentation. Retrieved from https://docs.edgeimpulse.com/

8. Adafruit. (n.d.). SSD1306 OLED Display Breakout Board. Retrieved from https://learn.adafruit.com/monochrome-oled-breakouts

9. TP4056. (n.d.). TP4056 Li-Ion Battery Charger Module. Retrieved from https://datasheet.lcsc.com/lcsc/1811141430_Nanjing-Top-Power-TP4056_C16518.pdf

10. Precision Microdrives. (n.d.). Vibration Motors Product Guide. Retrieved from https://www.precisionmicrodrives.com/

11. Davis, S., & Mermelstein, P. (1980). Comparison of parametric representations for monosyllabic word recognition in continuously spoken sentences. IEEE Transactions on Acoustics, Speech, and Signal Processing, 28(4), 357–366. https://doi.org/10.1109/TASSP.1980.1163420

12. Warden, P. (2018). Speech commands: A dataset for limited-vocabulary speech recognition. arXiv preprint arXiv:1804.03209.

13. Zhang, X., Wang, J., & Zhao, Z. (2022). Real-time sound classification on embedded devices using TinyML. Sensors, 22(4), 1675. https://doi.org/10.3390/s22041675

14. Yin, Y., You, Z., & Cui, H. (2021). A lightweight neural network for real-time sound classification on edge devices. Journal of Intelligent & Fuzzy Systems, 40(3), 4367–4378. https://doi.org/10.3233/JIFS-202742

15. Podlubny, I. (1999). Fractional-order systems and fractional-order controllers. Institute of Experimental Physics, Slovak Academy of Sciences.

16. Yang, X. S., & Deb, S. (2009). Cuckoo search via Lévy flights. In 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC) (pp. 210–214). IEEE. https://doi.org/10.1109/NABIC.2009.5393690

17. Wang, Y., Duan, H., & Yu, Y. (2012). A modified cuckoo search algorithm for nonlinear system identification. Expert Systems with Applications, 39(10), 8545–8551. https://doi.org/10.1016/j.eswa.2012.01.199

18. World Health Organization. (2021). World report on hearing. Retrieved from https://www.who.int/publications/i/item/world-report-on-hearing

19. Jayaraman, S., & Sun, Y. (2017). Wearable assistive devices for the blind and deaf. Wearable and Implantable Medical Devices, 253–269. https://doi.org/10.1016/B978-0-12-811994-4.00012-5

20. Mauriello, M. L., Ganesan, D., & Fan, J. (2020). Human-centered wearable design for just-in-time situational awareness in the Deaf and Hard-of-Hearing community. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 4(2), 1–26. https://doi.org/10.1145/3397327
