A Linguistic Communication Interpretation Wearable Device for Deaf and Mute Users


Adil Rehman
Abdulhadi Shoufan

Keywords

American Linguistic Communication, Metacarpophalangeal joints, Arduino Uno, Flex sensors, Distal and proximal interphalangeal joints, Sign to Speech App

Abstract

There is a segment of society that does not have access to today's sophisticated acoustic technology; for these people, gesture-based sign language, expressed with the hands, shoulders, or eyes, is a vital tool for making themselves heard. American Linguistic Communication (ALC) is the most widely used sign language in the world, although sign languages vary slightly from nation to nation. Deaf and mute people can communicate effectively by conveying their message through hand gestures. The wearable smart glove developed in this study translates ALC gestures into the corresponding alphabets and words. It uses a glove fitted with flex sensors on the distal and proximal interphalangeal joints and the metacarpophalangeal joints of the fingers to detect finger bending. The complete system is divided into three units: a wearable hand-glove unit whose flex sensors record the ALC gestures made by the user, a processing unit in charge of acquiring the sensor data, and a final unit that uses a machine learning classifier to identify the corresponding alphabet. The smartphone is connected to the processing unit through a wired channel and receives the recognized alphabets as text, which is displayed by the mobile "Sign to Speech App". Its user-friendly design, low cost, and availability on mobile platforms give the system an edge over traditional sign language interpretation techniques.
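As a concrete illustration of the pipeline described above, the minimal Arduino sketch below reads a bank of flex sensors, matches the reading vector against stored per-letter templates with a simple nearest-neighbour rule, and sends the recognized letter over the serial link to the companion app. This is only a sketch under stated assumptions: the pin assignments, the use of five sensors, the template values, and the nearest-neighbour classifier are illustrative and are not taken from the paper, whose sensor layout and machine learning classifier may differ.

```cpp
// Hypothetical sketch for an Arduino Uno based glove: read five flex sensors,
// classify the reading with a nearest-neighbour rule against per-letter
// templates, and send the recognized letter over the wired serial channel.
// Pin mapping, class set, and template values are illustrative assumptions.

const int NUM_SENSORS = 5;
const int sensorPins[NUM_SENSORS] = {A0, A1, A2, A3, A4};

// Example calibration templates (raw ADC values) for a few ALC letters.
// Real templates would come from a per-user calibration step.
const int NUM_CLASSES = 3;
const char labels[NUM_CLASSES] = {'A', 'B', 'L'};
const int templates[NUM_CLASSES][NUM_SENSORS] = {
  {720, 710, 705, 715, 400},   // 'A': four fingers bent, thumb extended
  {380, 390, 385, 395, 700},   // 'B': fingers extended, thumb folded
  {400, 720, 715, 710, 390},   // 'L': index finger and thumb extended
};

void setup() {
  Serial.begin(9600);          // wired serial link to the phone/host
}

// Squared Euclidean distance between a live reading and a stored template.
long sqDistance(const int reading[], const int tmpl[]) {
  long d = 0;
  for (int i = 0; i < NUM_SENSORS; i++) {
    long diff = (long)reading[i] - tmpl[i];
    d += diff * diff;
  }
  return d;
}

void loop() {
  int reading[NUM_SENSORS];
  for (int i = 0; i < NUM_SENSORS; i++) {
    reading[i] = analogRead(sensorPins[i]);   // 0..1023 per flex sensor
  }

  // Nearest-neighbour classification against the stored templates.
  int best = 0;
  long bestDist = sqDistance(reading, templates[0]);
  for (int c = 1; c < NUM_CLASSES; c++) {
    long d = sqDistance(reading, templates[c]);
    if (d < bestDist) {
      bestDist = d;
      best = c;
    }
  }

  Serial.println(labels[best]);  // one recognized letter per line
  delay(300);                    // simple debounce between gestures
}
```

On the phone side, a companion app such as the "Sign to Speech App" described above would read these one-character lines from the serial channel and display or speak them; how that app is implemented is not specified here.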

References

El-Din, Salma A. Essam, and Mohamed A. Abd El-Ghany. "Sign Language Interpreter System: An alternative system for machine learning." In 2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES), pp. 332-337. IEEE, 2020.

Lako, Majlinda, Konstantina M. Stankovic, and Miodrag Stojkovic. "Special Series: Stem Cells and Hearing Loss." Stem Cells 39, no. 7 (2021): 835-837.

Yudhana, Anton, J. Rahmawan, and C. U. P. Negara. "Flex sensors and MPU6050 sensors responses on smart glove for sign language translation." In IOP Conference Series: Materials Science and Engineering, vol. 403, no. 1, p. 012032. IOP Publishing, 2018.

Newport, Elissa L., and Richard P. Meier. "The acquisition of American sign language." In The crosslinguistic study of language acquisition, pp. 881-938. Psychology Press, 2017.

Sprenger, B. M. "Lifeprint." Accessed February 27, 2005. https://www.lifeprint.com/asl101/pages-layout/clerc-laurent2.htm.

Wen, Feng, Zixuan Zhang, Tianyiyi He, and Chengkuo Lee. "AI enabled sign language recognition and VR space bidirectional communication using triboelectric smart glove." Nature Communications 12, no. 1 (2021): 1-13.

Schlenker, Philippe. "Sign language semantics: Problems and prospects." Theoretical Linguistics 44, no. 3-4 (2018): 295-353.

Rautaray, Siddharth S. "Real time hand gesture recognition system for dynamic applications." International Journal of UbiComp (IJU) 3, no. 1 (2012).

Arif, Arslan, Syed Tahir Hussain Rizvi, Iqra Jawaid, Muhammad Adam Waleed, and Muhammad Raheel Shakeel. "Techno-talk: An American sign language (ASL) translator." In 2016 International Conference on Control, Decision and Information Technologies (CoDIT), pp. 665-670. IEEE, 2016.

Mohandes, Mohamed, Mohamed Deriche, and Junzhao Liu. "Image-based and sensor-based approaches to Arabic sign language recognition." IEEE Transactions on Human-Machine Systems 44, no. 4 (2014): 551-557.

O’Connor, Timothy F., Matthew E. Fach, Rachel Miller, Samuel E. Root, Patrick P. Mercier, and Darren J. Lipomi. "The Language of Glove: Wireless gesture decoder with low-power and stretchable hybrid electronics." PLoS ONE 12, no. 7 (2017): e0179766.

Liang, Rung-Huei, and Ming Ouhyoung. "A real-time continuous gesture recognition system for sign language." In Proceedings Third IEEE International Conference on Automatic Face and Gesture Recognition, pp. 558-567. IEEE, 1998.

Praveen, Nikhita, Naveen Karanth, and M. S. Megha. "Sign language interpreter using a smart glove." In 2014 International Conference on Advances in Electronics Computers and Communications, pp. 1-5. IEEE, 2014.

Starner, Thad, Joshua Weaver, and Alex Pentland. "Real-time American sign language recognition using desk and wearable computer based video." IEEE Transactions on Pattern Analysis and Machine Intelligence 20, no. 12 (1998): 1371-1375.

Lee, Boon Giin, and Su Min Lee. "Smart wearable hand device for sign language interpretation system with sensors fusion." IEEE Sensors Journal 18, no. 3 (2017): 1224-1232.