Modelling of basic Indonesian Sign Language translator based on Raspberry Pi technology

Article language: English

For citation:
Fadlilah U., Prasetyo R.A.R., Mahamad A.K., Handaga B., Saon S., Sudarmilah E. Modelling of basic Indonesian Sign Language translator based on Raspberry Pi technology. Scientific and Technical Journal of Information Technologies, Mechanics and Optics, 2022, vol. 22, no. 3, pp. 574–584. doi: 10.17586/2226-1494-2022-22-3-574-584

Deaf people have hearing loss ranging from mild to profound and have difficulty processing language information with or without hearing aids. Deaf people who do not use hearing aids rely on sign language in everyday conversation. At the same time, hearing people generally do not know sign language, which makes communication between the two groups difficult. Indonesia has two sign languages: SIBI (Indonesian Sign Language System) and BISINDO (Indonesian Sign Language). To support communication between deaf and hearing people, we developed a model using the one-handed SIBI method as an example and then extended it to the one-handed and two-handed BISINDO. The main function of the system is the recognition of basic letters, words, sentences, and numbers using a Raspberry Pi single-board computer and a camera that detects sign-language gestures. A dedicated program translates the captured images into text on the monitor screen. The method combines image processing and machine learning, implemented in the Python programming language with Convolutional Neural Network (CNN) techniques. The device prototype issues a warning to repeat the sign if translation fails and discards a translation that does not match the database. The prototype requires further development to make it more flexible: recognizing dynamic movements and facial expressions, and translating words not yet included in the database. Databases beyond SIBI should also be added, such as BISINDO or sign languages from other regions and countries.
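The pipeline described above — camera frames classified into letters by a convolutional network — can be sketched in miniature with plain NumPy. This is a toy illustration of the CNN building blocks (convolution, ReLU, max-pooling, softmax), not the authors' model: the random 28×28 "hand image", the Sobel-like kernel, and the 26-letter output layer are placeholder assumptions.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution on a single-channel image (the core CNN operation)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by taking the max over non-overlapping size×size windows."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy forward pass: a 28x28 "hand image", one 3x3 edge-detecting kernel,
# then a dense layer scoring 26 static-letter classes (A-Z).
rng = np.random.default_rng(0)
image = rng.random((28, 28))
kernel = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], float)  # Sobel-like

features = max_pool(relu(conv2d(image, kernel)))          # 13x13 feature map
weights = rng.standard_normal((26, features.size)) * 0.01  # untrained dense layer
scores = softmax(weights @ features.flatten())
predicted_letter = chr(ord('A') + int(scores.argmax()))
print(predicted_letter)
```

In the real system a trained network replaces the random weights, and frames arrive from the Raspberry Pi camera (or a DroidCam/webcam feed) instead of random pixels; the structure of the forward pass is the same.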

Keywords: CNN, deaf people, droidcam, image processing, machine learning, Python, Raspberry Pi, SIBI, sign language, smartphone camera, webcam

Acknowledgments. This work was supported by research grant TIER 1 (H917). The researchers thank all those who contributed to the preparation of this work, especially the Electrical and Electronics Engineering Department of Universitas Muhammadiyah Surakarta (UMS), the Electrical Engineering Department of Universiti Tun Hussein Onn Malaysia (UTHM), and the Research Management Centre, which contributed to the implementation and testing of this study.


This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License