Hand Gesture to Control Virtual Keyboard using Neural Network

Arrya Anandika (1), Muhammad Ilhamdi Rusydi (2), Pepi Putri Utami (3), Rizka Hadelina (4), Minoru Sasaki (5)
(1) Universitas Andalas
(2) Universitas Andalas
(3) Universitas Andalas
(4) Universitas Andalas
(5) Gifu University, Yanagido
How to cite:
Anandika, A., Rusydi, M. I., Utami, P. P., Hadelina, R., & Sasaki, M. (2023). Hand Gesture to Control Virtual Keyboard using Neural Network. JITCE (Journal of Information Technology and Computer Engineering), 7(01), 40–48. https://doi.org/10.25077/jitce.7.01.40-48.2023

Disability is a physical or mental condition that can inhibit a person's normal daily activities. One such disability is speech impairment combined with the absence of fingers. People with this disability face obstacles in communicating with those around them, both verbally and in writing. Communication aids for people without fingers continue to be developed; one of them is a virtual keyboard controlled through a Leap Motion sensor. Hand gestures are captured by the Leap Motion sensor to obtain the orientation of the hand in the form of pitch, yaw, and roll angles. These orientation values are classified into normal, right, left, up, down, and rotating gestures to control the virtual keyboard. A total of 5400 samples were used for gesture recognition in this study, consisting of 3780 training samples and 1620 test samples. Testing with an Artificial Neural Network achieved an accuracy of 98.82%. The study also evaluated the virtual keyboard directly by having 15 respondents each type 20 kinds of characters three times. On average, respondents needed 5.45 seconds to type one character.
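As a rough illustration of the recognition pipeline described in the abstract, the sketch below trains a small feed-forward neural network on (pitch, yaw, roll) features and classifies them into the six gesture classes. It is not the authors' implementation: the data are synthetic placeholders standing in for the Leap Motion recordings, and the network size, training settings, and the use of scikit-learn's MLPClassifier are assumptions made only so the example runs end to end. Only the 3780/1620 train/test split and the six gesture labels are taken from the abstract.

# Illustrative sketch only (not the authors' code): a small feed-forward
# neural network that maps Leap Motion hand-orientation features
# (pitch, yaw, roll) to the six gesture classes used for the virtual keyboard.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

GESTURES = ["normal", "right", "left", "up", "down", "rotate"]

# Placeholder data: the study recorded 5400 (pitch, yaw, roll) samples with
# the Leap Motion sensor; here random angles are synthesized purely so the
# sketch runs end to end (accuracy on this fake data is meaningless).
rng = np.random.default_rng(0)
X = rng.uniform(-90.0, 90.0, size=(5400, 3))      # pitch, yaw, roll in degrees
y = rng.integers(0, len(GESTURES), size=5400)     # gesture label per sample

# 70/30 split matching the paper's 3780 training / 1620 test samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=3780, test_size=1620, random_state=0)

# Hypothetical network size and training settings; the abstract does not
# specify the ANN architecture.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
print("example prediction:", GESTURES[clf.predict([[15.0, -40.0, 5.0]])[0]])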


1. License


The non-commercial use of the article will be governed by the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0) as currently displayed by Creative Commons.

2. Author(s)’ Warranties

The author(s) warrant that the article is original, was written by the stated author(s), has not been published before, contains no unlawful statements, does not infringe the rights of others, is subject to copyright vested exclusively in the author(s) and free of any third-party rights, and that any necessary permissions to quote from other sources have been obtained by the author(s).

3. User Rights

JITCE adopts the spirit of open access and open science, disseminating published articles as freely as possible under the Creative Commons license. JITCE permits users to copy, distribute, display, and perform the work for non-commercial purposes only. Users must also attribute the authors and JITCE when distributing works from the journal.

4. Rights of Authors

Authors retain the following rights:

  • Copyright and other proprietary rights relating to the article, such as patent rights,
  • the right to use the substance of the article in future own works, including lectures and books,
  • the right to reproduce the article for own purposes,
  • the right to self-archive the article, and
  • the right to enter into separate, additional contractual arrangements for the non-exclusive distribution of the article's published version (e.g., posting it to an institutional repository or publishing it in a book), with an acknowledgment of its initial publication in this journal (Journal of Information Technology and Computer Engineering).

5. Co-Authorship

If the article was jointly prepared with other authors, then upon submitting the article the author agrees to this form, warrants that he/she has been authorized by all co-authors to agree on their behalf, and agrees to inform his/her co-authors. JITCE will be held free of any disputes that may arise regarding this issue.

7. Royalties

By submitting the article, the author(s) agree that no fees or royalties are payable by JITCE.

 

8. Miscellaneous

JITCE will publish the article (or have it published) in the journal if the article’s editorial process is successfully completed and JITCE or its sublicensee has become obligated to have the article published. JITCE may adjust the article to a style of punctuation, spelling, capitalization, referencing and usage that it deems appropriate. The author acknowledges that the article may be published so that it will be publicly accessible and such access will be free of charge for the readers. 
