UAV With the Ability to Control with Sign Language and Hand by Image Processing


Hediyeh Hojaji, Alireza Delisnav, Mohammad Hossein Ghafouri Moghaddam, Fariba Ghorbani, Shadi Shafaghi, Masoud Shafaghi

Abstract

Automatic recognition of sign language from hand-gesture images is crucial for enhancing human-robot interaction, especially in critical scenarios such as rescue operations. In this study, we used a DJI Tello drone equipped with machine vision capabilities to recognize and classify sign-language gestures accurately. In our experimental setup, the drone, integrated with radio control systems and machine vision techniques, navigated simulated disaster environments and interacted with human subjects through sign language. Data collection involved capturing hand gestures under a range of environmental conditions to train and validate our recognition algorithms, which combine YOLOv5 with Python libraries including OpenCV. This setup enabled precise hand and body detection, allowing the drone to navigate and interact effectively. We assessed the system's performance by its ability to recognize gestures against both controlled and complex, cluttered backgrounds. Additionally, we developed robust, damage-resistant shielding to safeguard the drone's integrity against debris. Our drone fleet also established a resilient Wi-Fi communication network, ensuring uninterrupted data transmission even under connectivity disruptions. These findings underscore the potential of AI-driven drones to engage in natural conversational interaction with humans, providing vital information to support decision-making during emergencies. In conclusion, our approach promises to improve the efficacy of rescue operations by enabling rapid and accurate communication of critical information to rescue teams.
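The pipeline the abstract describes, a gesture detector whose per-frame output drives the drone's flight commands, can be sketched in plain Python. The gesture class names and the detector-output shape below are illustrative assumptions (YOLOv5 actually returns bounding boxes with class labels and confidences), while the command strings follow the public Tello SDK vocabulary ("up 20", "land", "stop"); this is a minimal sketch of the gesture-to-command mapping, not the authors' exact implementation.

```python
# Hypothetical mapping from detected gesture classes to Tello SDK command
# strings. The gesture names are assumptions for illustration.
GESTURE_TO_COMMAND = {
    "palm_up": "up 20",
    "palm_down": "down 20",
    "fist": "land",
    "point_left": "left 20",
    "point_right": "right 20",
}

def best_detection(detections, min_conf=0.5):
    """Pick the highest-confidence gesture above a threshold.

    `detections` mimics simplified YOLOv5 output rows: (class_name, confidence).
    """
    candidates = [d for d in detections if d[1] >= min_conf]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d[1])[0]

def gesture_to_command(detections):
    """Translate one frame's detections into a Tello SDK command string."""
    gesture = best_detection(detections)
    if gesture is None:
        return "stop"  # hover in place when no confident gesture is seen
    return GESTURE_TO_COMMAND.get(gesture, "stop")

# Example frame: two overlapping detections; the fist wins on confidence.
frame = [("palm_up", 0.42), ("fist", 0.91)]
print(gesture_to_command(frame))  # -> land
```

Defaulting to the SDK's "stop" (hover) command whenever no gesture clears the confidence threshold is a conservative choice for cluttered rescue scenes, where acting on a low-confidence detection could move the drone unpredictably.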


Article Details

How to Cite
Hojaji, H., Delisnav, A., Ghafouri Moghaddam, M. H., Ghorbani, F., Shafaghi, S., & Shafaghi, M. (2024, September 30). UAV With the Ability to Control with Sign Language and Hand by Image Processing. JITCE (Journal of Information Technology and Computer Engineering), 8(2). Retrieved from http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/246
