ANN Models for Shoulder Pain Detection based on Human Facial Expression Covered by Mask

Rizka Hadelina, Muhammad Ilhamdi Rusydi, Mutia Firza, Oluwarotimi Williams Samuel

Abstract

Facial expressions are one way people communicate that they are in pain. However, coding facial movements to assess pain requires extensive training and is too time-consuming for clinical practice. In addition, during the COVID-19 pandemic it was difficult to read these expressions because much of the face was covered by a mask. Therefore, a system is needed that can detect pain from facial expressions while a person is wearing a mask. In this work, 41 facial landmark points were used to form 19 geometrical features. A total of 20,000 frames from 24 respondents were taken from the dataset as secondary data. From these data, training and testing were carried out using an Artificial Neural Network (ANN), varying the number of neurons in the hidden layer: 5, 10, 15, and 20. The best result was an accuracy of 86%, obtained with 20 neurons in the hidden layer.
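The sketch below illustrates the experimental setup described in the abstract: a single-hidden-layer ANN trained on 19 geometrical features per frame, with the hidden-layer size varied over 5, 10, 15, and 20 neurons. It is a minimal illustration only, assuming scikit-learn's MLPClassifier as the ANN implementation and a 70/30 train/test split; the paper's actual landmark extraction, feature computation, labels, and network configuration are not given in the abstract, so synthetic placeholder data stands in for the real dataset.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic placeholders: 20,000 frames x 19 geometrical features and
# binary pain / no-pain labels. The real features are computed from
# 41 facial landmark points, which is not reproduced here.
rng = np.random.default_rng(0)
X = rng.normal(size=(20000, 19))
y = rng.integers(0, 2, size=20000)

# Assumed 70/30 split; the paper's actual split is not stated in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Vary the number of neurons in the single hidden layer, as in the study.
for n_hidden in (5, 10, 15, 20):
    model = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                          max_iter=500, random_state=0)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"hidden neurons = {n_hidden:2d}, accuracy = {acc:.2f}")

With the real features and labels in place of the synthetic arrays, this loop reproduces the comparison reported in the abstract, where the 20-neuron hidden layer gave the best accuracy (86%).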


Article Details

How to Cite
Hadelina, R., Rusydi, M., Firza, M., & Samuel, O. (2023, March 31). ANN Models for Shoulder Pain Detection based on Human Facial Expression Covered by Mask. JITCE (Journal of Information Technology and Computer Engineering), 7(01), 49-55. https://doi.org/10.25077/jitce.7.01.49-55.2023
