ANN Models for Shoulder Pain Detection based on Human Facial Expression Covered by Mask
Abstract
Facial expressions are one way people communicate pain. However, coding facial movements to assess pain requires extensive training and is time-consuming in clinical practice. In addition, during the COVID-19 pandemic it was difficult to read these expressions because of masks covering the face. It is therefore necessary to develop a system that can detect pain from facial expressions while a person is wearing a mask. In this work, 41 facial landmark points are used to form 19 geometric features. As secondary data, 20,000 frames from 24 respondents in the dataset were used. From these data, training and testing were carried out using an Artificial Neural Network (ANN) with a varying number of neurons in the hidden layer: 5, 10, 15, and 20. The highest accuracy obtained was 86%, with 20 neurons in the hidden layer.
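The architecture described above can be sketched as a single-hidden-layer feed-forward network mapping the 19 geometric features to a pain score. The sketch below is purely illustrative and is not the authors' implementation: the weight initialization, the sigmoid activation, and the single output unit are assumptions, and training is omitted entirely.

```python
import math
import random

def sigmoid(x):
    """Logistic activation, squashing any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def init_mlp(n_inputs, n_hidden, seed=0):
    """Randomly initialize a one-hidden-layer network with a single output unit."""
    rng = random.Random(seed)
    w_hidden = [[rng.uniform(-0.5, 0.5) for _ in range(n_inputs)]
                for _ in range(n_hidden)]
    w_out = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
    return w_hidden, w_out

def forward(features, w_hidden, w_out):
    """Forward pass: hidden activations, then a single pain-score output in (0, 1)."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, features))) for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# 19 geometric features (random placeholders for the real landmark-derived values)
rng = random.Random(1)
features = [rng.uniform(0.0, 1.0) for _ in range(19)]

# Hidden-layer sizes compared in the study: 5, 10, 15, and 20 neurons
for n_hidden in (5, 10, 15, 20):
    w_hidden, w_out = init_mlp(19, n_hidden)
    score = forward(features, w_hidden, w_out)
```

In the study, such a network would be trained on the labeled frames and the score thresholded into a pain / no-pain decision; only the forward pass is shown here.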
Article Details
Please find the rights and licenses in the Journal of Information Technology and Computer Engineering (JITCE).
1. License
The non-commercial use of the article will be governed by the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, as currently displayed on the Creative Commons website.
2. Author(s)’ Warranties
The author(s) warrants that the article is original, written by stated author(s), has not been published before, contains no unlawful statements, does not infringe the rights of others, is subject to copyright that is vested exclusively in the author and free of any third party rights, and that any necessary permissions to quote from other sources have been obtained by the author(s).
3. User Rights
JITCE adopts the spirit of open access and open science, disseminating published articles as freely as possible under the Creative Commons license. JITCE permits users to copy, distribute, display, and perform the work for non-commercial purposes only. Users must also attribute the authors and JITCE when distributing works from the journal.
4. Rights of Authors
Authors retain the following rights:
- Copyright, and other proprietary rights relating to the article, such as patent rights,
- the right to use the substance of the article in future own works, including lectures and books,
- the right to reproduce the article for own purposes,
- the right to self-archive the article,
- the right to enter into separate, additional contractual arrangements for the non-exclusive distribution of the article's published version (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal (Journal of Information Technology and Computer Engineering).
5. Co-Authorship
If the article was jointly prepared with other authors, then upon submitting the article the author agrees to this form, warrants that he/she has been authorized by all co-authors to act on their behalf, and agrees to inform his/her co-authors. JITCE will be free of any disputes that may arise regarding this issue.
6. Royalties
By submitting the article, the authors agree that no fees or royalties are payable to them by JITCE.
7. Miscellaneous
JITCE will publish the article (or have it published) in the journal if the article’s editorial process is successfully completed and JITCE or its sublicensee has become obligated to have the article published. JITCE may adjust the article to a style of punctuation, spelling, capitalization, referencing and usage that it deems appropriate. The author acknowledges that the article may be published so that it will be publicly accessible and such access will be free of charge for the readers.