JITCE (Journal of Information Technology and Computer Engineering) http://jitce.fti.unand.ac.id/index.php/JITCE <div class="jumbotron text-white jumbotron-image container-full-bg" style="background-image: url('/public/site/images/redaksi/JITCEY3.png'); background-size: cover;"> <h2 class="mb-4">Journal of Information Technology and Computer Engineering</h2> <hr class="my-0 border-top-6 border-dark border-2"> <h5 class="mb-4 text-sm">JITCE (Journal of Information Technology and Computer Engineering) is a scholarly, peer-reviewed, open-access journal. JITCE focuses on Internet of Things (IoT) applications and sensor networks as its main scope.</h5> <br> <h5 class="mb-4"><strong>Publish With Us</strong></h5> <a class="btn btn-primary" href="http://jitce.fti.unand.ac.id/index.php/JITCE/about/submissions">SUBMIT AN ARTICLE</a></div> <div class="row"> <div class="col-sm-6 col-md-3"><a class="thumbnail" href="http://jitce.fti.unand.ac.id/index.php/JITCE/template"> <img src="/public/site/images/redaksi/Template.png"> </a></div> <div class="col-sm-6 col-md-3"><a class="thumbnail" href="http://jitce.fti.unand.ac.id/index.php/JITCE/fees"> <img src="/public/site/images/redaksi/COST.png"> </a></div> <div class="col-sm-6 col-md-3"><a class="thumbnail" href="http://jitce.fti.unand.ac.id/index.php/JITCE/scope"> <img src="/public/site/images/redaksi/Focus1.png"> </a></div> <div class="col-sm-6 col-md-3"><a class="thumbnail" href="http://jitce.fti.unand.ac.id/index.php/JITCE/guide"> <img src="/public/site/images/redaksi/Guide1.png"> </a></div> </div> <p><strong>JITCE (Journal of Information Technology and Computer Engineering)</strong> is a journal platform for the dissemination of cutting-edge research in the fields of information technology and computer engineering. Our journal serves as a conduit for high-quality research that drives innovation and addresses the critical challenges of our time.
Our robust peer review process ensures that each submission undergoes thorough evaluation by respected experts in the field, upholding the highest standards of academic integrity and scientific rigor. JITCE offers extensive reach within the global academic and professional communities, ensuring your research is widely accessible and influential. Benefit from the guidance and support of our distinguished editorial board, composed of leading scholars and practitioners dedicated to advancing the field of information technology and computer engineering. We invite you to contribute to JITCE, where your research will not only reach a global audience but also contribute to ongoing advances in technology and engineering. Your work has the potential to influence the future of these critical fields, and we are here to help you achieve that impact.</p> <p>&nbsp;</p> <div class="card mb-3"> <div class="row g-0"> <div class="col-md-4 thumbnail"><img class="img-fluid rounded-start mx-auto" style="object-fit: cover; width: 200px;" src="/public/site/images/redaksi/backjit.png"></div> <div class="col-md-8"> <div class="card-body"> <table class="table table-hover table-dark table-striped"> <tbody> <tr> <td>Journal Title</td> <td>JITCE (Journal of Information Technology and Computer Engineering)</td> </tr> <tr> <td>e-ISSN</td> <td><a href="https://portal.issn.org/resource/ISSN-L/2599-1663" target="_blank" rel="noopener">2599-1663</a></td> </tr> <tr> <td>Editor-in-Chief</td> <td>Dr. Eng. Rian Ferdian (Universitas Andalas)</td> </tr> <tr> <td>Organizer</td> <td>Computer Engineering Department, Universitas Andalas, Indonesia</td> </tr> <tr> <td>Frequency</td> <td>Semiannual</td> </tr> <tr> <td>Indexed By</td> <td><a href="https://doaj.org/toc/2599-1663" target="_blank" rel="noopener">DOAJ</a>, <a href="https://search.crossref.org/?q=+2599-1663&amp;from_ui=yes" target="_blank" rel="noopener">CROSSREF</a></td> </tr> </tbody> </table> </div> </div> </div> </div> Universitas Andalas 
en-US JITCE (Journal of Information Technology and Computer Engineering) 2599-1663 <div id="copyright"> <p>Please find the rights and licenses in the Journal of Information Technology and Computer Engineering (JITCE).</p> </div> <p>1. License</p> <p style="text-align: center;"><a href="http://creativecommons.org/licenses/by-nc-sa/4.0/" target="_blank" rel="license noopener"><img style="border-width: 0;" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" alt="Creative Commons License"></a></p> <p>&nbsp;</p> <p>The non-commercial use of the article is governed by the Creative Commons license as currently displayed at&nbsp;<a href="http://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.&nbsp;</p> <p>2. Author(s)’ Warranties</p> <p>The author(s) warrant that the article is original, was written by the stated author(s), has not been published before, contains no unlawful statements, does not infringe the rights of others, is subject to copyright vested exclusively in the author and free of any third-party rights, and that any necessary permissions to quote from other sources have been obtained by the author(s).</p> <p>3. User Rights</p> <p>JITCE adopts the spirit of open access and open science, making published articles as freely available as possible under the Creative Commons license. JITCE permits users to copy, distribute, display, and perform the work for non-commercial purposes only. Users must also attribute the authors and JITCE when distributing works published in the journal.</p> <p>4. 
Rights of Authors</p> <p>Authors retain the following rights:</p> <ul> <li>copyright, and other proprietary rights relating to the article, such as patent rights;</li> <li>the right to use the substance of the article in their own future works, including lectures and books;</li> <li>the right to reproduce the article for their own purposes;</li> <li>the right to self-archive the article; and</li> <li>the right to enter into separate, additional contractual arrangements for the non-exclusive distribution of the article's published version (e.g., posting it to an institutional repository or publishing it in a book), with an acknowledgment of its initial publication in this journal (Journal of Information Technology and Computer Engineering).</li> </ul> <p>5. Co-Authorship</p> <p>If the article was jointly prepared with other authors, then upon submitting the article the author agrees to this form, warrants that he/she has been authorized by all co-authors to act on their behalf, and agrees to inform his/her co-authors accordingly. JITCE shall be held free of any disputes that may arise regarding this issue.&nbsp;</p> <p>6. Royalties</p> <p>By submitting the article, the authors agree that no fees are payable to them by JITCE.</p> <p>&nbsp;</p> <p>7. Miscellaneous</p> <p>JITCE will publish the article (or have it published) in the journal if the article’s editorial process is successfully completed and JITCE or its sublicensee has become obligated to have the article published. JITCE may&nbsp;adjust the article to the style of punctuation, spelling, capitalization, referencing, and usage that it deems appropriate. The author acknowledges that the article may be published so that it will be publicly accessible, and such access will be free of charge for readers.&nbsp;</p> The Influence of Physical Tuning Technology on Voice Over LTE (VoLTE) http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/253 <p>Long-Term Evolution (LTE) technology represents the current evolution of cellular communication systems. 
Currently, LTE technology is used only for faster internet data services; unfortunately, phone calls still rely on second-generation (2G) or third-generation (3G) networks. One way to improve the quality of voice calls is to utilize Voice over LTE (VoLTE) technology, whose main advantage on fourth-generation (4G) networks is Internet Protocol (IP)-based voice quality. This study analyzes the performance of VoLTE networks. Based on the data collected, the Reference Signal Received Power (RSRP), at 37.73%, falls into the Good category; the Signal to Interference Noise Ratio (SINR), at 55.32%, falls into the Fair category; and Throughput, at 66.16%, falls into the Poor category. Delay received a score of 4, categorized as very good; jitter a score of 3, categorized as good; and packet loss a score of 4, categorized as very good. The optimization results using physical tuning show that the Reference Signal Received Power (RSRP) falls into the Good category at 52.8%, the Signal to Interference Noise Ratio (SINR) falls into the Good category at 70%, and Throughput falls into the Very Good category at 64.50%.</p> zurnawita zurnawita Dikky Chandra fajru ju zulya ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-09-30 2024-09-30 8 2 UAV With the Ability to Control with Sign Language and Hand by Image Processing http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/246 <p>Automatic recognition of sign language from hand gesture images is crucial for enhancing human-robot interaction, especially in critical scenarios such as rescue operations. In this study, we employed a DJI Tello drone equipped with advanced machine vision capabilities to recognize and classify sign language gestures accurately. 
We developed an experimental setup in which the drone, integrated with state-of-the-art radio control systems and machine vision techniques, navigated simulated disaster environments to interact with human subjects using sign language. Data collection involved capturing a variety of hand gestures under varying environmental conditions to train and validate our recognition algorithms, implemented with YOLOv5 alongside Python libraries such as OpenCV. This setup enabled precise hand and body detection, allowing the drone to navigate and interact effectively. We assessed the system's performance by its ability to accurately recognize gestures against both controlled and complex, cluttered backgrounds. Additionally, we developed robust, debris- and damage-resistant shielding mechanisms to safeguard the drone's integrity. Our drone fleet also established a resilient communication network via Wi-Fi, maintaining data transmission even amid connectivity disruptions. These findings underscore the potential of AI-driven drones to engage in natural conversational interactions with humans, thereby providing vital information to assist decision-making during emergencies. In conclusion, our approach promises to revolutionize the efficacy of rescue operations by facilitating rapid and accurate communication of critical information to rescue teams.</p> Hediyeh Hojaji Alireza Delisnav Mohammad Hossein Ghafouri Moghaddam Fariba Ghorbani Shadi Shafaghi Masoud Shafaghi ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-09-30 2024-09-30 8 2 Deep Learning-Based Dzongkha Handwritten Digit Classification http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/202 <p>Pattern recognition is one of the important fields of artificial intelligence in computer vision applications. With advances in deep learning technology, many machine learning algorithms have been developed to tackle the problem of pattern recognition. 
The purpose of this research is to create the first-ever Dzongkha handwritten digit dataset and to develop a model to classify the digits. In the study, three sets of CONV → ReLU → POOL layers, followed by a fully connected layer, a dropout layer, and a softmax function, were used to train the classifier. In the dataset, each class (0-9) contains 1500 images, split into training, validation, and test sets in a 70:20:10 ratio. The model was trained on three different image dimensions: 28 by 28, 32 by 32, and 64 by 64. Compared to the 28 by 28 and 32 by 32 dimensions, 64&nbsp;by 64 gave the highest training, validation, and test accuracies of 98.66%, 98.9%, and 99.13%, respectively. In future work, the number of digit samples needs to be increased and transfer learning should be used to train the model.</p> Yonten Jamtsho Pema Yangden Sonam Wangmo Nima Dema ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 2 1 7 10.25077/jitce.8.1.1-7.2024 The Evaluation of LSB Steganography on Image File Using 3DES and MD5 Key http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/177 <p>Information security is paramount for individuals, companies, and governments. Threats to data confidentiality are increasingly complex, demanding strong protection; cryptography and steganography therefore play pivotal roles. This study proposes the use of LSB (Least Significant Bit) steganography on image files employing the 3DES (Triple Data Encryption Standard) algorithm. The aim is to facilitate the secure transmission and reception of data, including confidential messages, within digital information media. The research methodology involves implementing 3DES + LSB on image files and an innovative 3DES + MD5 hash scheme for .txt files. The results and discussion cover, among other things, pseudocode, cryptographic testing, and steganography testing. 
Based on the results of program analysis and testing, it can be concluded that the more message data is inserted into the image, the more pixel differences appear in the stego image; likewise, the more colors the cover image contains, the more pixel differences appear in the stego image. Only images with .png and .jpeg extensions can serve as stego objects. Testing of the fidelity aspect gave an average PSNR of 66.365, meaning that the stego image quality is very good. Testing of the recovery aspect on 4 stego images showed that the messages can be extracted again. Testing of the robustness aspect using two attack techniques, one of them rotation, shows that the message cannot be extracted from the attacked image. Testing of computation time, over 1-1000 characters, shows that the average time required is about 0.798 seconds.</p> Ilham Firman Ashari Eko Dwi Nugroho Dodi Devrian Andrianto M. Asyroful Nur Maulana Yusuf Makruf Alkarkhi ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 2 8 18 10.25077/jitce.8.1.8-18.2024 Design of a Drowsiness Prevention Helmet with Vibration and IoT-Based Theft Detection Alarms http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/204 <p>Ensuring safety while riding a motorbike is an imperative task. Current safety products such as helmets can protect users but lack the additional feature of issuing warnings. Consequently, a preemptive alert system was developed to offer timely notifications to riders. The experimental setup involves a MAX30100 sensor linked to a microcontroller and integrated into a helmet. The objective of this final project is to offer a timely alert to the rider, using the MAX30100 sensor for pulse detection to determine whether the rider's pulse is normal. 
When the rider becomes tired or fatigued, the pulse rate commonly decreases. The Blynk application displays the detected pulse on the smartphone screen, while the buzzer on the helmet activates with vibrations and sounds once the pulse has diminished. Based on testing, the average pulse rate in quiet road conditions is 78.58 BPM, in busy road conditions it is 73.25 BPM, and in traffic conditions it is 73.5 BPM. The helmet theft detector uses a Sharp GP2Y0A21 sensor, which can only detect object distances up to 10 cm.</p> Aditya Putra Perdana Prasetyo Harlis Richard Sitorus Rahmat Fadli Isnanto Adi Hermansyah ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 2 19 29 10.25077/jitce.8.1.19-29.2024
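The pulse-threshold alert logic described in the helmet abstract above can be sketched as follows. This is a minimal illustration only: the 60 BPM threshold, the averaging window, and the function names are assumptions for the sketch, not values or code taken from the paper.

```python
# Sketch of a pulse-based drowsiness alert, as described in the helmet
# abstract above. The 60 BPM threshold and the averaging over a window of
# readings are illustrative assumptions, not values from the paper.

def average_bpm(readings):
    """Average a window of pulse-sensor readings (beats per minute)."""
    if not readings:
        raise ValueError("no readings")
    return sum(readings) / len(readings)

def should_alert(readings, threshold=60.0):
    """Return True when the averaged pulse falls below the drowsiness
    threshold, i.e. when the buzzer/vibration alert should fire."""
    return average_bpm(readings) < threshold

# Averages reported in the abstract stay above the assumed threshold:
print(should_alert([78.58]))      # quiet road -> False, no alert
print(should_alert([52, 55, 58])) # diminished pulse -> True, alert
```

In real firmware the readings would stream from the MAX30100 sensor, and averaging a short window before comparing helps avoid false alarms from single noisy samples.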