JITCE (Journal of Information Technology and Computer Engineering) http://jitce.fti.unand.ac.id/index.php/JITCE <p><strong>JITCE (Journal of Information Technology and Computer Engineering)</strong>&nbsp;is a scholarly periodical. JITCE publishes research papers, technical papers, conceptual papers, and case study reports. This journal is published by the<a href="http://sk.fti.unand.ac.id"> Computer System Department</a> at&nbsp;<a href="http://www.unand.ac.id/" target="_blank" rel="noopener">Universitas Andalas</a>, Padang, West Sumatra, Indonesia.</p> <p>One volume of JITCE consists of two issues, published in March and September each year. Articles are written in Bahasa Indonesia (the Indonesian language) or English. Abstracts&nbsp;<strong>must be in English</strong>.</p> Universitas Andalas en-US JITCE (Journal of Information Technology and Computer Engineering) 2599-1663 <div id="copyright"> <p>Please find the rights and licenses in the Journal of Information Technology and Computer Engineering (JITCE).</p> </div> <p>1. License</p> <p style="text-align: center;"><a href="http://creativecommons.org/licenses/by-nc-sa/4.0/" target="_blank" rel="license noopener"><img style="border-width: 0;" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" alt="Creative Commons License"></a></p> <p>&nbsp;</p> <p>The non-commercial use of the article will be governed by the Creative Commons Attribution license as currently displayed on&nbsp;<a href="http://creativecommons.org/licenses/by-nc-sa/4.0/" rel="license">Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License</a>.&nbsp;</p> <p>2. 
Author(s)’ Warranties</p> <p>The author(s) warrant that the article is original, written by the stated author(s), has not been published before, contains no unlawful statements, does not infringe the rights of others, is subject to copyright vested exclusively in the author and free of any third-party rights, and that any necessary permissions to quote from other sources have been obtained by the author(s).</p> <p>3. User Rights</p> <p>JITCE adopts the spirit of open access and open science, disseminating published articles as freely as possible under the Creative Commons license. JITCE permits users to copy, distribute, display, and perform the work for non-commercial purposes only. Users must also attribute the authors and JITCE when distributing works published in the journal.</p> <p>4. Rights of Authors</p> <p>Authors retain the following rights:</p> <ul> <li>copyright, and other proprietary rights relating to the article, such as patent rights,</li> <li>the right to use the substance of the article in the author's own future works, including lectures and books,</li> <li>the right to reproduce the article for the author's own purposes,</li> <li>the right to self-archive the article,</li> <li>the right to enter into separate, additional contractual arrangements for the non-exclusive distribution of the article's published version (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal (Journal of Information Technology and Computer Engineering).</li> </ul> <p>5. Co-Authorship</p> <p>If the article was jointly prepared with other authors, then upon submitting the article the author agrees to this form and warrants that he/she has been authorized by all co-authors to act on their behalf, and agrees to inform his/her co-authors. JITCE will be held free of any disputes that may arise regarding this issue.</p> <p>7. 
Royalties</p> <p>By submitting the article, the authors agree that no royalties are payable to them by JITCE.</p> <p>&nbsp;</p> <p>8. Miscellaneous</p> <p>JITCE will publish the article (or have it published) in the journal once the article’s editorial process is successfully completed and JITCE or its sublicensee has become obligated to have the article published. JITCE may&nbsp;adjust the article to a style of punctuation, spelling, capitalization, referencing, and usage that it deems appropriate. The author acknowledges that the article may be published so that it will be publicly accessible, and such access will be free of charge for readers.&nbsp;</p> Control System Strategy for Ring Thrower Robot Based on PID-CSA for ABU Robocon 2023 http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/238 <p><em>A thrower robot must maintain system stability while carrying out its task: throwing rings into zones at different distances and heights requires accuracy and stability so that each throw reaches its target properly. Maintaining the stability of the thrower is important because throwing a ring is a demanding physical task. However, disturbances from external sources can affect accuracy and reduce the robot's performance, so the system needs stable accuracy despite interference. A control system is used to maintain acceleration and elevation during the throw so that the ring reaches the specified target. The implemented system uses Proportional-Integral-Derivative (PID) control tuned with the Cuckoo Search Algorithm (CSA). The function of the PID controller is to hold a constant position at a given target, while CSA is used to simplify PID tuning when parameters are modified. 
Therefore, the PID-CSA combination is applied to this system to produce a controller that maintains stability and reduces the disturbances affecting the ring-thrower robot. The results show that the PID-CSA method achieves better stability than PID-TE, reducing the error to a distance error of up to 0.68% and an angle error of up to 2.39%.</em></p> Rizky Andhika Akbar Aris Budiyarto Ridwan Ridwan ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 1 Nudibranch Suborders Classification based on Densely Connected Convolutional Networks http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/201 <p>Nudibranchs, often called sea slugs, are a group of soft-bodied marine gastropod mollusks that shed their shells after their larval stage. Because their body structure is very similar from one suborder to another, it is sometimes hard to tell the suborder of a nudibranch. In this work, we build an image classification model for determining the suborder of a nudibranch using the deep learning algorithms DenseNet and EfficientNet. The experiments are conducted in the Google Colaboratory environment. For DenseNet, we use 121, 169, and 201 layers; for EfficientNet, we use only the baseline algorithm. The dataset for this research was randomly collected from marine fauna forums on the internet. DenseNet with 201 layers shows better generalization than the other classifiers (the accuracies of DenseNet-121, DenseNet-169, DenseNet-201, and baseline EfficientNet are 53%, 41%, 73%, and 47%, respectively). The research produces a decent system for classifying the suborder of a nudibranch. 
The use of image recognition or background-blurring systems in future research could improve the system's accuracy.</p> Timothy Christyan Safitri Yuliana Utama Bagus Tri Yulianto Darmawan Faisal Dharma Adhinata ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 1 Deep Learning-Based Dzongkha Handwritten Digit Classification http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/202 <p>In computer vision applications, pattern recognition is one of the important fields of artificial intelligence. With advances in deep learning technology, many machine learning algorithms have been developed to tackle the problem of pattern recognition. The purpose of this research is to create the first-ever Dzongkha handwritten digit dataset and to develop a model to classify the digits. In this study, three sets of CONV → ReLU → POOL layers, followed by a fully connected layer, a dropout layer, and a softmax function, were used to train the digit classifier. In the dataset, each class (0-9) contains 1500 images, split into train, validation, and test sets in a 70:20:10 ratio. The model was trained on three image dimensions: 28 by 28, 32 by 32, and 64 by 64. Compared to 28 by 28 and 32 by 32, the 64&nbsp;by 64 dimension gave the highest train, validation, and test accuracies of 98.66%, 98.9%, and 99.13%, respectively. In the future, the number of digit samples needs to be increased and transfer learning used to train the model.</p> Yonten Jamtsho Pema Yangden Sonam Wangmo Nima Dema ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 1 1 7 10.25077/jitce.8.1.1-7.2024 The Evaluation of LSB Steganography on Image File Using 3DES and MD5 Key http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/177 <p>Information security is paramount for individuals, companies, and governments. Threats to data confidentiality are increasingly complex, demanding strong protection. 
Therefore, cryptography and steganography play pivotal roles. This study proposes the use of LSB (Least Significant Bit) steganography on image files employing the 3DES (Triple Data Encryption Standard) algorithm. The aim is to facilitate the secure transmission and reception of data, including confidential messages, within digital information media. The research methodology involves implementing 3DES + LSB on image files and 3DES with an MD5 hash key on .txt files. The results and discussion cover, among others, pseudocode, cryptographic testing, and steganographic testing. Based on the results of program analysis and testing, it can be concluded that the more message content is inserted into an image, the more pixel differences there are in the stego image. Likewise, the more colors in the cover image, the more pixel differences appear in the stego image. Only images with .png and .jpeg extensions can be used as stego objects. Testing from the fidelity aspect gives an average PSNR of 66.365, meaning that the stego-image quality is very good. Testing from the recovery aspect, on 4 tested stego images, showed that the messages can be extracted again. Testing from the robustness aspect using two attack techniques, including rotation, shows that the message cannot be extracted from the attacked image. Testing of computation time with 1-1000 characters shows that the average time required is about 0.798 seconds.</p> Ilham Firman Ashari Eko Dwi Nugroho Dodi Devrian Andrianto M. Asyroful Nur Maulana Yusuf Makruf Alkarkhi ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 1 8 18 10.25077/jitce.8.1.8-18.2024 Design of a Drowsiness Prevention Helmet with Vibration and IoT-Based Theft Detection Alarms http://jitce.fti.unand.ac.id/index.php/JITCE/article/view/204 <p>Ensuring safety while riding a motorbike is an imperative task. 
Currently, safety products such as helmets protect their users but lack the additional feature of issuing warnings. Consequently, a preemptive alert system was developed to offer timely notifications to drivers. The experimental setup uses a Max30100 sensor that is linked to a microcontroller and integrated into a helmet. The objective of this final project is to provide a timely alert to the rider, using the Max30100 sensor for pulse detection to determine whether the rider's pulse is normal. When a rider becomes tired and fatigued, the pulse rate commonly decreases. The Blynk application presents the detected pulse on the smartphone screen, while the buzzer on the helmet activates with vibration and sound once the pulse has diminished. Based on testing, the average pulse rate in quiet road conditions is 78.58 BPM, in busy road conditions 73.25 BPM, and in congested traffic conditions 73.5 BPM. The helmet theft detector uses a Sharp GP2Y0A21 sensor that can only detect object distances up to 10 cm.</p> Aditya Putra Perdana Prasetyo Harlis Richard Sitorus Rahmat Fadli Isnanto Adi Hermansyah ##submission.copyrightStatement## http://creativecommons.org/licenses/by-nc-sa/4.0 2024-03-31 2024-03-31 8 1 19 29 10.25077/jitce.8.1.19-29.2024