Smart Agriculture Based on Reconstructed Thermal Images
Abstract
Thermal imaging now plays a substantial role in precision agriculture, with applications such as detecting crop water stress, monitoring free-range rabbits, and measuring crop canopy temperature. Thermal cameras have therefore become an urgent need for smart agriculture. However, thermal cameras remain expensive and are not easy to find on the market, which makes deploying thermal imaging difficult. To address this problem, the proposed method generates thermal images from visible images, so that thermal information relevant to agriculture, particularly the fertility of leaves in paddy fields and water stress, can be monitored. The method uses a deep learning architecture, a Generative Adversarial Network (GAN), to learn from a thermal and visible image dataset. The GAN model was trained on 150 training images and evaluated on a test set. The trained model is then used to generate thermal images from visible images. The results show that the reconstructed thermal images have high accuracy, assessed with the SSIM and PSNR metrics; both indexes indicate high accuracy, and visual assessment likewise shows high precision. The reconstructed thermal images can therefore be used for smart agriculture purposes.
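The abstract names SSIM and PSNR as the assessment metrics but does not give an implementation. Below is a minimal NumPy sketch of how the two scores could be computed for a reconstructed thermal image against a ground-truth one; `psnr` follows the standard definition, while `ssim_global` is a simplified single-window variant (production SSIM typically slides an 11x11 Gaussian window over the image). Both function names and the `max_val=255` assumption for 8-bit images are illustrative, not taken from the paper.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between reference and test images."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def ssim_global(ref, test, max_val=255.0):
    """Simplified SSIM using one global window over the whole image."""
    x = ref.astype(np.float64)
    y = test.astype(np.float64)
    c1 = (0.01 * max_val) ** 2  # stabilizing constants from the SSIM definition
    c2 = (0.03 * max_val) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    num = (2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)
    den = (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    return num / den
```

Identical images score SSIM = 1.0 and infinite PSNR; higher values of both metrics indicate a closer match between the reconstructed and real thermal image.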
Article Details
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Please find the rights and licenses in the Journal of Information Technology and Computer Engineering (JITCE).
1. License
Non-commercial use of the article is governed by the Creative Commons license as currently displayed: the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
2. Author(s)’ Warranties
The author(s) warrants that the article is original, written by stated author(s), has not been published before, contains no unlawful statements, does not infringe the rights of others, is subject to copyright that is vested exclusively in the author and free of any third party rights, and that any necessary permissions to quote from other sources have been obtained by the author(s).
3. User Rights
JITCE adopts the spirit of open access and open science, disseminating published articles as freely as possible under the Creative Commons license. JITCE permits users to copy, distribute, display, and perform the work for non-commercial purposes only. Users must also attribute the authors and JITCE when distributing works published in the journal.
4. Rights of Authors
Authors retain the following rights:
- Copyright, and other proprietary rights relating to the article, such as patent rights,
- the right to use the substance of the article in future own works, including lectures and books,
- the right to reproduce the article for own purposes,
- the right to self-archive the article.
- the right to enter into separate, additional contractual arrangements for the non-exclusive distribution of the article's published version (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal (Journal of Information Technology and Computer Engineering).
5. Co-Authorship
If the article was jointly prepared with other authors, then upon submitting the article the author agrees to this form, warrants that he/she has been authorized by all co-authors to act on their behalf, and agrees to inform his/her co-authors. JITCE shall be free of any disputes that may occur regarding this issue.
6. Royalties
By submitting the article, the authors agree that no fees are payable from JITCE.
7. Miscellaneous
JITCE will publish the article (or have it published) in the journal if the article’s editorial process is successfully completed and JITCE or its sublicensee has become obligated to have the article published. JITCE may adjust the article to a style of punctuation, spelling, capitalization, referencing and usage that it deems appropriate. The author acknowledges that the article may be published so that it will be publicly accessible and such access will be free of charge for the readers.