The Smart Agriculture based on Reconstructed Thermal Image

Ismail Ismail

Abstract

The use of thermal imaging to support precision agriculture has grown tremendously. Thermal images have many applications in agriculture, such as detecting crop water stress, monitoring free-range rabbits, and measuring crop canopy temperature. Thermal cameras have therefore become an important requirement for smart agriculture. However, thermal cameras are currently very expensive and are not easy to find on the market, which makes thermal imaging difficult to deploy in practice. To address this problem, the proposed method generates thermal images from visible images, so that thermal information relevant to agriculture, especially the fertility of leaves in paddy fields and crop water stress, can be monitored. The proposed method uses a deep learning architecture, a Generative Adversarial Network (GAN), to learn from a paired thermal and visible image dataset. The GAN model was trained on 150 training images and tested on a separate test set, and the resulting model is used to generate thermal images from visible images. The results show that the reconstructed thermal images have high accuracy. The assessment uses the SSIM and PSNR metrics, whose values indicate high accuracy, and visual assessment confirms that the reconstructed thermal images are also highly precise. Finally, the reconstructed thermal images can be used for smart agriculture purposes.
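As a minimal illustration of the assessment described in the abstract, the sketch below computes SSIM and PSNR between a ground-truth thermal image and a GAN-reconstructed one. It assumes single-channel images normalized to [0, 1] and uses scikit-image; the function name is hypothetical and is not taken from the paper.

```python
# Minimal sketch of the SSIM/PSNR assessment named in the abstract.
# Assumption: both images are single-channel float arrays scaled to [0, 1].
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def assess_reconstruction(real_thermal: np.ndarray, generated_thermal: np.ndarray):
    """Compare a ground-truth thermal image with a reconstructed one."""
    ssim = structural_similarity(real_thermal, generated_thermal, data_range=1.0)
    psnr = peak_signal_noise_ratio(real_thermal, generated_thermal, data_range=1.0)
    return ssim, psnr
```

Higher SSIM (closer to 1) and higher PSNR indicate that the reconstructed thermal image is closer to the real one.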

Article Details

How to Cite
Ismail, I. (2022, March 31). The Smart Agriculture based on Reconstructed Thermal Image. JITCE (Journal of Information Technology and Computer Engineering), 6(01), 8-13. https://doi.org/10.25077/jitce.6.01.8-13.2022
Section
Articles
