Review Article

Images and CNN applications in smart agriculture

Article: 2352386 | Received 15 Jan 2024, Accepted 03 May 2024, Published online: 14 May 2024

References

  • Adarsh, P., Rathi, P., & Kumar, M. (2020). YOLO v3-tiny: Object detection and recognition using one stage improved model. In 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), 687–29. https://doi.org/10.1109/ICACCS48705.2020.9074315
  • Agarwal, M., Gupta, S. K., & Biswas, K. (2020). Development of efficient CNN model for tomato crop disease identification. Sustainable Computing: Informatics and Systems, 28, 100407. https://doi.org/10.1016/j.suscom.2020.100407
  • Agarwal, M., Singh, A., Arjaria, S., Sinha, A., & Gupta, S. (2020). ToLeD: Tomato leaf disease detection using convolution neural network. Procedia Computer Science, 167, 293–301. https://doi.org/10.1016/j.procs.2020.03.225
  • Alencastre-Miranda, M., Davidson, J. R., Johnson, R. M., Waguespack, H., & Krebs, H. I. (2018). Robotics for sugarcane cultivation: Analysis of billet quality using computer vision. IEEE Robotics and Automation Letters, 3(4), 3828–3835. https://doi.org/10.1109/LRA.2018.2856999
  • Altaheri, H., Alsulaiman, M., Muhammad, G., Amin, S. U., Bencherif, M., & Mekhtiche, M. (2019). Date fruit dataset for intelligent harvesting. Data in Brief, 26, 104514. https://doi.org/10.1016/j.dib.2019.104514
  • Amara, J., Bouaziz, B., & Algergawy, A. (2017). A deep learning-based approach for banana leaf diseases classification. Datenbanksysteme für Business, Technologie und Web (BTW 2017) - Workshopband (pp. 79–88). https://dl.gi.de/items/13766147-8092-4f0a-b4e1-8a11a9046bdf
  • Amatya, S., Karkee, M., Gongal, A., Zhang, Q., & Whiting, M. D. (2016). Detection of cherry tree branches with full foliage in planar architecture for automated sweet-cherry harvesting. Biosystems Engineering, 146, 3–15. https://doi.org/10.1016/j.biosystemseng.2015.10.003
  • Randaci, A. (2021, March 10). Common plant diseases & disease control for organic gardens. Earth’s Ally. https://earthsally.com/disease-control/common-plant-diseases.html
  • An, J., Li, W., Li, M., Cui, S., & Yue, H. (2019). Identification and classification of maize drought stress using deep convolutional neural network. Symmetry, 11(2), 256. https://doi.org/10.3390/sym11020256
  • Atila, Ü., Uçar, M., Akyol, K., & Uçar, E. (2021). Plant leaf disease classification using EfficientNet deep learning model. Ecological Informatics, 61, 101182. https://doi.org/10.1016/j.ecoinf.2020.101182
  • Badrinarayanan, V., Kendall, A., & Cipolla, R. (2017). Segnet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(12), 2481–2495. https://doi.org/10.1109/TPAMI.2016.2644615
  • Bai, Y., Mei, J., Yuille, A. L., & Xie, C. (2021). Are transformers more robust than CNNs? Advances in Neural Information Processing Systems, 34, 26831–26843. https://proceedings.neurips.cc/paper/2021/hash/e19347e1c3ca0c0b97de5fb3b690855a-Abstract.html
  • Bannari, A., Morin, D., Bonn, F., & Huete, A. (1995). A review of vegetation indices. Remote Sensing Reviews, 13(1–2), 95–120. https://doi.org/10.1080/02757259509532298
  • Barbedo, J. G. A. (2019). Plant disease identification from individual lesions and spots using deep learning. Biosystems Engineering, 180, 96–107. https://doi.org/10.1016/j.biosystemseng.2019.02.002
  • Barbedo, J. G. A., Koenigkan, L. V., Halfeld-Vieira, B. A., Costa, R. V., Nechet, K. L., Godoy, C. V., Junior, M. L., Patricio, F. R. A., Talamini, V., Chitarra, L. G., Alves Santos Oliveira, S., Nakasone Ishida, A. K., Cunha Fernandes, J. M., Teixeira Santos, T., Rossi Cavalcanti, F., Terao, D., Angelotti, F., et al. (2018). Annotated plant pathology databases for image-based detection and recognition of diseases. IEEE Latin America Transactions, 16(6), 1749–1757. https://doi.org/10.1109/TLA.2018.8444395
  • Birth, G. S., & McVey, G. R. (1968). Measuring the color of growing turf with a reflectance spectrophotometer. Agronomy Journal, 60(6), 640–643. https://doi.org/10.2134/agronj1968.00021962006000060016x
  • Boegh, E., Soegaard, H., Broge, N., Hasager, C. B., Jensen, N. O., Schelde, K., & Thomsen, A. (2002). Airborne multispectral data for quantifying leaf area index, nitrogen concentration, and photosynthetic efficiency in agriculture. Remote Sensing of Environment, 81(2), 179–193. https://doi.org/10.1016/S0034-4257(01)00342-X
  • Bosilj, P., Aptoula, E., Duckett, T., & Cielniak, G. (2020). Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture. Journal of Field Robotics, 37(1), 7–19. https://doi.org/10.1002/rob.21869
  • Burkov, A. (2019). The hundred-page machine learning book. Andriy Burkov. https://books.google.fr/books?id=0jbxwQEACAAJ
  • Butte, S., Vakanski, A., Duellman, K., Wang, H., & Mirkouei, A. (2021). Potato crop stress identification in aerial images using deep learning-based object detection. Agronomy Journal, 113(5), 3991–4002. https://doi.org/10.1002/agj2.20841
  • Ceccato, P., Flasse, S., Tarantola, S., Jacquemoud, S., & Grégoire, J.-M. (2001). Detecting vegetation leaf water content using reflectance in the optical domain. Remote Sensing of Environment, 77(1), 22–33. https://doi.org/10.1016/S0034-4257(01)00191-2
  • Chandel, N. S., Chakraborty, S. K., Rajwade, Y. A., Dubey, K., Tiwari, M. K., & Jat, D. (2021). Identifying crop water stress using deep learning models. Neural Computing and Applications, 33(10), 5353–5367. https://doi.org/10.1007/s00521-020-05325-4
  • Chattopadhay, A., Sarkar, A., Howlader, P., & Balasubramanian, V. N. (2018). Grad-CAM++: Generalized gradient-based visual explanations for deep convolutional networks. 2018 IEEE Winter Conference on Applications of Computer Vision (WACV), 839–847. https://doi.org/10.1109/WACV.2018.00097
  • Chebrolu, N., Lottes, P., Schaefer, A., Winterhalter, W., Burgard, W., & Stachniss, C. (2017). Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. The International Journal of Robotics Research, 36(10), 1045–1052. https://doi.org/10.1177/0278364917720510
  • Chen, L., Li, S., Bai, Q., Yang, J., Jiang, S., & Miao, Y. (2021). Review of image classification algorithms based on convolutional neural networks. Remote Sensing, 13(22), 4712. https://doi.org/10.3390/rs13224712
  • Chicco, D., & Jurman, G. (2020). The advantages of the Matthews correlation coefficient (MCC) over F1 score and accuracy in binary classification evaluation. BMC Genomics, 21(1), 6. https://doi.org/10.1186/s12864-019-6413-7
  • Chiu, M. T., Xu, X., Wei, Y., Huang, Z., Schwing, A., Brunner, R., Khachatrian, H., Karapetyan, H., Dozier, I., Rose, G., et al. (2020). Agriculture-vision: A large aerial image database for agricultural pattern analysis. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 2828–2838). https://openaccess.thecvf.com/content_CVPR_2020/html/Chiu_Agriculture-Vision_A_Large_Aerial_Image_Database_for_Agricultural_Pattern_Analysis_CVPR_2020_paper.html
  • Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46. https://doi.org/10.1177/001316446002000104
  • Czymmek, V., Harders, L. O., Knoll, F. J., & Hussmann, S. (2019). Vision-based deep learning approach for real-time detection of weeds in organic farming. In 2019 IEEE International Instrumentation and Measurement Technology Conference (I2MTC) (pp. 1–5). https://doi.org/10.1109/I2MTC.2019.8826921
  • Deininger, L., Stimpel, B., Yuce, A., Abbasi-Sureshjani, S., Schönenberger, S., Ocampo, P., Korski, K., & Gaire, F. (2022). A comparative study between vision transformers and CNNs in digital pathology (arXiv:2206.00389). arXiv. http://arxiv.org/abs/2206.00389
  • Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). Imagenet: A large-scale hierarchical image database. In 2009 IEEE Conference on Computer Vision and Pattern Recognition (pp. 248–255). https://doi.org/10.1109/CVPR.2009.5206848
  • dos Santos Ferreira, A., Freitas, D. M., da Silva, G. G., Pistori, H., & Folhes, M. T. (2017). Weed detection in soybean crops using ConvNets. Computers and Electronics in Agriculture, 143, 314–324. https://doi.org/10.1016/j.compag.2017.10.027
  • Espejo-Garcia, B., Mylonas, N., Athanasakos, L., Fountas, S., & Vasilakoglou, I. (2020). Towards weeds identification assistance through transfer learning. Computers and Electronics in Agriculture, 171, 105306. https://doi.org/10.1016/j.compag.2020.105306
  • Fawakherji, M., Youssef, A., Bloisi, D., Pretto, A., & Nardi, D. (2019). Crop and weeds classification for precision agriculture using context-independent pixel-wise segmentation. In 2019 Third IEEE International Conference on Robotic Computing (IRC) (pp. 146–152). https://doi.org/10.1109/IRC.2019.00029
  • Food and Agriculture Organization of the United Nations. (2019). FAO – News article: New standards to curb the global spread of plant pests and diseases. https://www.fao.org/news/story/en/item/1187738/icode/
  • Gao, B.-C. (1996). NDWI—A normalized difference water index for remote sensing of vegetation liquid water from space. Remote Sensing of Environment, 58(3), 257–266. https://doi.org/10.1016/S0034-4257(96)00067-3
  • Gao, J., French, A. P., Pound, M. P., He, Y., Pridmore, T. P., & Pieters, J. G. (2020). Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields. Plant Methods, 16(1), 1–12. https://doi.org/10.1186/s13007-020-00570-z
  • Gao, W., Zhang, X., Yang, L., & Liu, H. (2010). An improved Sobel edge detection. 2010 3rd International Conference on Computer Science and Information Technology, 5, 67–71. https://doi.org/10.1109/ICCSIT.2010.5563693
  • Gatti, A., & Bertolini, A. (2013). Sentinel-2 products specification document. Retrieved February 23, 2015, from https://Earth.Esa.Int/Documents/247904/685211/Sentinel-2+Products+Specification+Document
  • Geetha, V., Punitha, A., Abarna, M., Akshaya, M., Illakiya, S., & Janani, A. (2020). An effective crop prediction using random forest algorithm. In 2020 International Conference on System, Computation, Automation and Networking (ICSCAN) (pp. 1–5). https://doi.org/10.1109/ICSCAN49426.2020.9262311
  • Giselsson, T. M., Dyrmann, M., Jørgensen, R. N., Jensen, P. K., & Midtiby, H. S. (2017). A public image database for benchmark of plant seedling classification algorithms. arXiv preprint arXiv:1711.05458. https://doi.org/10.48550/arXiv.1711.05458
  • Gitelson, A. A., Merzlyak, M. N., & Lichtenthaler, H. K. (1996). Detection of red edge position and chlorophyll content by reflectance measurements near 700 nm. Journal of Plant Physiology, 148(3), 501–508. https://doi.org/10.1016/S0176-1617(96)80285-9
  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT Press.
  • Grinblat, G. L., Uzal, L. C., Larese, M. G., & Granitto, P. M. (2016). Deep learning for plant identification using vein morphological patterns. Computers and Electronics in Agriculture, 127, 418–424. https://doi.org/10.1016/j.compag.2016.07.003
  • Gull, A., Lone, A. A., & Wani, N. U. I. (2019). Biotic and abiotic stresses in plants. In A. B. de Oliveira (Ed.), Abiotic and biotic stress in plants (pp. 1–19). IntechOpen. https://doi.org/10.5772/intechopen.77845
  • Hasan, A. M., Sohel, F., Diepeveen, D., Laga, H., & Jones, M. G. (2021). A survey of deep learning techniques for weed detection from images. Computers and Electronics in Agriculture, 184, 106067. https://doi.org/10.1016/j.compag.2021.106067
  • Haug, S., & Ostermann, J. (2015). A Crop/Weed Field Image Dataset for the Evaluation of Computer Vision Based Precision Agriculture Tasks. In: L. Agapito, M. Bronstein, & C. Rother (Eds.), Computer Vision - ECCV 2014 Workshops. ECCV 2014. Lecture Notes in Computer Science (pp. 105–116). Cham: Springer. https://doi.org/10.1007/978-3-319-16220-1_8
  • Hegazi, E. H., Samak, A. A., Yang, L., Huang, R., & Huang, J. (2023). Prediction of soil moisture content from sentinel-2 images using convolutional neural network (CNN). Agronomy, 13(3), 656. https://doi.org/10.3390/agronomy13030656
  • Helber, P., Bischke, B., Dengel, A., & Borth, D. (2019). Eurosat: A novel dataset and deep learning benchmark for land use and land cover classification. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 12(7), 2217–2226. https://doi.org/10.1109/JSTARS.2019.2918242
  • He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 770–778). https://doi.org/10.1109/CVPR.2016.90
  • Huang, M., & Chuang, T. (2020). A database of eight common tomato pest images. Mendeley Data.
  • Huang, G., Liu, Z., Van Der Maaten, L., & Weinberger, K. Q. (2017). Densely connected convolutional networks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 4700–4708). https://doi.org/10.1109/CVPR.2017.243
  • Huang, G.-B., Zhu, Q.-Y., & Siew, C.-K. (2006). Extreme learning machine: Theory and applications. Neurocomputing, 70(1–3), 489–501. https://doi.org/10.1016/j.neucom.2005.12.126
  • Hu, K., Coleman, G., Zeng, S., Wang, Z., & Walsh, M. (2020). Graph weeds net: A graph-based deep learning method for weed recognition. Computers and Electronics in Agriculture, 174, 105520. https://doi.org/10.1016/j.compag.2020.105520
  • Huete, A. R. (1988). A soil-adjusted vegetation index (SAVI). Remote Sensing of Environment, 25(3), 295–309. https://doi.org/10.1016/0034-4257(88)90106-X
  • Huete, A., Didan, K., Miura, T., Rodriguez, E. P., Gao, X., & Ferreira, L. G. (2002). Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sensing of Environment, 83(1), 195–213. https://doi.org/10.1016/S0034-4257(02)00096-2
  • Hughes, D., & Salathé, M. (2015). An open access repository of images on plant health to enable the development of mobile disease diagnostics. https://doi.org/10.48550/arXiv.1511.08060
  • Hu, K., Wang, Z., Coleman, G., Bender, A., Yao, T., Zeng, S., Song, D., Schumann, A., & Walsh, M. (2021). Deep learning techniques for in-crop weed identification: A review. arXiv Preprint arXiv:2103.14872. https://doi.org/10.48550/arXiv.2103.14872
  • Hu, G., Wu, H., Zhang, Y., & Wan, M. (2019). A low shot learning method for tea leaf’s disease identification. Computers and Electronics in Agriculture, 163, 104852. https://doi.org/10.1016/j.compag.2019.104852
  • Jackson, R. D., & Huete, A. R. (1991). Interpreting vegetation indices. Preventive Veterinary Medicine, 11(3–4), 185–200. https://doi.org/10.1016/S0167-5877(05)80004-2
  • Jayaraman, P. P., Yavari, A., Georgakopoulos, D., Morshed, A., & Zaslavsky, A. (2016). Internet of things platform for smart farming: Experiences and lessons learnt. Sensors, 16(11), 1884. https://doi.org/10.3390/s16111884
  • Jeong, J., Park, H., & Kwak, N. (2017). Enhancement of SSD by concatenating feature maps for object detection. arXiv Preprint arXiv:1705.09587. https://doi.org/10.48550/arXiv.1705.09587
  • Jha, K., Doshi, A., Patel, P., & Shah, M. (2019). A comprehensive review on automation in agriculture using artificial intelligence. Artificial Intelligence in Agriculture, 2, 1–12. https://doi.org/10.1016/j.aiia.2019.05.004
  • Jiang, P., Chen, Y., Liu, B., He, D., & Liang, C. (2019). Real-time detection of apple leaf diseases using deep learning approach based on improved convolutional neural networks. IEEE Access, 7, 59069–59080. https://doi.org/10.1109/ACCESS.2019.2914929
  • Jiang, H., Zhang, C., Qiao, Y., Zhang, Z., Zhang, W., & Song, C. (2020). CNN feature based graph convolutional network for weed and crop recognition in smart farming. Computers and Electronics in Agriculture, 174, 105450. https://doi.org/10.1016/j.compag.2020.105450
  • Johann, A. L., de Araújo, A. G., Delalibera, H. C., & Hirakawa, A. R. (2016). Soil moisture modeling based on stochastic behavior of forces on a no-till chisel opener. Computers and Electronics in Agriculture, 121, 420–428. https://doi.org/10.1016/j.compag.2015.12.020
  • Kaggle. (2019). New plant diseases dataset. https://www.kaggle.com/datasets/vipoooool/new-plant-diseases-dataset
  • Kamarudin, M. H., Ismail, Z. H., & Saidi, N. B. (2021). Deep learning sensor fusion in plant water stress assessment: A comprehensive review. Applied Sciences, 11(4), 1403. https://doi.org/10.3390/app11041403
  • Kamilaris, A., & Prenafeta-Boldú, F. X. (2018a). Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147, 70–90. https://doi.org/10.1016/j.compag.2018.02.016
  • Kamilaris, A., & Prenafeta-Boldú, F. X. (2018b). A review of the use of convolutional neural networks in agriculture. The Journal of Agricultural Science, 156(3), 312–322. https://doi.org/10.1017/S0021859618000436
  • Kelleher, J. D. (2019). Deep learning. MIT press.
  • Khaki, S., Pham, H., & Wang, L. (2020). Yieldnet: A convolutional neural network for simultaneous corn and soybean yield prediction based on remote sensing data. bioRxiv, 2020–12. https://doi.org/10.1101/2020.12.05.413203
  • Khan, S., Naseer, M., Hayat, M., Zamir, S. W., Khan, F. S., & Shah, M. (2022). Transformers in vision: A survey. ACM Computing Surveys, 54(10s), 1–41. https://doi.org/10.1145/3505244
  • Kiranyaz, S., Avci, O., Abdeljaber, O., Ince, T., Gabbouj, M., & Inman, D. J. (2021). 1D convolutional neural networks and applications: A survey. Mechanical Systems and Signal Processing, 151, 107398. https://doi.org/10.1016/j.ymssp.2020.107398
  • Kok, Z. H., Shariff, A. R. M., Alfatni, M. S. M., & Khairunniza-Bejo, S. (2021). Support vector machine in precision agriculture: A review. Computers and Electronics in Agriculture, 191, 106546. https://doi.org/10.1016/j.compag.2021.106546
  • Kordi, F., & Yousefi, H. (2022). Crop classification based on phenology information by using time series of optical and synthetic-aperture radar images. Remote Sensing Applications: Society & Environment, 27, 100812. https://doi.org/10.1016/j.rsase.2022.100812
  • Kounalakis, T., Triantafyllidis, G. A., & Nalpantidis, L. (2019). Deep learning-based visual recognition of rumex for robotic precision farming. Computers and Electronics in Agriculture, 165, 104973. https://doi.org/10.1016/j.compag.2019.104973
  • Koutsoukas, A., Monaghan, K. J., Li, X., & Huan, J. (2017). Deep-learning: Investigating deep neural networks hyper-parameters and comparison of performance to shallow methods for modeling bioactivity data. Journal of Cheminformatics, 9(1), 42. https://doi.org/10.1186/s13321-017-0226-y
  • Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2017). Imagenet classification with deep convolutional neural networks. Communications of the ACM, 60(6), 84–90. https://doi.org/10.1145/3065386
  • Kuhn, M., & Johnson, K. (2013). Measuring performance in regression models. Applied Predictive Modeling, 95–100. https://doi.org/10.1007/978-1-4614-6849-3_5
  • Kussul, N., Lavreniuk, M., Skakun, S., & Shelestov, A. (2017). Deep learning classification of land cover and crop types using remote sensing data. IEEE Geoscience and Remote Sensing Letters, 14(5), 778–782. https://doi.org/10.1109/LGRS.2017.2681128
  • Laben, C. A., & Brower, B. V. (2000). Process for enhancing the spatial resolution of multispectral imagery using pan-sharpening. Google Patents.
  • Lathuilière, S., Mesejo, P., Alameda-Pineda, X., & Horaud, R. (2020). A comprehensive analysis of deep regression. IEEE Transactions on Pattern Analysis and Machine Intelligence, 42(9), 2065–2081. https://doi.org/10.1109/TPAMI.2019.2910523
  • LeCun, Y., Bottou, L., Bengio, Y., & Haffner, P. (1998). Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11), 2278–2324. https://doi.org/10.1109/5.726791
  • Liakos, K. G., Busato, P., Moshou, D., Pearson, S., & Bochtis, D. (2018). Machine learning in agriculture: A review. Sensors, 18(8), 2674. https://doi.org/10.3390/s18082674
  • Li, W., Fu, H., Yu, L., & Cracknell, A. (2016). Deep learning based oil palm tree detection and counting for high-resolution remote sensing images. Remote Sensing, 9(1), 22. https://doi.org/10.3390/rs9010022
  • Linardatos, P., Papastefanopoulos, V., & Kotsiantis, S. (2021). Explainable AI: A review of machine learning interpretability methods. Entropy, 23(1), 18. https://doi.org/10.3390/e23010018
  • Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C.-Y., & Berg, A. C. (2016). Ssd: Single shot multibox detector. Computer Vision – ECCV 2016. ECCV 2016. Lecture Notes in Computer Science (pp. 21–37). Cham: Springer. https://doi.org/10.1007/978-3-319-46448-0_2
  • Liu, J., & Wang, X. (2021). Plant diseases and pests detection based on deep learning: A review. Plant Methods, 17(1), 1–18. https://doi.org/10.1186/s13007-021-00722-9
  • Liu, B., Zhang, Y., He, D., & Li, Y. (2017). Identification of apple leaf diseases based on deep convolutional neural networks. Symmetry, 10(1), 11. https://doi.org/10.3390/sym10010011
  • Li, X., Xiong, H., Li, X., Wu, X., Zhang, X., Liu, J., Bian, J., & Dou, D. (2022). Interpretable deep learning: Interpretation, interpretability, trustworthiness, and beyond. Knowledge and Information Systems, 64(12), 3197–3234. https://doi.org/10.1007/s10115-022-01756-8
  • Li, Z., Yang, W., Peng, S., & Liu, F. (2020). A survey of convolutional neural networks: Analysis, applications, and prospects (arXiv:2004.02806). arXiv. http://arxiv.org/abs/2004.02806
  • Li, N., Zhang, X., Zhang, C., Guo, H., Sun, Z., & Wu, X. (2019). Real-time crop recognition in transplanted fields with prominent weed growth: A visual-attention-based approach. IEEE Access, 7, 185310–185321. https://doi.org/10.1109/ACCESS.2019.2942158
  • Lu, Y., & Young, S. (2020). A survey of public datasets for computer vision tasks in precision agriculture. Computers and Electronics in Agriculture, 178, 105760. https://doi.org/10.1016/j.compag.2020.105760
  • Ma, X., Deng, X., Qi, L., Jiang, Y., Li, H., Wang, Y., Xing, X., & Zhang, J. (2019). Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. Public Library of Science ONE, 14(4), e0215676. https://doi.org/10.1371/journal.pone.0215676
  • Maione, C., Batista, B. L., Campiglia, A. D., Barbosa, F., & Barbosa, R. M. (2016). Classification of geographic origin of rice by data mining and inductively coupled plasma mass spectrometry. Computers and Electronics in Agriculture, 121, 101–107. https://doi.org/10.1016/j.compag.2015.11.009
  • Matsoukas, C., Haslum, J. F., Söderberg, M., & Smith, K. (2021). Is it time to replace CNNs with transformers for medical images? (arXiv:2108.09038). arXiv. http://arxiv.org/abs/2108.09038
  • Mehdizadeh, S., Behmanesh, J., & Khalili, K. (2017). Using MARS, SVM, GEP and empirical equations for estimation of monthly mean reference evapotranspiration. Computers and Electronics in Agriculture, 139, 103–114. https://doi.org/10.1016/j.compag.2017.05.002
  • Meshram, V., & Patil, K. (2022). FruitNet: Indian fruits image dataset with quality for machine learning applications. Data in Brief, 40, 107686. https://doi.org/10.1016/j.dib.2021.107686
  • Mignoni, M. E., Honorato, A., Kunst, R., Righi, R., & Massuquetti, A. (2022). Soybean images dataset for caterpillar and diabrotica speciosa pest detection and classification. Data in Brief, 40, 107756. https://doi.org/10.1016/j.dib.2021.107756
  • Milioto, A., Lottes, P., & Stachniss, C. (2018). Real-time semantic segmentation of crop and weed for precision agriculture robots leveraging background knowledge in CNNs. In 2018 IEEE International Conference on Robotics and Automation (ICRA) (pp. 2229–2235). https://doi.org/10.1109/ICRA.2018.8460962
  • Minaee, S., Boykov, Y., Porikli, F., Plaza, A., Kehtarnavaz, N., & Terzopoulos, D. (2020). Image segmentation using deep learning: A survey (arXiv:2001.05566). arXiv. http://arxiv.org/abs/2001.05566
  • Mirza, M., & Osindero, S. (2014). Conditional generative adversarial nets. arXiv Preprint arXiv:1411.1784. https://doi.org/10.48550/arXiv.1411.1784
  • Mlsna, P. A., & Rodríguez, J. J. (2009). Chapter 19—gradient and laplacian edge detection. In A. Bovik (Ed.), The essential Guide to image processing (pp. 495–524). Academic Press. https://doi.org/10.1016/B978-0-12-374457-9.00019-6
  • Moshou, D., Pantazi, X.-E., Kateris, D., & Gravalos, I. (2014). Water stress detection based on optical multisensor fusion with a least squares support vector machine classifier. Biosystems Engineering, 117, 15–22. https://doi.org/10.1016/j.biosystemseng.2013.07.008
  • Moutik, O., Sekkat, H., Tigani, S., Chehri, A., Saadane, R., Tchakoucht, T. A., & Paul, A. (2023). Convolutional neural networks or vision transformers: Who will win the race for action recognitions in visual data? Sensors, 23(2), 734. https://doi.org/10.3390/s23020734
  • Hossin, M., & Sulaiman, M. N. (2015). A review on evaluation metrics for data classification evaluations. International Journal of Data Mining & Knowledge Management Process, 5(2), 1–11. https://doi.org/10.5121/ijdkp.2015.5201
  • Nevavuori, P., Narra, N., & Lipping, T. (2019). Crop yield prediction with deep convolutional neural networks. Computers and Electronics in Agriculture, 163, 104859. https://doi.org/10.1016/j.compag.2019.104859
  • Nguyen, T. T., Hoang, T. D., Pham, M. T., Vu, T. T., Nguyen, T. H., Huynh, Q.-T., & Jo, J. (2020). Monitoring agriculture areas with satellite images and deep learning. Applied Soft Computing, 95, 106565. https://doi.org/10.1016/j.asoc.2020.106565
  • Noyan, M. A. (2022). Uncovering bias in the PlantVillage dataset. arXiv Preprint arXiv:2206.04374. https://doi.org/10.48550/arXiv.2206.04374
  • O’Shea, K., & Nash, R. (2015). An introduction to convolutional neural networks. arXiv preprint arXiv:1511.08458. https://doi.org/10.48550/arXiv.1511.08458
  • Pandian, J. A., & Gopal, G. (2019). Data for: Identification of plant leaf diseases using a 9-layer deep convolutional neural network. Computers & Electrical Engineering, 76, 323–338. https://doi.org/10.17632/tywbtsjrjv.1
  • Pantazi, X.-E., Moshou, D., & Bravo, C. (2016). Active learning system for weed species recognition based on hyperspectral sensing. Biosystems Engineering, 146, 193–202. https://doi.org/10.1016/j.biosystemseng.2016.01.014
  • Pegorini, V., Zen Karam, L., Pitta, C. S. R., Cardoso, R., Da Silva, J. C. C., Kalinowski, H. J., Ribeiro, R., Bertotti, F. L., & Assmann, T. S. (2015). In vivo pattern classification of ingestive behavior in ruminants using FBG sensors and machine learning. Sensors, 15(11). https://doi.org/10.3390/s151128456
  • Peterson, L. E. (2009). K-nearest neighbor. Scholarpedia, 4(2), 1883. https://doi.org/10.4249/scholarpedia.1883
  • Petrich, L., Lohrmann, G., Neumann, M., Martin, F., Frey, A., Stoll, A., & Schmidt, V. (2020). Detection of colchicum autumnale in drone images, using a machine-learning approach. Precision Agriculture, 21(6), 1291–1303. https://doi.org/10.1007/s11119-020-09721-7
  • Phiri, D., Simwanda, M., Salekin, S., Nyirenda, V. R., Murayama, Y., & Ranagalage, M. (2020). Sentinel-2 data for land cover/use mapping: A review. Remote Sensing, 12(14), 2291. https://doi.org/10.3390/rs12142291
  • Pinto, F., Torr, P. H. S., & Dokania, P. K. (2022). An impartial take to the CNN vs transformer robustness contest. In S. Avidan, G. Brostow, M. Cissé, G. M. Farinella, & T. Hassner (Eds.), Computer vision – ECCV 2022 (pp. 466–480). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-19778-9_27
  • Prajapati, H. B., Shah, J. P., & Dabhi, V. K. (2017). Detection and classification of rice plant diseases. Intelligent Decision Technologies, 11(3), 357–373. https://doi.org/10.3233/IDT-170301
  • Pratyush Reddy, K. S., Roopa, Y. M., Kovvada Rajeev, L. N., & Nandan, N. S. (2020). IoT based smart agriculture using machine learning. 2020 Second International Conference on Inventive Research in Computing Applications (ICIRCA), 130–134. https://doi.org/10.1109/ICIRCA48905.2020.9183373
  • Rahnemoonfar, M., & Sheppard, C. (2017). Deep count: Fruit counting based on deep simulated learning. Sensors, 17(4), 905. https://doi.org/10.3390/s17040905
  • Rajesh, B., Vardhan, M. V. S., & Sujihelen, L. (2020). Leaf disease detection and classification by decision tree. 2020 4th International Conference on Trends in Electronics and Informatics (ICOEI)(48184), 705–708. https://doi.org/10.1109/ICOEI48184.2020.9142988
  • Razfar, N., True, J., Bassiouny, R., Venkatesh, V., & Kashef, R. (2022). Weed detection in soybean crops using custom lightweight deep learning models. Journal of Agriculture and Food Research, 8, 100308. https://doi.org/10.1016/j.jafr.2022.100308
  • Redmon, J., & Farhadi, A. (2018). Yolov3: An incremental improvement. arXiv Preprint arXiv:1804.02767. https://doi.org/10.48550/arXiv.1804.02767
  • Ribera, J., Chen, Y., Boomsma, C., & Delp, E. J. (2017). Counting plants using deep learning. 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 1344–1348. https://doi.org/10.1109/GlobalSIP.2017.8309180
  • Roelofs, R., Shankar, V., Recht, B., Fridovich-Keil, S., Hardt, M., Miller, J., & Schmidt, L. (2019). A meta-analysis of overfitting in machine learning. Advances in Neural Information Processing Systems, 32. https://proceedings.neurips.cc/paper/2019/hash/ee39e503b6bedf0c98c388b7e8589aca-Abstract.html
  • Ronneberger, O., Fischer, P., & Brox, T. (2015). U-Net: Convolutional networks for biomedical image segmentation. In N. Navab, J. Hornegger, W.M. Wells, & A.F. Frangi (Eds.), Medical Image computing and Computer-Assisted Intervention – MICCAI 2015 (pp. 234–241). Springer International Publishing. https://doi.org/10.1007/978-3-319-24574-4_28
  • Rouse, J. W., Jr., Haas, R. H., Schell, J., & Deering, D. (1973). Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation (No. NASA-CR-132982).
  • Roy, D. P., Wulder, M. A., Loveland, T. R., Woodcock, C. E., Allen, R. G., Anderson, M. C., Helder, D., Irons, J. R., Johnson, D. M., Kennedy, R., Scambos, T. A., Schaaf, C. B., Schott, J. R., Sheng, Y., Vermote, E. F., Belward, A. S., Bindschadler, R., Cohen, W. B. … Wynne, R. H. (2014). Landsat-8: Science and product vision for terrestrial global change research. Remote Sensing of Environment, 145, 154–172. https://doi.org/10.1016/j.rse.2014.02.001
  • Ruckelshausen, A., Biber, P., Dorna, M., Gremmes, H., Klose, R., Linz, A., Rahe, F., Resch, R., Thiel, M., Trautz, D., et al. (2009). BoniRob – an autonomous field robot platform for individual plant phenotyping. Precision Agriculture, 9(841), 1.
  • Sa, I., Chen, Z., Popović, M., Khanna, R., Liebisch, F., Nieto, J., & Siegwart, R. (2018). weedNet: Dense semantic weed classification using multispectral images and MAV for smart farming. IEEE Robotics and Automation Letters, 3(1), 588–595. https://doi.org/10.1109/LRA.2017.2774979
  • Saleem, M. H., Potgieter, J., & Arif, K. M. (2019). Plant disease detection and classification by deep learning. Plants, 8(11), 468. https://doi.org/10.3390/plants8110468
  • Segaran, T. (2007). Programming collective intelligence: Building smart web 2.0 applications. O’Reilly Media. https://books.google.fr/books?id=fEsZ3Ey-Hq4C
  • Selvaraju, R. R., Cogswell, M., Das, A., Vedantam, R., Parikh, D., & Batra, D. (2017). Grad-CAM: Visual explanations from deep networks via gradient-based localization. Proceedings of the IEEE International Conference on Computer Vision (ICCV) (pp. 618–626). https://openaccess.thecvf.com/content_iccv_2017/html/Selvaraju_Grad-CAM_Visual_Explanations_ICCV_2017_paper.html
  • Serrano, L., Peñuelas, J., & Ustin, S. L. (2002). Remote sensing of nitrogen and lignin in Mediterranean vegetation from AVIRIS data: Decomposing biochemical from structural signals. Remote Sensing of Environment, 81(2–3), 355–364. https://doi.org/10.1016/S0034-4257(02)00011-1
  • Shrikumar, A., Greenside, P., & Kundaje, A. (2017). Learning important features through propagating activation differences. Proceedings of the 34th International Conference on Machine Learning, 3145–3153. https://proceedings.mlr.press/v70/shrikumar17a.html
  • Simonyan, K., Vedaldi, A., & Zisserman, A. (2014). Deep inside convolutional networks: Visualising image classification models and saliency maps. arXiv Preprint arXiv:1312.6034. https://doi.org/10.48550/arXiv.1312.6034
  • Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv Preprint arXiv:1409.1556. https://doi.org/10.48550/arXiv.1409.1556
  • Singh, D., Jain, N., Jain, P., Kayal, P., Kumawat, S., & Batra, N. (2020). PlantDoc: A dataset for visual plant disease detection. Proceedings of the 7th ACM IKDD CoDS and 25th COMAD, 249–253. https://doi.org/10.1145/3371158.3371196
  • Sishodia, R. P., Ray, R. L., & Singh, S. K. (2020). Applications of remote sensing in precision agriculture: A review. Remote Sensing, 12(19), 3136. https://doi.org/10.3390/rs12193136
  • Skovsen, S., Dyrmann, M., Mortensen, A. K., Laursen, M. S., Gislum, R., Eriksen, J., Farkhani, S., Karstoft, H., & Jorgensen, R. N. (2019, June). The GrassClover image dataset for semantic and hierarchical species understanding in agriculture. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops. https://doi.org/10.1109/CVPRW.2019.00325
  • Soltani, N., Dille, J. A., Burke, I. C., Everman, W. J., VanGessel, M. J., Davis, V. M., & Sikkema, P. H. (2016). Potential corn yield losses from weeds in North America. Weed Technology, 30(4), 979–984. https://doi.org/10.1614/WT-D-16-00046.1
  • Song, R., Zhang, Z., & Liu, H. (2017). Edge connection based canny edge detection algorithm. Pattern Recognition and Image Analysis, 27(4), 740–747. https://doi.org/10.1134/S1054661817040162
  • Sothearith, Y., Appiah, K. S., Mardani, H., Motobayashi, T., Yoko, S., Eang Hourt, K., Sugiyama, A., & Fujii, Y. (2021). Determination of the allelopathic potential of Cambodia’s medicinal plants using the dish pack method. Sustainability, 13(16), 9062. Article 16. https://doi.org/10.3390/su13169062
  • Sripada, R. P., Heiniger, R. W., White, J. G., & Meijer, A. D. (2006). Aerial color infrared photography for determining early in-season nitrogen requirements in corn. Agronomy Journal, 98(4), 968–977. https://doi.org/10.2134/agronj2005.0200
  • Srivastava, A. K., Safaei, N., Khaki, S., Lopez, G., Zeng, W., Ewert, F., Gaiser, T., & Rahimi, J. (2022). Winter wheat yield prediction using convolutional neural networks from environmental and phenological data. Scientific Reports, 12(1), 3215. https://doi.org/10.1038/s41598-022-06249-w
  • Subeesh, A., Bhole, S., Singh, K., Chandel, N. S., Rajwade, Y. A., Rao, K., Kumar, S., & Jat, D. (2022). Deep convolutional neural network models for weed detection in polyhouse grown bell peppers. Artificial Intelligence in Agriculture, 6, 47–54. https://doi.org/10.1016/j.aiia.2022.01.002
  • Sudars, K., Jasko, J., Namatevs, I., Ozola, L., & Badaukis, N. (2020). Dataset of annotated food crops and weed images for robotic computer vision control. Data in Brief, 31, 105833. https://doi.org/10.1016/j.dib.2020.105833
  • Suh, H. K., Ijsselmuiden, J., Hofstee, J. W., & van Henten, E. J. (2018). Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosystems Engineering, 174, 50–65. https://doi.org/10.1016/j.biosystemseng.2018.06.017
  • Sultana, N., Jahan, M., & Uddin, M. S. (2022). An extensive dataset for successful recognition of fresh and rotten fruits. Data in Brief, 44, 108552. https://doi.org/10.1016/j.dib.2022.108552
  • Sun, Z., Di, L., Fang, H., & Burgess, A. (2020). Deep learning classification for crop types in North Dakota. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 13, 2200–2213. https://doi.org/10.1109/JSTARS.2020.2990104
  • Sun, H., Liu, H., Ma, Y., & Xia, Q. (2021). Optical remote sensing indexes of soil moisture: Evaluation and improvement based on aircraft experiment observations. Remote Sensing, 13(22), 4638. https://doi.org/10.3390/rs13224638
  • Suresha, M., Shreekanth, K. N., & Thirumalesh, B. V. (2017). Recognition of diseases in paddy leaves using KNN classifier. 2017 2nd International Conference for Convergence in Technology (I2CT), 663–666. https://doi.org/10.1109/I2CT.2017.8226213
  • Szegedy, C., Ioffe, S., Vanhoucke, V., & Alemi, A. A. (2017). Inception-v4, Inception-ResNet and the impact of residual connections on learning. Thirty-First AAAI Conference on Artificial Intelligence, 31(1). https://doi.org/10.1609/aaai.v31i1.11231
  • Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V., & Rabinovich, A. (2015). Going deeper with convolutions. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1–9. https://doi.org/10.1109/CVPR.2015.7298594
  • Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., & Wojna, Z. (2016). Rethinking the inception architecture for computer vision. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2818–2826. https://doi.org/10.1109/CVPR.2016.308
  • Taha, A. A., & Hanbury, A. (2015). Metrics for evaluating 3D medical image segmentation: Analysis, selection, and tool. BMC Medical Imaging, 15(1), 1–28. https://doi.org/10.1186/s12880-015-0068-x
  • Thompson, C. N., Guo, W., Sharma, B., & Ritchie, G. L. (2019). Using normalized difference red edge index to assess maturity in cotton. Crop Science, 59(5), 2167–2177. https://doi.org/10.2135/cropsci2019.04.0227
  • Tucker, C. J. (1979). Red and photographic infrared linear combinations for monitoring vegetation. Remote Sensing of Environment, 8(2), 127–150. https://doi.org/10.1016/0034-4257(79)90013-0
  • Türkoğlu, M., & Hanbay, D. (2019). Plant disease and pest detection using deep learning-based features. Turkish Journal of Electrical Engineering and Computer Sciences, 27(3), 1636–1651. https://doi.org/10.3906/elk-1809-181
  • Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., & Polosukhin, I. (2023). Attention is all you need. arXiv Preprint arXiv:1706.03762. https://doi.org/10.48550/arXiv.1706.03762
  • Veeranampalayam Sivakumar, A. N., Li, J., Scott, S., Psota, E., Jhala, A. J., Luck, J. D., & Shi, Y. (2020). Comparison of object detection and patch-based classification deep learning models on mid- to late-season weed detection in UAV imagery. Remote Sensing, 12(13), 2136.
  • Vincent, B., & Dardenne, P. (2021). Application of NIR in agriculture. In Y. Ozaki, C. Huck, S. Tsuchikawa, & S.B. Engelsen (Eds.), Near-infrared spectroscopy: Theory, spectral analysis, instrumentation, and applications (pp. 331–345). Springer. https://doi.org/10.1007/978-981-15-8648-4_14
  • Wang, L., Qu, J. J., & Hao, X. (2008). Forest fire detection using the normalized multi-band drought index (NMDI) with satellite measurements. Agricultural and Forest Meteorology, 148(11), 1767–1776. https://doi.org/10.1016/j.agrformet.2008.06.005
  • Weng, L., Kang, Y., Jiang, K., & Chen, C. (2022). Time gated convolutional neural networks for crop classification. arXiv Preprint arXiv:2206.09756. https://doi.org/10.48550/arXiv.2206.09756
  • Wiesner-Hanks, T., & Brahimi, M. (2018). Image set for deep learning: Field images of maize annotated with disease symptoms. https://osf.io/p67rz/
  • Woebbecke, D. M., Meyer, G. E., Von Bargen, K., & Mortensen, D. A. (1995). Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE, 38(1), 259–269. https://doi.org/10.13031/2013.27838
  • Wu, Z., Chen, Y., Zhao, B., Kang, X., & Ding, Y. (2021). Review of weed detection methods based on computer vision. Sensors, 21(11), 3647. https://doi.org/10.3390/s21113647
  • Wu, X., Zhan, C., Lai, Y., Cheng, M.-M., & Yang, J. (2019). IP102: A large-scale benchmark dataset for insect pest recognition. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 8787–8796. https://doi.org/10.1109/CVPR.2019.00899
  • Yaloveha, V., Podorozhniak, A., & Kuchuk, H. (2022). Convolutional neural network hyperparameter optimization applied to land cover classification. Radioelectronic and Computer Systems, 1(1), 115–128. https://doi.org/10.32620/reks.2022.1.09
  • Yin, Y., Li, H., & Fu, W. (2020). Faster-YOLO: An accurate and faster object detection method. Digital Signal Processing, 102, 102756. https://doi.org/10.1016/j.dsp.2020.102756
  • Yoo, H.-J. (2015). Deep convolution neural networks in computer vision: A review. IEIE Transactions on Smart Processing and Computing, 4(1), 35–43. https://doi.org/10.5573/IEIESPC.2015.4.1.035
  • Yue, J., Tian, J., Tian, Q., Xu, K., & Xu, N. (2019). Development of soil moisture indices from differences in water absorption between shortwave-infrared bands. ISPRS Journal of Photogrammetry and Remote Sensing, 154, 216–230. https://doi.org/10.1016/j.isprsjprs.2019.06.012
  • Yu, J., Schumann, A. W., Cao, Z., Sharpe, S. M., & Boyd, N. S. (2019). Weed detection in perennial ryegrass with deep learning convolutional neural network. Frontiers in Plant Science, 10, 1422. https://doi.org/10.3389/fpls.2019.01422
  • Yu, J., Sharpe, S. M., Schumann, A. W., & Boyd, N. S. (2019). Deep learning for image-based weed detection in turfgrass. European Journal of Agronomy, 104, 78–84. https://doi.org/10.1016/j.eja.2019.01.004
  • Zhang, S., Tong, H., Xu, J., & Maciejewski, R. (2019). Graph convolutional networks: A comprehensive review. Computational Social Networks, 6(1), 11. https://doi.org/10.1186/s40649-019-0069-y
  • Zhang, R., Wang, C., Hu, X., Liu, Y., Chen, S., et al. (2020). Weed location and recognition based on UAV imaging and deep learning. International Journal of Precision Agricultural Aviation, 3(1).
  • Zhang, K., Wu, Q., Liu, A., & Meng, X. (2018). Can deep learning identify tomato leaf disease? Advances in Multimedia, 2018, 1–10. https://doi.org/10.1155/2018/6710865
  • Zhong, Y., & Zhao, M. (2020). Research on deep learning in apple leaf disease recognition. Computers and Electronics in Agriculture, 168, 105146. https://doi.org/10.1016/j.compag.2019.105146
  • Zhou, J., Cui, G., Hu, S., Zhang, Z., Yang, C., Liu, Z., Wang, L., Li, C., & Sun, M. (2020). Graph neural networks: A review of methods and applications. AI Open, 1, 57–81. https://doi.org/10.1016/j.aiopen.2021.01.001
  • Zhou, B., Khosla, A., Lapedriza, A., Oliva, A., & Torralba, A. (2016). Learning deep features for discriminative localization. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2921–2929. https://openaccess.thecvf.com/content_cvpr_2016/html/Zhou_Learning_Deep_Features_CVPR_2016_paper.html
  • Zhuang, F., Qi, Z., Duan, K., Xi, D., Zhu, Y., Zhu, H., Xiong, H., & He, Q. (2021). A comprehensive survey on transfer learning. Proceedings of the IEEE, 109(1), 43–76. https://doi.org/10.1109/JPROC.2020.3004555