Método para la clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo

dc.contributor.advisorBranch Bedoya, John Willian
dc.contributor.advisorRestrepo Arias, Juan Felipe
dc.contributor.authorArregocés Guerra, Paulina
dc.contributor.orcidArregocés Guerra, Paulina [0000-0001-9567-0231]spa
dc.contributor.researchgroupGIDIA: Grupo de Investigación y Desarrollo en Inteligencia Artificialspa
dc.date.accessioned2024-06-25T20:44:08Z
dc.date.available2024-06-25T20:44:08Z
dc.date.issued2024
dc.description.abstractAproximadamente el 75% de la superficie agrícola global pertenece a pequeños agricultores, siendo esenciales para el abastecimiento local de alimentos. Sin embargo, los desafíos comunes incluyen la falta de caracterización precisa de los cultivos y la escasa información detallada en las zonas productivas. La Agricultura Inteligente, que utiliza tecnologías avanzadas como Vehículos Aéreos No Tripulados (VANTs) y visión por computadora, ofrece soluciones; sin embargo, su falta de accesibilidad excluye al 94% de los pequeños agricultores en Colombia. Este trabajo aborda la necesidad de proponer un método de clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo. Se utiliza un VANT DJI Mini 2 SE, accesible en el mercado, para capturar imágenes en San Cristóbal, un área rural de Medellín, Colombia, con el objetivo de identificar cultivos de cebolla verde o de rama, follaje y áreas sin cultivo. Con 259 imágenes y 4315 instancias etiquetadas, se emplean modelos de Redes Neuronales Convolucionales (CNNs, por sus siglas en inglés) para la detección de objetos, la segmentación de instancias y la segmentación semántica. Se evaluaron métodos de aprendizaje profundo utilizando Transfer Learning, siendo Mask R-CNN el elegido, con un 93% de exactitud, una tasa de falsos positivos del 9% y de falsos negativos del 4%. Las métricas incluyen una precisión promedio media (mAP) del 55.49% para follaje, 49.09% para áreas sin cultivo y 58.21% para la cebolla. El conjunto de datos etiquetado está disponible para fomentar la colaboración y la investigación comparativa. En términos generales, se concluye que, mediante la captura de imágenes digitales con VANTs y el uso de métodos de aprendizaje profundo, se puede obtener información precisa y oportuna sobre pequeñas explotaciones agrícolas. (Texto tomado de la fuente)spa
dc.description.abstractApproximately 75% of the global agricultural land belongs to small-scale farmers, who are essential for local food supply. However, common challenges include the lack of accurate crop characterization and limited detailed information in productive areas. Smart Farming, employing advanced technologies such as Unmanned Aerial Vehicles (UAVs) and computer vision, offers solutions; however, its lack of accessibility excludes 94% of small-scale farmers in Colombia. This work addresses the need to propose a method for small-scale agricultural crop classification using deep learning techniques. A DJI Mini 2 SE UAV, readily available in the market, is used to capture images in San Cristóbal, a rural area of Medellín, Colombia, with the aim of identifying green onion (branch onion) crops, foliage, and uncultivated areas. With 259 images and 4315 labeled instances, Convolutional Neural Network (CNN) models are employed for object detection, instance segmentation, and semantic segmentation. Deep learning methods using transfer learning were evaluated, with Mask R-CNN selected, achieving 93% accuracy, a false positive rate of 9%, and a false negative rate of 4%. Metrics include a mean average precision (mAP) of 55.49% for foliage, 49.09% for uncultivated areas, and 58.21% for onions. The labeled dataset is available to encourage collaboration and comparative research. In general terms, it is concluded that by capturing digital images with UAVs and using deep learning methods, precise and timely information about small agricultural operations can be obtained.eng
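As an illustration of the transfer-learning workflow summarized in the abstract, the following minimal sketch fine-tunes a COCO-pretrained Mask R-CNN with Detectron2 (Wu et al., 2019, cited in the references below) and reports per-class AP with the standard COCO evaluator. Dataset names, annotation paths, and solver settings are hypothetical placeholders, not the thesis's actual configuration.

# Minimal sketch: Mask R-CNN transfer learning with Detectron2.
# Dataset names and paths below are hypothetical placeholders.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.data import build_detection_test_loader
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer
from detectron2.evaluation import COCOEvaluator, inference_on_dataset

# Register COCO-format annotations for the three classes named in the
# abstract: green onion, foliage, and uncultivated area.
register_coco_instances("uav_crops_train", {}, "annotations/train.json", "images/train")
register_coco_instances("uav_crops_val", {}, "annotations/val.json", "images/val")

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
# Transfer learning: initialize from weights pre-trained on COCO.
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("uav_crops_train",)
cfg.DATASETS.TEST = ("uav_crops_val",)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 3      # onion, foliage, uncultivated
cfg.SOLVER.IMS_PER_BATCH = 2             # small batch for a single GPU
cfg.SOLVER.BASE_LR = 0.00025
cfg.SOLVER.MAX_ITER = 3000

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()

# Evaluation: COCOEvaluator reports AP per class, the same kind of
# per-class mAP figure quoted in the abstract.
evaluator = COCOEvaluator("uav_crops_val", output_dir="./eval")
val_loader = build_detection_test_loader(cfg, "uav_crops_val")
print(inference_on_dataset(trainer.model, val_loader, evaluator))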
dc.description.curricularareaÁrea Curricular de Ingeniería de Sistemas e Informáticaspa
dc.description.degreelevelMaestríaspa
dc.description.degreenameMagíster en Ingeniería Analíticaspa
dc.format.extent106 páginasspa
dc.format.mimetypeapplication/pdfspa
dc.identifier.instnameUniversidad Nacional de Colombiaspa
dc.identifier.reponameRepositorio Institucional Universidad Nacional de Colombiaspa
dc.identifier.repourlhttps://repositorio.unal.edu.co/spa
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/86302
dc.language.isospaspa
dc.publisherUniversidad Nacional de Colombiaspa
dc.publisher.branchUniversidad Nacional de Colombia - Sede Medellínspa
dc.publisher.facultyFacultad de Minasspa
dc.publisher.placeMedellín, Colombiaspa
dc.publisher.programMedellín - Minas - Maestría en Ingeniería - Analíticaspa
dc.relation.referencesAgencia de Desarrollo Rural, FAO, y Gobernación de Antioquia. (2012). Plan integral de desarrollo agropecuario y rural con enfoque territorial (Vol. 91).spa
dc.relation.referencesAlamsyah, A., Saputra, M. A. A., & Masrury, R. A. (2019, March). Object detection using convolutional neural network to identify popular fashion product. In Journal of Physics: Conference Series (Vol. 1192, No. 1, p. 012040). IOP Publishing.spa
dc.relation.referencesAlba, A., Burgos, A., Cárdenas, J., Lara, K., Sierra, A., y Rojas, G. A. M. (2013). Panorama investigativo sobre la segunda revolución verde en el mundo y en Colombia. Tecciencia, 8, 49-64. doi: 10.18180/TECCIENCIA.2013.15.6spa
dc.relation.referencesAlcaldía Mayor de Bogotá (22 de Marzo de 2022). Resolución 101 de 2022 Ministerio de Agricultura y Desarrollo Rural. Recuperado el 12 de Febrero de 2024 de https://www.alcaldiabogota.gov.co/sisjur/normas/Norma1.jsp?i=122204.spa
dc.relation.referencesAmmar, A., Koubaa, A., Ahmed, M., Saad, A., & Benjdira, B. (2019). Aerial images processing for car detection using convolutional neural networks: Comparison between faster r-cnn and yolov3. arXiv preprint arXiv:1910.07234.spa
dc.relation.referencesAyaz, M., Ammad-Uddin, M., Sharif, Z., Mansour, A., y Aggoune, E.-H. M. (2019). Internet-of-Things (IoT)-based smart agriculture: Toward making the fields talk. IEEE access, 7 , 129551–129583.spa
dc.relation.referencesBadrinarayanan, V., Kendall, A., & Cipolla, R. (2017). SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(12), 2481–2495. https://doi.org/10.1109/TPAMI.2016.2644615spa
dc.relation.referencesBakhshipour, A., & Jafari, A. (2018). Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Computers and Electronics in Agriculture, 145, 153-160. https://doi.org/10.1016/j.compag.2017.12.032spa
dc.relation.referencesBayraktar, E., Basarkan, M. E., & Celebi, N. (2020). A low-cost UAV framework towards ornamental plant detection and counting in the wild. ISPRS Journal of Photogrammetry and Remote Sensing, 167, 1–11. https://doi.org/10.1016/j.isprsjprs.2020.06.012spa
dc.relation.referencesBochkovskiy, A., Wang, C. Y., & Liao, H. Y. M. (2020). Yolov4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934. https://doi.org/10.48550/arXiv.2004.10934spa
dc.relation.referencesBouguettaya, A., Zarzour, H., Kechida, A., y Taberkit, A. M. (2022). Deep learning techniques to classify agricultural crops through UAV imagery: a review. Neural Computing and Applications, 34(12), 9511-9536. doi: 10.1007/s00521-022-07104-9spa
dc.relation.referencesBotero, F. (2017). Política y económica del corregimiento de San Cristóbal. Disponible en https://bibliotecasmedellin.gov.co/wp-content/uploads/2018/10/Anexo_San_Cristo%CC%81bal.pdfspa
dc.relation.referencesCastañeda-Miranda, A., y Castaño-Meneses, V. M. (2020). Smart frost measurement for anti-disaster intelligent control in greenhouses via embedding IoT and hybrid AI methods. Measurement: Journal of the International Measurement Confederation, 164 . doi: 10.1016/j.measurement.2020.108043spa
dc.relation.referencesChamara, N., Bai, G., & Ge, Y. (2023). AICropCAM: Deploying classification, segmentation, detection, and counting deep-learning models for crop monitoring on the edge. Computers and Electronics in Agriculture, 215, 108420. https://doi.org/10.1016/j.compag.2023.108420spa
dc.relation.referencesChen, X., Girshick, R., He, K., & Dollár, P. (2019). TensorMask: A foundation for dense object segmentation. Proceedings of the IEEE International Conference on Computer Vision, 2061–2069. https://doi.org/10.1109/ICCV.2019.00215spa
dc.relation.referencesCheng, B., Collins, M. D., Zhu, Y., Liu, T., Huang, T. S., Adam, H., & Chen, L.-C. (2020). Panoptic-DeepLab: A Simple, Strong, and Fast Baseline for Bottom-Up Panoptic Segmentation. 12475–12485. https://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_Panoptic-DeepLab_A_Simple_Strong_and_Fast_Baseline_for_Bottom-Up_Panoptic_CVPR_2020_paper.htmlspa
dc.relation.referencesChew, R., Rineer, J., Beach, R., O'Neil, M., Ujeneza, N., Lapidus, D., y Temple, D. S. (2020). Deep neural networks and transfer learning for food crop identification in UAV images. Drones, 4, 1-14. doi: 10.3390/drones4010007spa
dc.relation.referencesContiu, S., y Groza, A. (2016). Improving remote sensing crop classification by argumentation-based conflict resolution in ensemble learning. Expert Systems with Applications, 64 , 269-286. doi: 10.1016/j.eswa.2016.07.037spa
dc.relation.referencesDeng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). ImageNet: A large-scale hierarchical image database. En Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 248-255. doi: 10.1109/CVPR.2009.5206848spa
dc.relation.referencesDepartamento Administrativo Nacional de Estadística (DANE). (2019). Encuesta nacional agropecuaria (ENA).spa
dc.relation.referencesDer Yang, M., Tseng, H. H., Hsu, Y. C., et al. (2020). Real-time crop classification using edge computing and deep learning. En: 2020 IEEE 17th Annual Consumer Communications & Networking Conference (CCNC), IEEE, pp. 1–4. https://doi.org/10.1109/CCNC46108.2020.9045498spa
dc.relation.referencesDijkstra, K., van de Loosdrecht, J., Atsma, W. A., Schomaker, L. R., y Wiering, M. A. (2021). CentroidNetV2: A hybrid deep neural network for small-object segmentation and counting. Neurocomputing, 423, 490-505. doi: 10.1016/j.neucom.2020.10.075spa
dc.relation.referencesEl-Basioni, B. M. M., y El-Kader, S. M. A. (2020). Laying the foundations for an IoT reference architecture for agricultural application domain. IEEE Access, 8 , 190194-190230. doi: 10.1109/ACCESS.2020.3031634spa
dc.relation.referencesFeng, T., Chai, Y., Huang, Y., & Liu, Y. (2019, December). A Real-time Monitoring and Control System for Crop. In Proceedings of the 2019 2nd International Conference on Algorithms, Computing and Artificial Intelligence (pp. 183-188). https://doi.org/10.1145/3377713.3377742spa
dc.relation.referencesFerro, M. V., & Catania, P. (2023). Technologies and Innovative Methods for Precision Viticulture: A Comprehensive Review. Horticulturae, 9(3), 399.spa
dc.relation.referencesFindLight. (s.f.). Wide Dynamic Range Sensor NSC1005C. Recuperado de https://www.findlight.net/imaging-and-vision/image-sensors/area-scan-sensors/wide-dynamic-range-sensor-nsc1005cspa
dc.relation.referencesFood and Agriculture Organization (FAO). (2017). The future of food and agriculture: Trends and challenges.spa
dc.relation.referencesFood and Agriculture Organization (FAO). (2018). FAO's work on agricultural innovation.spa
dc.relation.referencesFPN. (s.f.). CloudFactory Computer Vision Wiki. Recuperado el 13 de febrero de 2024, de https://wiki.cloudfactory.com/docs/mp-wiki/model-architectures/fpnspa
dc.relation.referencesFuentes-Peñailillo, F., Ortega-Farias, S., Rivera, M., Bardeen, M., & Moreno, M. (2018, October). Using clustering algorithms to segment UAV-based RGB images. In 2018 IEEE international conference on automation/XXIII congress of the Chilean association of automatic control (ICA-ACCA) (pp. 1-5). IEEE. doi: 10.1109/ICA-ACCA.2018.8609822spa
dc.relation.referencesFujiwara, R., Nashida, H., Fukushima, M., Suzuki, N., Sato, H., Sanada, Y., & Akiyama, Y. (2022). Convolutional neural network models help effectively estimate legume coverage in grass-legume mixed swards. Frontiers in Plant Science, 12, 763479.spa
dc.relation.referencesGarcía-Santillán, I. D., y Pajares, G. (2018). On-line crop/weed discrimination through the Mahalanobis distance from images in maize fields. Biosystems Engineering, 166, 28-43. doi: 10.1016/j.biosystemseng.2017.11.003spa
dc.relation.referencesGenze, N., Ajekwe, R., Güreli, Z., Haselbeck, F., Grieb, M., & Grimm, D. G. (2022). Deep learning-based early weed segmentation using motion blurred UAV images of sorghum fields. Computers and Electronics in Agriculture, 202, 107388.spa
dc.relation.referencesGirshick, R. (2015). Fast R-CNN. In Proceedings of the IEEE international conference on computer vision (pp. 1440-1448).spa
dc.relation.referencesGüler, R. A., Neverova, N., & Kokkinos, I. (2016). DensePose: Dense Human Pose Estimation In The Wild. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 7297–7306. http://arxiv.org/abs/1612.01202spa
dc.relation.referencesHamuda, E., Glavin, M., y Jones, E. (2016). A survey of image processing techniques for plant extraction and segmentation in the field. Computers and Electronics in Agriculture, 125, 184-199. doi: 10.1016/j.compag.2016.04.024spa
dc.relation.referencesHe, K., Gkioxari, G., Dollár, P., & Girshick, R. (2020). Mask R-CNN. IEEE Transactions on Pattern Analysis and Machine Intelligence, 42(2), 386–397. https://doi.org/10.1109/TPAMI.2018.2844175spa
dc.relation.referencesHe, K., Zhang, X., Ren, S., et al. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 770–778). https://doi.org/10.1109/CVPR.2016.90spa
dc.relation.referencesHoward, A. G., et al. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv preprint arXiv:1704.04861. Disponible en: http://arxiv.org/abs/1704.04861spa
dc.relation.referencesKawamura, K., Asai, H., Yasuda, T., Soisouvanh, P., & Phongchanmixay, S. (2021). Discriminating crops/weeds in an upland rice field from UAV images with the SLIC-RF algorithm. Plant Production Science, 24(2), 198-215. https://doi.org/10.1080/1343943X.2020.1829490spa
dc.relation.referencesKirillov, A., He, K., Girshick, R., Rother, C., & Dollar, P. (2019). Panoptic segmentation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2019-June, 9396–9405. https://doi.org/10.1109/CVPR.2019.00963spa
dc.relation.referencesKitano, B. T., Mendes, C. C., Geus, A. R., et al. (2019). Corn plant counting using deep learning and UAV images. IEEE Geoscience and Remote Sensing Letters. https://doi.org/10.1109/LGRS.2019.2930549spa
dc.relation.referencesKitzler, F., Wagentristl, H., Neugschwandtner, R. W., Gronauer, A., & Motsch, V. (2022). Influence of Selected Modeling Parameters on Plant Segmentation Quality Using Decision Tree Classifiers. Agriculture, 12, 1408. https://doi.org/10.3390/agriculture12091408spa
dc.relation.referencesKoirala, A., Walsh, K., Wang, Z., et al. (2019). Deep learning for real-time fruit detection and orchard fruit load estimation: benchmarking of ‘mangoyolo’. Precision Agriculture, 20(6), 1107–1135. https://doi.org/10.1007/s11119-019-09642-0spa
dc.relation.referencesLeCun, Y., Bottou, L., Bengio, Y., et al. (1998). Gradient-based learning applied to document recognition. Proc IEEE, 86(11), 2278–2324. https://doi.org/10.1109/5.726791spa
dc.relation.referencesLi, L., Mu, X., Jiang, H., Chianucci, F., Hu, R., Song, W., Yan, G. (2023). Review of ground and aerial methods for vegetation cover fraction (fcover) and related quantities estimation: definitions, advances, challenges, and future perspectives. ISPRS Journal of Photogrammetry and Remote Sensing, 199, 133-156. doi: 10.1016/j.isprsjprs.2023.03.020spa
dc.relation.referencesLi, W., Fu, H., Yu, L., y Cracknell, A. (2017). Deep learning based oil palm tree detection and counting for high-resolution remote sensing images. Remote Sensing, 9. doi: 10.3390/rs9010022spa
dc.relation.referencesLin, T. Y., Goyal, P., Girshick, R., He, K., & Dollár, P. (2017). Focal loss for dense object detection. In Proceedings of the IEEE international conference on computer vision (pp. 2980-2988).spa
dc.relation.referencesLiu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C. Y., & Berg, A. C. (2016). SSD: Single shot multibox detector. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9905 LNCS, 21–37. https://doi.org/10.1007/978-3-319-46448-0_2spa
dc.relation.referencesLiu, H., Qi, Y., Xiao, W., Tian, H., Zhao, D., Zhang, K., Xiao, J., Lu, X., Lan, Y., & Zhang, Y. (2022). Identification of Male and Female Parents for Hybrid Rice Seed Production Using UAV-Based Multispectral Imagery. Agriculture, 12(7), 1005. https://doi.org/10.3390/agriculture12071005spa
dc.relation.referencesLohi, S. A., & Bhatt, C. (2022). Empirical Analysis of Crop Yield Prediction and Disease Detection Systems: A Statistical Perspective. ICT Infrastructure and Computing: Proceedings of ICT4SD 2022, 49-57.spa
dc.relation.referencesLong, J., Shelhamer, E., & Darrell, T. (2015). Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 3431-3440).spa
dc.relation.referencesLottes, P., Hörferlin, M., Sander, S., y Stachniss, C. (2017). Effective vision-based classification for separating sugar beets and weeds for precision farming. Journal of Field Robotics, 34, 1160-1178. doi: 10.1002/rob.21675spa
dc.relation.referencesLottes, P., Khanna, R., Pfeifer, J., Siegwart, R., y Stachniss, C. (2017). UAV-based crop and weed classification for smart farming. Proceedings - IEEE International Conference on Robotics and Automation, 3024-3031. doi: 10.1109/ICRA.2017.7989347spa
dc.relation.referencesLu, Y., Young, S., Wang, H., & Wijewardane, N. (2022). Robust plant segmentation of color images based on image contrast optimization. Computers and Electronics in Agriculture, 193, 106711. https://doi.org/10.1016/j.compag.2022.106711spa
dc.relation.referencesMachefer, M., Lemarchand, F., Bonnefond, V., et al. (2020). Mask R-CNN refitting strategy for plant counting and sizing in UAV imagery. Remote Sensing, 12(18). https://doi.org/10.3390/rs12183015spa
dc.relation.referencesMardanisamani, S., y Eramian, M. (2022). Segmentation of vegetation and microplots in aerial agriculture images: A survey. Plant Phenome Journal, 5(1).spa
dc.relation.referencesMateen, A., y Zhu, Q. (2019). Weed detection in wheat crop using UAV for precision agriculture. Pakistan Journal of Agricultural Sciences, 56(3), 809-817. doi: 10.21162/PAKJAS/19.8116spa
dc.relation.referencesMaulit, A., Nugumanova, A., Apayev, K., Baiburin, Y., y Sutula, M. (2023). A multispectral UAV imagery dataset of wheat, soybean and barley crops in East Kazakhstan. Data, 8(5). doi: 10.3390/data8050088spa
dc.relation.referencesMilioto, A., Lottes, P., & Stachniss, C. (2018, May). Real-time semantic segmentation of crop and weed for precision agriculture robots leveraging background knowledge in CNNs. In 2018 IEEE international conference on robotics and automation (ICRA) (pp. 2229-2235). IEEE. doi: 10.1109/ICRA.2018.8460962.spa
dc.relation.referencesMorales, G., Kemper, G., Sevillano, G., et al. (2018). Automatic segmentation of Mauritia flexuosa in unmanned aerial vehicle (UAV) imagery using deep learning. Forests, 9(12). https://doi.org/10.3390/f9120736spa
dc.relation.referencesMortimer, A. M. (2000). Capítulo 2. La clasificación y ecología de las malezas. FAO. Recuperado de https://www.fao.org/3/T1147S/t1147s06.htmspa
dc.relation.referencesMu, Y., Ni, R., Fu, L., Luo, T., Feng, R., Li, J., & Li, S. (2023). DenseNet weed recognition model combining local variance preprocessing and attention mechanism. Frontiers in Plant Science, 13, 1041510. https://doi.org/10.3389/fpls.2022.1041510spa
dc.relation.referencesMukherjee, S. (2022, agosto 18). The annotated ResNet-50. Towards Data Science. https://towardsdatascience.com/the-annotated-resnet-50-a6c536034758spa
dc.relation.referencesMyBotShop. (s.f.). Clearpath Husky A200. Recuperado de https://www.mybotshop.de/Clearpath-Husky-A200_3spa
dc.relation.referencesNeupane, B., Horanont, T., & Hung, N. D. (2019). Deep learning based banana plant detection and counting using high-resolution red-green-blue (RGB) images collected from unmanned aerial vehicle (UAV). PLoS One, 14(10), e0223906. https://doi.org/10.1371/journal.pone.0223906spa
dc.relation.referencesNgo, U. Q., Ngo, D. T., Nguyen, H. T., y Bui, T. D. (2022). Digital image processing methods for estimating leaf area of cucumber plants. Indonesian Journal of Electrical Engineering and Computer Science, 25(1), 317-328. doi: 10.11591/ijeecs.v25.i1.pp317-328spa
dc.relation.referencesPashaei, M., Kamangir, H., Starek, M. J., & Tissot, P. (2020). Review and Evaluation of Deep Learning Architectures for Efficient Land Cover Mapping with UAS Hyper-Spatial Imagery: A Case Study Over a Wetland. Remote Sensing, 12(6), 959. https://doi.org/10.3390/rs12060959spa
dc.relation.referencesPatidar, P. K., Tomar, D. S., Pateriya, R. K., & Sharma, Y. K. (2023, May). Precision Agriculture: Crop Image Segmentation and Loss Evaluation through Drone Surveillance. In 2023 Third International Conference on Secure Cyber Computing and Communication (ICSCCC) (pp. 495-500). IEEE. doi: 10.1109/ICSCCC58608.2023.10176980spa
dc.relation.referencesPierce, F. J., y Nowak, P. (1999). Aspects of precision agriculture. En D. L. Sparks (Ed.), Advances in Agronomy (Vol. 67, pp. 1-85). Academic Press. doi: 10.1016/S0065-2113(08)60513-1spa
dc.relation.referencesPuerta-Zapata, J., Cadavid-Castro, M. A., Montoya-Betancur, K. V., & Álvarez-Castaño, L. S. (2023). Distribución tradicional y corporativa de alimentos en una zona urbana: estudio de casos colectivos en San Cristóbal, Medellín-Colombia. Revista de Investigación, Desarrollo e Innovación, 13(1), 157-172. https://doi.org/10.19053/20278306.v13.n1.2023.16058spa
dc.relation.referencesQamar, T., & Bawany, N. Z. (2023). Agri-PAD: a scalable framework for smart agriculture. Indonesian Journal of Electrical Engineering and Computer Science, 29(3), 1597-1605. doi:10.11591/ijeecs.v29.i3.pp1597-1605spa
dc.relation.referencesQuan, L., Jiang, W., Li, H., Li, H., Wang, Q., & Chen, L. (2022). Intelligent intra-row robotic weeding system combining deep learning technology with a targeted weeding mode. Biosystems Engineering, 216, 13-31.spa
dc.relation.referencesQuiroz, R. A. A., Guidotti, F. P., y Bedoya, A. E. (2019). A method for automatic identification of crop lines in drone images from a mango tree plantation using segmentation over YCrCb color space and Hough transform. 2019 22nd Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2019 - Conference Proceedings. doi: 10.1109/STSIVA.2019.8730214spa
dc.relation.referencesRadoglou-Grammatikis, P., Sarigiannidis, P., Lagkas, T., y Moscholios, I. (2020). A compilation of UAV applications for precision agriculture. Computer Networks, 172, 107148. doi: 10.1016/j.comnet.2020.107148spa
dc.relation.referencesRampersad, H. (2020). Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. Total Performance Scorecard, 159–183. https://doi.org/10.4324/9780080519340-12spa
dc.relation.referencesRedmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2016-Decem, 779–788. https://doi.org/10.1109/CVPR.2016.91spa
dc.relation.referencesRehman, T. U., Zaman, Q. U., Chang, Y. K., Schumann, A. W., & Corscadden, K. W. (2019). Development and field evaluation of a machine vision based in-season weed detection system for wild blueberry. Computers and Electronics in Agriculture, 162, 1-13. https://doi.org/10.1016/j.compag.2019.03.023spa
dc.relation.referencesRen, S., He, K., Girshick, R., & Sun, J. (2015). Faster R-CNN: Towards real-time object detection with region proposal networks. Advances in Neural Information Processing Systems, 28.spa
dc.relation.referencesRestrepo-Arias, J. (2023). Método de clasificación de imágenes, empleando técnicas de inteligencia artificial, integrado a una plataforma IoT de agricultura inteligente. Universidad Nacional de Colombia. https://repositorio.unal.edu.co/handle/unal/83849spa
dc.relation.referencesRonneberger, O., Fischer, P., & Brox, T. (2015). U-Net: Convolutional Networks for Biomedical Image Segmentation. In N. Navab, J. Hornegger, W. M. Wells, & A. F. Frangi (Eds.), Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015 (pp. 234–241). Springer International Publishing. https://doi.org/10.1007/978-3-319-24574-4_28spa
dc.relation.referencesRoychowdhury, S. (2021). U-net-for-Multi-class-semantic-segmentation. Recuperado de: https://github.com/sohiniroych/U-net-for-Multi-class-semantic-segmentationspa
dc.relation.referencesRoychowdhury, S., Koozekanani, D. D., & Parhi, K. K. (2014, septiembre). DREAM: Diabetic Retinopathy Analysis Using Machine Learning. IEEE Journal of Biomedical and Health Informatics, 18(5), 1717-1728. https://doi.org/10.1109/JBHI.2013.2294635spa
dc.relation.referencesSaiz-Rubio, V., y Rovira-Más, F. (2020). From smart farming towards agriculture 5.0: A review on crop data management. Agronomy, 10(2), 207. doi: 10.3390/agronomy10020207spa
dc.relation.referencesSalvador Lopez, J. (2022). Aprendizaje profundo para Análisis de Maquetación en documentos manuscritos. Universitat Politècnica de València. http://hdl.handle.net/10251/186330.spa
dc.relation.referencesSantos, A. A., Marcato Junior, J., Araújo, M. S., et al. (2019). Assessment of CNN-based methods for individual tree detection on images captured by RGB cameras attached to UAVs. Sensors, 19(16), 3595. https://doi.org/10.3390/s19163595spa
dc.relation.referencesSkyMotion. (s.f.). DJI Mini 2 SE - Skymotion. Recuperado de https://skymotion.com.co/products/dji-mini-2-se?variant=47192926126397spa
dc.relation.referencesSchrijver, R. (2016). Precision agriculture and the future of farming in Europe: Scientific foresight study: Study. European Parliament.spa
dc.relation.referencesSimonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556.spa
dc.relation.referencesSong, Z., Zhang, Z., Yang, S., et al. (2020). Identifying sunflower lodging based on image fusion and deep semantic segmentation with UAV remote sensing imaging. Computers and Electronics in Agriculture, 179, 105812. https://doi.org/10.1016/j.compag.2020.105812spa
dc.relation.referencesSuh, H. K., Hofstee, J. W., & Van Henten, E. J. (2020). Investigation on combinations of colour indices and threshold techniques in vegetation segmentation for volunteer potato control in sugar beet. Computers and Electronics in Agriculture, 179, 105819.spa
dc.relation.referencesTan, C., Zhang, P., Zhang, Y., Zhou, X., Wang, Z., Du, Y., ... & Guo, W. (2020). Rapid recognition of field-grown wheat spikes based on a superpixel segmentation algorithm using digital images. Frontiers in Plant Science, 11, 259. https://doi.org/10.3389/fpls.2020.00259spa
dc.relation.referencesTan, M., & Le, Q. V. (2019). EfficientNet: Rethinking model scaling for convolutional neural networks. In 36th International Conference on Machine Learning, ICML 2019 (Vol. 2019-June, pp. 10691–10700).spa
dc.relation.referencesTorrey, L., y Shavlik, J. (2010). Transfer Learning. En Handbook of Research on Machine Learning Applications (pp. 657–665). IGI Global. https://doi.org/10.1201/b17320spa
dc.relation.referencesTriantafyllou, A., Sarigiannidis, P., y Bibi, S. (2019). Precision agriculture: A remote sensing monitoring system architecture. Information (Switzerland), 10. doi: 10.3390/info10110348spa
dc.relation.referencesTrivelli, L., Apicella, A., Chiarello, F., Rana, R., Fantoni, G., y Tarabella, A. (2019). From precision agriculture to industry 4.0: Unveiling technological connections in the agrifood sector. British Food Journal, 121(8), 1730–1743.spa
dc.relation.referencesUnidad Administrativa Especial de Aeronáutica Civil (UAEAC). (2023). RAC 91: Reglas generales de vuelo y operación.spa
dc.relation.referencesUnited Nations Development Programme (UNDP). (2021). What are the sustainable development goals? Recuperado el 7 de noviembre de 2023.spa
dc.relation.referencesvelog. (s.f.). DeepLab v3. Recuperado el 13 de febrero de 2024, de https://velog.io/@skhim520/DeepLab-v3spa
dc.relation.referencesWang, X., Jiang, G., Zhang, H., Zhao, H., Chen, Y., Mei, C., y Jia, Z. (2020). Grayscale distribution of maize canopy based on HLS-SVM method. International Journal of Food Properties, 23(1), 839-852. doi: 10.1080/10942912.2020.1758717spa
dc.relation.referencesWang, J., Yao, X., & Nguyen, B. K. (2022, October 12). Identification and localisation of multiple weeds in grassland for removal operation. In Proc. SPIE 12342, Fourteenth International Conference on Digital Image Processing (ICDIP 2022) (p. 123420Z). https://doi.org/10.1117/12.2644281spa
dc.relation.referencesWu, J., Yang, G., Yang, H., et al. (2020). Extracting apple tree crown information from remote imagery using deep learning. Computers and Electronics in Agriculture, 174, 105504. https://doi.org/10.1016/j.compag.2020.105504spa
dc.relation.referencesWu, Y., Kirillov, A., Massa, F., Lo, W.-Y., & Girshick, R. (2019). Detectron2. https://github.com/facebookresearch/detectron2spa
dc.relation.referencesXu, K., Li, H., Cao, W., Zhu, Y., Chen, R., & Ni, J. (2020). Recognition of weeds in wheat fields based on the fusion of RGB images and depth images. IEEE Access, 8, 110362-110370.spa
dc.relation.referencesXu, B., Fan, J., Chao, J., Arsenijevic, N., Werle, R., y Zhang, Z. (2023). Instance segmentation method for weed detection using UAV imagery in soybean fields. Computers and Electronics in Agriculture, 211. doi: 10.1016/j.compag.2023.107994spa
dc.relation.referencesYang, M. D., Boubin, J. G., Tsai, H. P., Tseng, H. H., Hsu, Y. C., & Stewart, C. C. (2020). Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet. Computers and Electronics in Agriculture, 179, 105817. https://doi.org/10.1016/j.compag.2020.105817spa
dc.relation.referencesYang, L., Bi, P., Tang, H., Zhang, F., & Wang, Z. (2022). Improving vegetation segmentation with shadow effects based on double input networks using polarization images. Computers and Electronics in Agriculture, 199, 107123. https://doi.org/10.1016/j.compag.2022.107123spa
dc.relation.referencesYang, M. D., Tseng, H. H., Hsu, Y. C., et al. (2020). Semantic segmentation using deep learning with vegetation indices for rice lodging identification in multi-date UAV visible images. Remote Sensing, 12(4). https://doi.org/10.3390/rs12040633spa
dc.relation.referencesYou, K., Liu, Y., Wang, J., & Long, M. (2021). LogME: Practical Assessment of Pre-trained Models for Transfer Learning. http://arxiv.org/abs/2102.11005spa
dc.relation.referencesYuan, J., Xue, B., Zhang, W., Xu, L., Sun, H., & Zhou, J. (2019). RPN-FCN based Rust detection on power equipment. Procedia Computer Science, 147, 349–353. https://doi.org/10.1016/j.procs.2019.01.236spa
dc.relation.referencesZhang, J., Zhao, B., Yang, C., Shi, Y., Liao, Q., Zhou, G., Xie, J. (2020). Rapeseed stand count estimation at leaf development stages with UAV imagery and convolutional neural networks. Frontiers in Plant Science, 11. doi: 10.3389/fpls.2020.00617spa
dc.relation.referencesZhang, X., Wang, Z., Liu, D., Lin, Q., y Ling, Q. (2021). Deep adversarial data augmentation for extremely low data regimes. IEEE Transactions on Circuits and Systems for Video Technology, 31, 15-28. doi: 10.1109/TCSVT.2020.2967419spa
dc.relation.referencesZhang, Y., Wang, C., Wang, Y., y Cheng, P. (2022). Determining the stir-frying degree of Gardeniae Fructus Praeparatus based on deep learning and transfer learning. Sensors, 22(21). doi: 10.3390/s22218091spa
dc.relation.referencesZheng, H., Zhou, X., He, J., Yao, X., Cheng, T., Zhu, Y., ... & Tian, Y. (2020). Early season detection of rice plants using RGB, NIR-GB and multispectral images from unmanned aerial vehicle (UAV). Computers and Electronics in Agriculture, 169, 105223. https://doi.org/10.1016/j.compag.2020.105223spa
dc.relation.referencesZhao, H., Shi, J., Qi, X., Wang, X., & Jia, J. (2017). Pyramid Scene Parsing Network. 2881–2890. https://openaccess.thecvf.com/content_cvpr_2017/html/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.htmlspa
dc.relation.referencesZhuang, F., Qi, Z., Duan, K., Xi, D., Zhu, Y., Zhu, H., Xiong, H., & He, Q. (2021). A Comprehensive Survey on Transfer Learning. Proceedings of the IEEE, 109(1), 43–76. https://doi.org/10.1109/JPROC.2020.3004555spa
dc.relation.referencesZhuang, S., Wang, P., y Jiang, B. (2020). Vegetation extraction in the field using multi-level features. Biosystems Engineering, 197, 352-366. doi: 10.1016/j.biosystemseng.2020.07.013spa
dc.rights.accessrightsinfo:eu-repo/semantics/openAccessspa
dc.rights.licenseAtribución-NoComercial-SinDerivadas 4.0 Internacionalspa
dc.rights.urihttp://creativecommons.org/licenses/by-nc-nd/4.0/spa
dc.subject.ddc000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadoresspa
dc.subject.ddc000 - Ciencias de la computación, información y obras generales::005 - Programación, programas, datos de computaciónspa
dc.subject.ddc630 - Agricultura y tecnologías relacionadasspa
dc.subject.lembProcesamiento de imágenes
dc.subject.proposalAgricultura Inteligentespa
dc.subject.proposalimágenes aéreasspa
dc.subject.proposalVANTsspa
dc.subject.proposalAprendizaje profundospa
dc.subject.proposalRedes Neuronales Convolucionalesspa
dc.subject.proposalSmart Farmingeng
dc.subject.proposalaerial imageryeng
dc.subject.proposalUAVseng
dc.subject.proposalDeep Learningeng
dc.subject.proposalConvolutional neural networkseng
dc.subject.wikidataRedes neuronales convolucionales
dc.titleMétodo para la clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundospa
dc.title.translatedMethod for the classification of small-scale agricultural crops using deep learning techniqueseng
dc.typeTrabajo de grado - Maestríaspa
dc.type.coarhttp://purl.org/coar/resource_type/c_bdccspa
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aaspa
dc.type.contentTextspa
dc.type.driverinfo:eu-repo/semantics/masterThesisspa
dc.type.redcolhttp://purl.org/redcol/resource_type/TMspa
dc.type.versioninfo:eu-repo/semantics/acceptedVersionspa
dcterms.audience.professionaldevelopmentEstudiantesspa
dcterms.audience.professionaldevelopmentInvestigadoresspa
dcterms.audience.professionaldevelopmentMaestrosspa
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2spa

Archivos

Bloque original
Nombre: 1017232348.2024.pdf
Tamaño: 19.4 MB
Formato: Adobe Portable Document Format
Descripción: Tesis de Maestría en Ingeniería - Analítica

Bloque de licencias
Nombre: license.txt
Tamaño: 5.74 KB
Formato: Item-specific license agreed upon to submission