Método para la clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo
dc.rights.license | Atribución-NoComercial-SinDerivadas 4.0 Internacional |
dc.contributor.advisor | Branch Bedoya, John Willian |
dc.contributor.advisor | Restrepo Arias, Juan Felipe |
dc.contributor.author | Arregocés Guerra, Paulina |
dc.date.accessioned | 2024-06-25T20:44:08Z |
dc.date.available | 2024-06-25T20:44:08Z |
dc.date.issued | 2024 |
dc.identifier.uri | https://repositorio.unal.edu.co/handle/unal/86302 |
dc.description.abstract | Aproximadamente el 75% de la superficie agrícola global pertenece a pequeños agricultores, siendo esenciales para el abastecimiento local de alimentos. Sin embargo, los desafíos comunes incluyen la falta de caracterización precisa de los cultivos y la escasa información detallada en las zonas productivas. La Agricultura Inteligente, que utiliza tecnologías avanzadas como Vehículos Aéreos No Tripulados (VANTs) y visión por computadora, ofrece soluciones; sin embargo, su falta de accesibilidad excluye al 94% de los pequeños agricultores en Colombia. Este trabajo aborda la necesidad de proponer un método de clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo. Se utiliza un VANT DJI Mini 2 SE, accesible en el mercado, para capturar imágenes en San Cristóbal, un área rural de Medellín, Colombia, con el objetivo de identificar cultivos de cebolla verde o de rama, follaje y áreas sin cultivo. Con 259 imágenes y 4315 instancias etiquetadas, se emplean modelos de Redes Neuronales Convolucionales (CNNs, por sus siglas en inglés) para la clasificación de objetos, segmentación de instancias y segmentación semántica. Se evaluaron métodos de Aprendizaje Profundo utilizando Transfer Learning, siendo Mask R-CNN el elegido con un 93% de precisión, una tasa de falsos positivos del 9% y falsos negativos del 4%. Las métricas incluyen un porcentaje de precisión promedio medio (mAP%) del 55.49% para follaje, 49.09% para áreas sin cultivo y 58.21% para la cebolla. El conjunto de datos etiquetado está disponible para fomentar la colaboración e investigación comparativa. En términos generales se concluye que mediante la captura de imágenes digitales con VANTs y el uso de métodos de aprendizaje profundo, se puede obtener información precisa y oportuna sobre pequeñas explotaciones agrícolas. (Texto tomado de la fuente) |
dc.description.abstract | Approximately 75% of the global agricultural land belongs to small-scale farmers, who are essential for local food supply. However, common challenges include the lack of accurate crop characterization and limited detailed information in productive areas. Smart Farming, employing advanced technologies such as Unmanned Aerial Vehicles (UAVs) and computer vision, offers solutions; however, its lack of accessibility excludes 94% of small-scale farmers in Colombia. This work addresses the need to propose a method for small-scale agricultural crop classification using deep learning techniques. A DJI Mini 2 SE UAV, readily available in the market, is used to capture images in San Cristóbal, a rural area of Medellín, Colombia, with the aim of identifying green onion or branch crops, foliage, and uncultivated areas. With 259 images and 4315 labeled instances, Convolutional Neural Network (CNN) models are employed for object detection, instance segmentation, and semantic segmentation. Deep Learning methods using transfer learning were evaluated, with Mask R-CNN selected, achieving 93% accuracy, a false positive rate of 9%, and a false negative rate of 4%. Metrics include a mean average precision (mAP%) of 55.49% for foliage, 49.09% for uncultivated areas, and 58.21% for onions. The labeled dataset is available to encourage collaboration and comparative research. In general terms, it is concluded that by capturing digital images with UAVs and using deep learning methods, precise and timely information about small agricultural operations can be obtained. |
dc.format.extent | 106 páginas |
dc.format.mimetype | application/pdf |
dc.language.iso | spa |
dc.publisher | Universidad Nacional de Colombia |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
dc.subject.ddc | 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores |
dc.subject.ddc | 000 - Ciencias de la computación, información y obras generales::005 - Programación, programas, datos de computación |
dc.subject.ddc | 630 - Agricultura y tecnologías relacionadas |
dc.title | Método para la clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo |
dc.type | Trabajo de grado - Maestría |
dc.type.driver | info:eu-repo/semantics/masterThesis |
dc.type.version | info:eu-repo/semantics/acceptedVersion |
dc.publisher.program | Medellín - Minas - Maestría en Ingeniería - Analítica |
dc.contributor.researchgroup | GIDIA: Grupo de Investigación y Desarrollo en Inteligencia Artificial |
dc.description.degreelevel | Maestría |
dc.description.degreename | Magíster en Ingeniería Analítica |
dc.identifier.instname | Universidad Nacional de Colombia |
dc.identifier.reponame | Repositorio Institucional Universidad Nacional de Colombia |
dc.identifier.repourl | https://repositorio.unal.edu.co/ |
dc.publisher.faculty | Facultad de Minas |
dc.publisher.place | Medellín, Colombia |
dc.publisher.branch | Universidad Nacional de Colombia - Sede Medellín |
dc.relation.references | Agencia de Desarrollo Rural, FAO, y Gobernación de Antioquia. (2012). Plan integral de desarrollo agropecuario y rural con enfoque territorial (Vol. 91). |
dc.relation.references | Alamsyah, A., Saputra, M. A. A., & Masrury, R. A. (2019, March). Object detection using convolutional neural network to identify popular fashion product. In Journal of Physics: Conference Series (Vol. 1192, No. 1, p. 012040). IOP Publishing. |
dc.relation.references | Alba, A., Burgos, A., Cárdenas, J., Lara, K., Sierra, A., & Rojas, G. A. M. (2013). Panorama investigativo sobre la segunda revolución verde en el mundo y en Colombia. Tecciencia, 8, 49–64. doi: 10.18180/TECCIENCIA.2013.15.6 |
dc.relation.references | Alcaldía Mayor de Bogotá (22 de Marzo de 2022). Resolución 101 de 2022 Ministerio de Agricultura y Desarrollo Rural. Recuperado el 12 de Febrero de 2024 de https://www.alcaldiabogota.gov.co/sisjur/normas/Norma1.jsp?i=122204. |
dc.relation.references | Ammar, A., Koubaa, A., Ahmed, M., Saad, A., & Benjdira, B. (2019). Aerial images processing for car detection using convolutional neural networks: Comparison between Faster R-CNN and YOLOv3. arXiv preprint arXiv:1910.07234. |
dc.relation.references | Ayaz, M., Ammad-Uddin, M., Sharif, Z., Mansour, A., y Aggoune, E.-H. M. (2019). Internet-of-Things (IoT)-based smart agriculture: Toward making the fields talk. IEEE access, 7 , 129551–129583. |
dc.relation.references | Badrinarayanan, V., Kendall, A., & Cipolla, R. (2017). SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(12), 2481–2495. https://doi.org/10.1109/TPAMI.2016.2644615 |
dc.relation.references | Bakhshipour, A., & Jafari, A. (2018). Evaluation of support vector machine and artificial neural networks in weed detection using shape features. Computers and Electronics in Agriculture, 145, 153-160. https://doi.org/10.1016/j.compag.2017.12.032 |
dc.relation.references | Bayraktar, E., Basarkan, M. E., & Celebi, N. (2020). A low-cost UAV framework towards ornamental plant detection and counting in the wild. ISPRS Journal of Photogrammetry and Remote Sensing, 167, 1–11. https://doi.org/10.1016/j.isprsjprs.2020.06.012 |
dc.relation.references | Bochkovskiy, A., Wang, C. Y., & Liao, H. Y. M. (2020). YOLOv4: Optimal speed and accuracy of object detection. arXiv preprint arXiv:2004.10934. https://doi.org/10.48550/arXiv.2004.10934 |
dc.relation.references | Bouguettaya, A., Zarzour, H., Kechida, A., & Taberkit, A. M. (2022). Deep learning techniques to classify agricultural crops through UAV imagery: A review. Neural Computing and Applications, 34(12), 9511–9536. doi: 10.1007/S00521-022-07104-9 |
dc.relation.references | Botero, F. (2017). Política y económica del corregimiento de San Cristóbal. Disponible en https://bibliotecasmedellin.gov.co/wp-content/uploads/2018/10/Anexo_San_Cristo%CC%81bal.pdf |
dc.relation.references | Castañeda-Miranda, A., y Castaño-Meneses, V. M. (2020). Smart frost measurement for anti-disaster intelligent control in greenhouses via embedding IoT and hybrid AI methods. Measurement: Journal of the International Measurement Confederation, 164 . doi: 10.1016/j.measurement.2020.108043 |
dc.relation.references | Chamara, N., Bai, G., & Ge, Y. (2023). AICropCAM: Deploying classification, segmentation, detection, and counting deep-learning models for crop monitoring on the edge. Computers and Electronics in Agriculture, 215, 108420. https://doi.org/10.1016/j.compag.2023.108420 |
dc.relation.references | Chen, X., Girshick, R., He, K., & Dollár, P. (2019). TensorMask: A foundation for dense object segmentation. Proceedings of the IEEE International Conference on Computer Vision, 2061–2069. https://doi.org/10.1109/ICCV.2019.00215 |
dc.relation.references | Cheng, B., Collins, M. D., Zhu, Y., Liu, T., Huang, T. S., Adam, H., & Chen, L.-C. (2020). Panoptic-DeepLab: A Simple, Strong, and Fast Baseline for Bottom-Up Panoptic Segmentation. 12475–12485. https://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_Panoptic-DeepLab_A_Simple_Strong_and_Fast_Baseline_for_Bottom-Up_Panoptic_CVPR_2020_paper.html |
dc.relation.references | Chew, R., Rineer, J., Beach, R., O'Neil, M., Ujeneza, N., Lapidus, D., & Temple, D. S. (2020). Deep neural networks and transfer learning for food crop identification in UAV images. Drones, 4(1), 1–14. doi: 10.3390/drones4010007 |
dc.relation.references | Contiu, S., y Groza, A. (2016). Improving remote sensing crop classification by argumentation-based conflict resolution in ensemble learning. Expert Systems with Applications, 64 , 269-286. doi: 10.1016/j.eswa.2016.07.037 |
dc.relation.references | Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). ImageNet: A large-scale hierarchical image database. En: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 248–255. doi: 10.1109/CVPR.2009.5206848 |
dc.relation.references | Departamento Administrativo Nacional de Estadística (DANE). (2019). Encuesta nacional agropecuaria (ENA). |
dc.relation.references | Der Yang, M., Tseng, H. H., Hsu, Y. C., et al. (2020). Real-time crop classification using edge computing and deep learning. En: 2020 IEEE 17th Annual Consumer Communications & Networking Conference (CCNC), IEEE, pp. 1–4. https://doi.org/10.1109/CCNC46108.2020.9045498 |
dc.relation.references | Dijkstra, K., van de Loosdrecht, J., Atsma, W. A., Schomaker, L. R., & Wiering, M. A. (2021). CentroidNetV2: A hybrid deep neural network for small-object segmentation and counting. Neurocomputing, 423, 490–505. doi: 10.1016/j.neucom.2020.10.075 |
dc.relation.references | El-Basioni, B. M. M., y El-Kader, S. M. A. (2020). Laying the foundations for an IoT reference architecture for agricultural application domain. IEEE Access, 8 , 190194-190230. doi: 10.1109/ACCESS.2020.3031634 |
dc.relation.references | Feng, T., Chai, Y., Huang, Y., & Liu, Y. (2019, December). A Real-time Monitoring and Control System for Crop. In Proceedings of the 2019 2nd International Conference on Algorithms, Computing and Artificial Intelligence (pp. 183-188). https://doi.org/10.1145/3377713.3377742 |
dc.relation.references | Ferro, M. V., & Catania, P. (2023). Technologies and Innovative Methods for Precision Viticulture: A Comprehensive Review. Horticulturae, 9(3), 399. |
dc.relation.references | FindLight. (s.f.). Wide Dynamic Range Sensor NSC1005C. Recuperado de https://www.findlight.net/imaging-and-vision/image-sensors/area-scan-sensors/wide-dynamic-range-sensor-nsc1005c |
dc.relation.references | Food and Agriculture Organization (FAO). (2017). The future of food and agriculture: Trends and challenges. |
dc.relation.references | Food and Agriculture Organization (FAO). (2018). FAO's work on agricultural innovation. |
dc.relation.references | FPN. (s.f.). CloudFactory Computer Vision Wiki. Recuperado el 13 de febrero de 2024, de https://wiki.cloudfactory.com/docs/mp-wiki/model-architectures/fpn |
dc.relation.references | Fuentes-Peñailillo, F., Ortega-Farias, S., Rivera, M., Bardeen, M., & Moreno, M. (2018, October). Using clustering algorithms to segment UAV-based RGB images. In 2018 IEEE international conference on automation/XXIII congress of the Chilean association of automatic control (ICA-ACCA) (pp. 1-5). IEEE. doi: 10.1109/ICA-ACCA.2018.8609822 |
dc.relation.references | Fujiwara, R., Nashida, H., Fukushima, M., Suzuki, N., Sato, H., Sanada, Y., & Akiyama, Y. (2022). Convolutional neural network models help effectively estimate legume coverage in grass-legume mixed swards. Frontiers in Plant Science, 12, 763479. |
dc.relation.references | García-Santillán, I. D., & Pajares, G. (2018). On-line crop/weed discrimination through the Mahalanobis distance from images in maize fields. Biosystems Engineering, 166, 28–43. doi: 10.1016/j.biosystemseng.2017.11.003 |
dc.relation.references | Genze, N., Ajekwe, R., Güreli, Z., Haselbeck, F., Grieb, M., & Grimm, D. G. (2022). Deep learning-based early weed segmentation using motion blurred UAV images of sorghum fields. Computers and Electronics in Agriculture, 202, 107388. |
dc.relation.references | Girshick, R. (2015). Fast R-CNN. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1440–1448). |
dc.relation.references | Güler, R. A., Neverova, N., & Kokkinos, I. (2016). DensePose: Dense Human Pose Estimation in the Wild. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 7297–7306. http://arxiv.org/abs/1612.01202 |
dc.relation.references | Hamuda, E., Glavin, M., & Jones, E. (2016). A survey of image processing techniques for plant extraction and segmentation in the field. Computers and Electronics in Agriculture, 125, 184–199. doi: 10.1016/j.compag.2016.04.024 |
dc.relation.references | He, K., Gkioxari, G., Dollár, P., & Girshick, R. (2020). Mask R-CNN. IEEE Transactions on Pattern Analysis and Machine Intelligence, 42(2), 386–397. https://doi.org/10.1109/TPAMI.2018.2844175 |
dc.relation.references | He, K., Zhang, X., Ren, S., et al. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 770–778). https://doi.org/10.1109/CVPR.2016.90 |
dc.relation.references | Howard, A. G., et al. (2017, April). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. [Online]. arXiv preprint arXiv:1704.04861. Disponible en: http://arxiv.org/abs/1704.04861 |
dc.relation.references | Kawamura, K., Asai, H., Yasuda, T., Soisouvanh, P., & Phongchanmixay, S. (2021). Discriminating crops/weeds in an upland rice field from UAV images with the SLIC-RF algorithm. Plant Production Science, 24(2), 198-215. https://doi.org/10.1080/1343943X.2020.1829490 |
dc.relation.references | Kirillov, A., He, K., Girshick, R., Rother, C., & Dollár, P. (2019). Panoptic segmentation. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 9396–9405. https://doi.org/10.1109/CVPR.2019.00963 |
dc.relation.references | Kitano, B. T., Mendes, C. C., Geus, A. R., et al. (2019). Corn plant counting using deep learning and UAV images. IEEE Geoscience and Remote Sensing Letters. https://doi.org/10.1109/LGRS.2019.2930549 |
dc.relation.references | Kitzler, F., Wagentristl, H., Neugschwandtner, R. W., Gronauer, A., & Motsch, V. (2022). Influence of Selected Modeling Parameters on Plant Segmentation Quality Using Decision Tree Classifiers. Agriculture, 12, 1408. https://doi.org/10.3390/agriculture12091408 |
dc.relation.references | Koirala, A., Walsh, K., Wang, Z., et al. (2019). Deep learning for real-time fruit detection and orchard fruit load estimation: benchmarking of ‘mangoyolo’. Precision Agriculture, 20(6), 1107–1135. https://doi.org/10.1007/s11119-019-09642-0 |
dc.relation.references | LeCun, Y., Bottou, L., Bengio, Y., et al. (1998). Gradient-based learning applied to document recognition. Proc IEEE, 86(11), 2278–2324. https://doi.org/10.1109/5.726791 |
dc.relation.references | Li, L., Mu, X., Jiang, H., Chianucci, F., Hu, R., Song, W., & Yan, G. (2023). Review of ground and aerial methods for vegetation cover fraction (FCover) and related quantities estimation: Definitions, advances, challenges, and future perspectives. ISPRS Journal of Photogrammetry and Remote Sensing, 199, 133–156. doi: 10.1016/j.isprsjprs.2023.03.020 |
dc.relation.references | Li, W., Fu, H., Yu, L., y Cracknell, A. (2017). Deep learning based oil palm tree detection and counting for high-resolution remote sensing images. Remote Sensing, 9. doi: 10.3390/rs9010022 |
dc.relation.references | Lin, T. Y., Goyal, P., Girshick, R., He, K., & Dollár, P. (2017). Focal loss for dense object detection. In Proceedings of the IEEE international conference on computer vision (pp. 2980-2988). |
dc.relation.references | Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C. Y., & Berg, A. C. (2016). SSD: Single shot multibox detector. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9905 LNCS, 21–37. https://doi.org/10.1007/978-3-319-46448-0_2 |
dc.relation.references | Liu, H., Qi, Y., Xiao, W., Tian, H., Zhao, D., Zhang, K., Xiao, J., Lu, X., Lan, Y., & Zhang, Y. (2022). Identification of Male and Female Parents for Hybrid Rice Seed Production Using UAV-Based Multispectral Imagery. Agriculture, 12(7), 1005. https://doi.org/10.3390/agriculture12071005 |
dc.relation.references | Lohi, S. A., & Bhatt, C. (2022). Empirical Analysis of Crop Yield Prediction and Disease Detection Systems: A Statistical Perspective. ICT Infrastructure and Computing: Proceedings of ICT4SD 2022, 49-57. |
dc.relation.references | Long, J., Shelhamer, E., & Darrell, T. (2015). Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 3431-3440). |
dc.relation.references | Lottes, P., Hörferlin, M., Sander, S., & Stachniss, C. (2017). Effective vision-based classification for separating sugar beets and weeds for precision farming. Journal of Field Robotics, 34, 1160–1178. doi: 10.1002/rob.21675 |
dc.relation.references | Lottes, P., Khanna, R., Pfeifer, J., Siegwart, R., y Stachniss, C. (2017). UAV-based crop and weed classification for smart farming. Proceedings - IEEE International Conference on Robotics and Automation, 3024-3031. doi: 10.1109/ICRA.2017.7989347 |
dc.relation.references | Lu, Y., Young, S., Wang, H., & Wijewardane, N. (2022). Robust plant segmentation of color images based on image contrast optimization. Computers and Electronics in Agriculture, 193, 106711. https://doi.org/10.1016/j.compag.2022.106711 |
dc.relation.references | Machefer, M., Lemarchand, F., Bonnefond, V., et al. (2020). Mask R-CNN refitting strategy for plant counting and sizing in UAV imagery. Remote Sensing, 12(18). https://doi.org/10.3390/rs12183015 |
dc.relation.references | Mardanisamani, S., & Eramian, M. (2022). Segmentation of vegetation and microplots in aerial agriculture images: A survey. Plant Phenome Journal, 5(1). |
dc.relation.references | Mateen, A., & Zhu, Q. (2019). Weed detection in wheat crop using UAV for precision agriculture. Pakistan Journal of Agricultural Sciences, 56(3), 809–817. doi: 10.21162/PAKJAS/19.8116 |
dc.relation.references | Maulit, A., Nugumanova, A., Apayev, K., Baiburin, Y., & Sutula, M. (2023). A multispectral UAV imagery dataset of wheat, soybean and barley crops in East Kazakhstan. Data, 8(5). doi: 10.3390/data8050088 |
dc.relation.references | Milioto, A., Lottes, P., & Stachniss, C. (2018, May). Real-time semantic segmentation of crop and weed for precision agriculture robots leveraging background knowledge in CNNs. In 2018 IEEE international conference on robotics and automation (ICRA) (pp. 2229-2235). IEEE. doi: 10.1109/ICRA.2018.8460962. |
dc.relation.references | Morales, G., Kemper, G., Sevillano, G., et al. (2018). Automatic segmentation of Mauritia flexuosa in unmanned aerial vehicle (UAV) imagery using deep learning. Forests, 9(12). https://doi.org/10.3390/f9120736 |
dc.relation.references | Mortimer, A. M. (2000). Capítulo 2. La clasificación y ecología de las malezas. FAO. Recuperado de https://www.fao.org/3/T1147S/t1147s06.htm |
dc.relation.references | Mu, Y., Ni, R., Fu, L., Luo, T., Feng, R., Li, J., & Li, S. (2023). DenseNet weed recognition model combining local variance preprocessing and attention mechanism. Frontiers in Plant Science, 13, 1041510. https://doi.org/10.3389/fpls.2022.1041510 |
dc.relation.references | Mukherjee, S. (2022, agosto 18). The annotated ResNet-50. Towards Data Science. https://towardsdatascience.com/the-annotated-resnet-50-a6c536034758 |
dc.relation.references | MyBotShop. (s.f.). Clearpath Husky A200. Recuperado de https://www.mybotshop.de/Clearpath-Husky-A200_3 |
dc.relation.references | Neupane, B., Horanont, T., & Hung, N. D. (2019). Deep learning based banana plant detection and counting using high-resolution red-green-blue (RGB) images collected from unmanned aerial vehicle (UAV). PLoS One, 14(10), e0223906. https://doi.org/1 |
dc.relation.references | Ngo, U. Q., Ngo, D. T., Nguyen, H. T., & Bui, T. D. (2022). Digital image processing methods for estimating leaf area of cucumber plants. Indonesian Journal of Electrical Engineering and Computer Science, 25(1), 317–328. doi: 10.11591/ijeecs.v25.i1.pp317-328 |
dc.relation.references | Pashaei, M., Kamangir, H., Starek, M. J., & Tissot, P. (2020). Review and Evaluation of Deep Learning Architectures for Efficient Land Cover Mapping with UAS Hyper-Spatial Imagery: A Case Study Over a Wetland. Remote Sensing, 12(6), 959. https://doi.org/10.3390/rs12060959 |
dc.relation.references | Patidar, P. K., Tomar, D. S., Pateriya, R. K., & Sharma, Y. K. (2023, May). Precision Agriculture: Crop Image Segmentation and Loss Evaluation through Drone Surveillance. In 2023 Third International Conference on Secure Cyber Computing and Communication (ICSCCC) (pp. 495-500). IEEE. doi: 10.1109/ICSCCC58608.2023.10176980 |
dc.relation.references | Pierce, F. J., & Nowak, P. (1999). Aspects of precision agriculture. En D. L. Sparks (Ed.), (Vol. 67, pp. 1–85). Academic Press. doi: 10.1016/S0065-2113(08)60513-1 |
dc.relation.references | Puerta-Zapata, J., Cadavid-Castro, M. A., Montoya-Betancur, K. V., & Álvarez-Castaño, L. S. (2023). Distribución tradicional y corporativa de alimentos en una zona urbana: estudio de casos colectivos en San Cristóbal, Medellín-Colombia. Revista de Investigación, Desarrollo e Innovación, 13(1), 157-172. https://doi.org/10.19053/20278306.v13.n1.2023.16058 |
dc.relation.references | Qamar, T., & Bawany, N. Z. (2023). Agri-PAD: a scalable framework for smart agriculture. Indonesian Journal of Electrical Engineering and Computer Science, 29(3), 1597-1605. doi:10.11591/ijeecs.v29.i3.pp1597-1605 |
dc.relation.references | Quan, L., Jiang, W., Li, H., Li, H., Wang, Q., & Chen, L. (2022). Intelligent intra-row robotic weeding system combining deep learning technology with a targeted weeding mode. Biosystems Engineering, 216, 13-31. |
dc.relation.references | Quiroz, R. A. A., Guidotti, F. P., & Bedoya, A. E. (2019). A method for automatic identification of crop lines in drone images from a mango tree plantation using segmentation over YCrCb color space and Hough transform. 2019 22nd Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2019 - Conference Proceedings. doi: 10.1109/STSIVA.2019.8730214 |
dc.relation.references | Radoglou-Grammatikis, P., Sarigiannidis, P., Lagkas, T., & Moscholios, I. (2020). A compilation of UAV applications for precision agriculture. Computer Networks, 172, 107148. doi: 10.1016/j.comnet.2020.107148 |
dc.relation.references | Rampersad, H. (2020). Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. Total Performance Scorecard, 159–183. https://doi.org/10.4324/9780080519340-12 |
dc.relation.references | Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 779–788. https://doi.org/10.1109/CVPR.2016.91 |
dc.relation.references | Rehman, T. U., Zaman, Q. U., Chang, Y. K., Schumann, A. W., & Corscadden, K. W. (2019). Development and field evaluation of a machine vision based in-season weed detection system for wild blueberry. Computers and Electronics in Agriculture, 162, 1-13. https://doi.org/10.1016/j.compag.2019.03.023 |
dc.relation.references | Ren, S., He, K., Girshick, R., & Sun, J. (2015). Faster R-CNN: Towards real-time object detection with region proposal networks. Advances in Neural Information Processing Systems, 28. |
dc.relation.references | Restrepo-Arias, J. (2023). Método de clasificación de imágenes, empleando técnicas de inteligencia artificial, integrado a una plataforma IoT de agricultura inteligente. Universidad Nacional de Colombia. https://repositorio.unal.edu.co/handle/unal/83849 |
dc.relation.references | Ronneberger, O., Fischer, P., & Brox, T. (2015). U-Net: Convolutional Networks for Biomedical Image Segmentation. In N. Navab, J. Hornegger, W. M. Wells, & A. F. Frangi (Eds.), Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015 (pp. 234–241). Springer International Publishing. https://doi.org/10.1007/978-3-319-24574-4_28 |
dc.relation.references | Roychowdhury, S. (2021). U-net-for-Multi-class-semantic-segmentation. Recuperado de: https://github.com/sohiniroych/U-net-for-Multi-class-semantic-segmentation |
dc.relation.references | Roychowdhury, S., Koozekanani, D. D., & Parhi, K. K. (2014, septiembre). DREAM: Diabetic Retinopathy Analysis Using Machine Learning. IEEE Journal of Biomedical and Health Informatics, 18(5), 1717-1728. https://doi.org/10.1109/JBHI.2013.2294635 |
dc.relation.references | Saiz-Rubio, V., & Rovira-Más, F. (2020). From smart farming towards agriculture 5.0: A review on crop data management. Agronomy, 10(2). MDPI. doi: 10.3390/agronomy10020207 |
dc.relation.references | Salvador Lopez, J. (2022). Aprendizaje profundo para Análisis de Maquetación en documentos manuscritos. Universitat Politècnica de València. http://hdl.handle.net/10251/186330. |
dc.relation.references | Santos, A. A., Marcato Junior, J., Araújo, M. S., et al. (2019). Assessment of CNN-based methods for individual tree detection on images captured by RGB cameras attached to UAVs. Sensors, 19(16), 3595. https://doi.org/10.3390/s19163595 |
dc.relation.references | SkyMotion. (s.f.). DJI Mini 2 SE - Skymotion. Recuperado de https://skymotion.com.co/products/dji-mini-2-se?variant=47192926126397 |
dc.relation.references | Schrijver, R. (2016). Precision agriculture and the future of farming in Europe: Scientific foresight study: Study. European Parliament. |
dc.relation.references | Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556. |
dc.relation.references | Song, Z., Zhang, Z., Yang, S., et al. (2020). Identifying sunflower lodging based on image fusion and deep semantic segmentation with UAV remote sensing imaging. Computers and Electronics in Agriculture, 179, 105812. https://doi.org/10.1016/j.compag.2020.105812 |
dc.relation.references | Suh, H. K., Hofstee, J. W., & Van Henten, E. J. (2020). Investigation on combinations of colour indices and threshold techniques in vegetation segmentation for volunteer potato control in sugar beet. Computers and Electronics in Agriculture, 179, 105819. |
dc.relation.references | Tan, C., Zhang, P., Zhang, Y., Zhou, X., Wang, Z., Du, Y., ... & Guo, W. (2020). Rapid recognition of field-grown wheat spikes based on a superpixel segmentation algorithm using digital images. Frontiers in Plant Science, 11, 259. https://doi.org/10.3389/fpls.2020.00259 |
dc.relation.references | Tan, M., & Le, Q. V. (2019). EfficientNet: Rethinking model scaling for convolutional neural networks. In 36th International Conference on Machine Learning, ICML 2019 (Vol. 2019-June, pp. 10691–10700). |
dc.relation.references | Torrey, L., & Shavlik, J. (2010). Transfer learning. En Handbook of Research on Machine Learning Applications (pp. 657–665). IGI Global. |
dc.relation.references | Triantafyllou, A., Sarigiannidis, P., y Bibi, S. (2019). Precision agriculture: A remote sensing monitoring system architecture. Information (Switzerland), 10. doi: 10.3390/info10110348 |
dc.relation.references | Trivelli, L., Apicella, A., Chiarello, F., Rana, R., Fantoni, G., y Tarabella, A. (2019). From precision agriculture to industry 4.0: Unveiling technological connections in the agrifood sector. British Food Journal, 121(8), 1730–1743. |
dc.relation.references | Unidad Administrativa Especial de Aeronáutica Civil (UAEAC). (2023). RAC 91: Reglas generales de vuelo y operación. |
dc.relation.references | United Nations Development Programme (UNDP). (2021). What are the sustainable development goals? Descargado el 7 de noviembre de 2023. |
dc.relation.references | velog. (s.f.). Velog.io. Recuperado el 13 de febrero de 2024, de https://velog.io/@skhim520/DeepLab-v3 |
dc.relation.references | Wang, X., Jiang, G., Zhang, H., Zhao, H., Chen, Y., Mei, C., & Jia, Z. (2020). Grayscale distribution of maize canopy based on HLS-SVM method. International Journal of Food Properties, 23(1), 839–852. doi: 10.1080/10942912.2020.1758717 |
dc.relation.references | Wang, J., Yao, X., & Nguyen, B. K. (2022, October 12). Identification and localisation of multiple weeds in grassland for removal operation. In Proc. SPIE 12342, Fourteenth International Conference on Digital Image Processing (ICDIP 2022) (p. 123420Z). https://doi.org/10.1117/12.2644281 |
dc.relation.references | Wu, J., Yang, G., Yang, H., et al. (2020). Extracting apple tree crown information from remote imagery using deep learning. Computers and Electronics in Agriculture, 174, 105504. https://doi.org/10.1016/j.compag.2020.105504 |
dc.relation.references | Wu, Y., Kirillov, A., Massa, F., Lo, W.-Y., & Girshick, R. (2019). Detectron2. https://github.com/facebookresearch/detectron2 |
dc.relation.references | Xu, K., Li, H., Cao, W., Zhu, Y., Chen, R., & Ni, J. (2020). Recognition of weeds in wheat fields based on the fusion of RGB images and depth images. IEEE Access, 8, 110362-110370. |
dc.relation.references | Xu, B., Fan, J., Chao, J., Arsenijevic, N., Werle, R., & Zhang, Z. (2023). Instance segmentation method for weed detection using UAV imagery in soybean fields. Computers and Electronics in Agriculture, 211. doi: 10.1016/j.compag.2023.107994 |
dc.relation.references | Yang, M. D., Boubin, J. G., Tsai, H. P., Tseng, H. H., Hsu, Y. C., & Stewart, C. C. (2020). Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet. Computers and Electronics in Agriculture, 179, 105817. https://doi.org/10.1016/j.compag.2020.105817 |
dc.relation.references | Yang, L., Bi, P., Tang, H., Zhang, F., & Wang, Z. (2022). Improving vegetation segmentation with shadow effects based on double input networks using polarization images. Computers and Electronics in Agriculture, 199, 107123. https://doi.org/10.1016/j.compag.2022.107123 |
dc.relation.references | Yang, M. D., Tseng, H. H., Hsu, Y. C., et al. (2020). Semantic segmentation using deep learning with vegetation indices for rice lodging identification in multi-date UAV visible images. Remote Sensing, 12(4). https://doi.org/10.3390/rs12040633 |
dc.relation.references | You, K., Liu, Y., Wang, J., & Long, M. (2021). LogME: Practical Assessment of Pre-trained Models for Transfer Learning. http://arxiv.org/abs/2102.11005 |
dc.relation.references | Yuan, J., Xue, B., Zhang, W., Xu, L., Sun, H., & Zhou, J. (2019). RPN-FCN based Rust detection on power equipment. Procedia Computer Science, 147, 349–353. https://doi.org/10.1016/j.procs.2019.01.236 |
dc.relation.references | Zhang, J., Zhao, B., Yang, C., Shi, Y., Liao, Q., Zhou, G., & Xie, J. (2020). Rapeseed stand count estimation at leaf development stages with UAV imagery and convolutional neural networks. Frontiers in Plant Science, 11. https://doi.org/10.3389/fpls.2020.00617 |
dc.relation.references | Zhang, X., Wang, Z., Liu, D., Lin, Q., & Ling, Q. (2021). Deep adversarial data augmentation for extremely low data regimes. IEEE Transactions on Circuits and Systems for Video Technology, 31, 15–28. https://doi.org/10.1109/TCSVT.2020.2967419 |
dc.relation.references | Zhang, Y., Wang, C., Wang, Y., & Cheng, P. (2022). Determining the stir-frying degree of Gardeniae Fructus Praeparatus based on deep learning and transfer learning. Sensors, 22(21). https://doi.org/10.3390/s22218091 |
dc.relation.references | Zheng, H., Zhou, X., He, J., Yao, X., Cheng, T., Zhu, Y., ... & Tian, Y. (2020). Early season detection of rice plants using RGB, NIR-GB and multispectral images from unmanned aerial vehicle (UAV). Computers and Electronics in Agriculture, 169, 105223. https://doi.org/10.1016/j.compag.2020.105223 |
dc.relation.references | Zhao, H., Shi, J., Qi, X., Wang, X., & Jia, J. (2017). Pyramid scene parsing network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 2881–2890). https://openaccess.thecvf.com/content_cvpr_2017/html/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.html |
dc.relation.references | Zhuang, F., Qi, Z., Duan, K., Xi, D., Zhu, Y., Zhu, H., Xiong, H., & He, Q. (2021). A Comprehensive Survey on Transfer Learning. Proceedings of the IEEE, 109(1), 43–76. https://doi.org/10.1109/JPROC.2020.3004555 |
dc.relation.references | Zhuang, S., Wang, P., & Jiang, B. (2020). Vegetation extraction in the field using multi-level features. Biosystems Engineering, 197, 352–366. https://doi.org/10.1016/j.biosystemseng.2020.07.013 |
dc.rights.accessrights | info:eu-repo/semantics/openAccess |
dc.subject.lemb | Procesamiento de imágenes |
dc.subject.proposal | Agricultura Inteligente |
dc.subject.proposal | imágenes aéreas |
dc.subject.proposal | VANTs |
dc.subject.proposal | Aprendizaje profundo |
dc.subject.proposal | Redes Neuronales Convolucionales |
dc.subject.proposal | Smart Farming |
dc.subject.proposal | aerial imagery |
dc.subject.proposal | UAVs |
dc.subject.proposal | Deep Learning |
dc.subject.proposal | Convolutional neural networks |
dc.title.translated | Method for the classification of small-scale agricultural crops using deep learning techniques |
dc.type.coar | http://purl.org/coar/resource_type/c_bdcc |
dc.type.coarversion | http://purl.org/coar/version/c_ab4af688f83e57aa |
dc.type.content | Text |
dc.type.redcol | http://purl.org/redcol/resource_type/TM |
oaire.accessrights | http://purl.org/coar/access_right/c_abf2 |
dcterms.audience.professionaldevelopment | Students |
dcterms.audience.professionaldevelopment | Researchers |
dcterms.audience.professionaldevelopment | Teachers |
dc.description.curriculararea | Área Curricular de Ingeniería de Sistemas e Informática |
dc.contributor.orcid | Arregocés Guerra, Paulina [0000-0001-9567-0231] |
dc.subject.wikidata | Redes neuronales convolucionales |