Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones

dc.contributor.advisorGonzalez Osorio, Fabio Augusto
dc.contributor.authorArias Vanegas, Victor Alfonso
dc.contributor.researchgroupMachine Learning Perception and Discovery Lab (MindLab)spa
dc.date.accessioned2022-10-04T13:13:35Z
dc.date.available2022-10-04T13:13:35Z
dc.date.issued2022-10-02
dc.descriptionilustraciones, fotografías a color, gráficasspa
dc.description.abstractLa maleza o malas hierbas se define como una planta que crece de forma silvestre en un lugar indeseable para la actividad agrícola. Esto es debido a que compite por los recursos limitados disponibles en un sector previamente destinado y acondicionado a la producción de alimentos u otras actividades específicas, disminuyendo su rendimiento. Tradicionalmente los granjeros aplican la escarda o eliminación de malas hierbas con herramientas manuales, haciendo de este un proceso lento y costoso debido a la gran cantidad de mano de obra necesaria. Con el fin de reducir el número de trabajadores en la labor, agentes químicos de acción selectiva son usados directamente sobre el cultivo para matar la planta invasora; sin embargo, en grandes extensiones de terreno es difícil conocer previamente la distribución espacial de la maleza, por lo que la aplicación del agente se hace de manera uniforme en toda la plantación, llevando a un mayor desperdicio del producto y por ende un incremento en los costos. En este documento se propone una estrategia para la detección automática de la distribución espacial de la maleza en un terreno cultivado usando algoritmos de aprendizaje profundo (DL) en imágenes multiespectrales. Para probar el desempeño de la estrategia se utilizó una base de datos de imágenes recolectada por un vehículo aéreo no tripulado (VANT). Las bases de datos empleadas proporcionan las imágenes multiespectrales y su respectiva máscara; esta última representa la información semántica de cada uno de los pixeles de la imagen. La información semántica se constituye a partir de tres colores, cada uno de ellos perteneciente a una clase de interés: el rojo representa la maleza, el verde representa el cultivo y el negro representa el fondo o todo aquello que no es vegetal en el mapa. Adicionalmente, el problema se abordó como un problema de segmentación semántica y la estrategia de solución fue un algoritmo de DL.
Al aplicar la solución a las imágenes se evidencia una mejora en las diferentes métricas usadas en la literatura para estas bases de datos, tales como el AUC y el F1-score; además, se evidencian excelentes resultados en las máscaras predichas para los datos de prueba. Por último, se analiza el aporte de los diferentes canales multiespectrales y de técnicas clásicas de preprocesamiento de imágenes a las métricas del modelo, además de la capacidad de este para generar buenas representaciones semánticas del terreno captado por el sensor. (Texto tomado de la fuente)spa
dc.description.abstractA weed is defined as a plant that grows wild in a place undesirable for agricultural crops. This is because it competes for the limited resources available in a sector previously set aside and prepared for food production or other specific activities, decreasing its yield. Traditionally, farmers remove weeds with hand tools, making weeding a slow and costly process due to the large amount of labor required. To reduce the number of workers involved, selective chemical agents are applied directly on the crop to kill the invasive plant. However, on large tracts of land it is difficult to know the spatial distribution of the weeds beforehand, so the agent is applied uniformly throughout the plantation, leading to greater waste of the product and therefore an increase in costs. This thesis proposes a strategy for the automatic detection of the spatial distribution of weeds in a cultivated field using deep learning (DL) algorithms on multispectral images. An image database collected by an unmanned aerial vehicle (UAV) was used to test the performance of the strategy. The databases used provide the multispectral images and their respective masks; each mask encodes the semantic information of every pixel of the image using three colors, each belonging to a class of interest: red represents the weeds, green represents the crop, and black represents the background, i.e., everything that is not vegetation on the map. The problem was approached as a semantic segmentation problem, and the solution strategy was a DL algorithm. Applying the solution to the images yields an improvement in the metrics used in the literature for these databases, such as AUC and F1-score, as well as excellent results in the predicted masks for the test data.
Finally, the contribution of the different multispectral channels and classical image preprocessing techniques to the model metrics is analyzed, as well as the model’s ability to generate good semantic representations of the terrain captured by the sensor.eng
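The three-color mask encoding described in the abstracts (red = weed, green = crop, black = background) maps naturally to integer class labels for semantic segmentation. A minimal sketch of that decoding step is shown below; the exact RGB values and class indices are assumptions for illustration and may differ from the conventions used in the thesis datasets.

```python
import numpy as np

# Assumed color-to-class convention (hypothetical, for illustration only):
# black -> background, green -> crop, red -> weed.
COLOR_TO_CLASS = {
    (0, 0, 0): 0,      # background (non-vegetation)
    (0, 255, 0): 1,    # crop
    (255, 0, 0): 2,    # weed
}

def mask_to_labels(mask_rgb: np.ndarray) -> np.ndarray:
    """Convert an (H, W, 3) RGB ground-truth mask into an (H, W) label map."""
    labels = np.zeros(mask_rgb.shape[:2], dtype=np.uint8)
    for color, cls in COLOR_TO_CLASS.items():
        # Mark every pixel whose three channels exactly match this color.
        labels[np.all(mask_rgb == np.array(color, dtype=np.uint8), axis=-1)] = cls
    return labels
```

A label map in this form is what pixel-wise loss functions (e.g. cross-entropy) in common DL frameworks expect as the segmentation target.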
dc.description.degreelevelMaestríaspa
dc.description.degreenameMagíster en Ingeniería - Ingeniería de Sistemas y Computaciónspa
dc.description.researchareaProcesamiento digital de imágenes.spa
dc.description.sponsorshipColcienciasspa
dc.format.extentxii, 45 páginasspa
dc.format.mimetypeapplication/pdfspa
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/82351
dc.language.isospaspa
dc.publisherUniversidad Nacional de Colombiaspa
dc.publisher.branchUniversidad Nacional de Colombia - Sede Bogotáspa
dc.publisher.facultyFacultad de Ingenieríaspa
dc.publisher.placeBogotá, Colombiaspa
dc.publisher.programBogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computaciónspa
dc.relation.indexedBiremespa
dc.relation.indexedRedColspa
dc.relation.referencesStephen O Duke. Perspectives on transgenic, herbicide-resistant crops in the United States almost 20 years after introduction. Pest Management Science, 71(5):652–657, 2015.spa
dc.relation.referencesAlexa Varah, Kwadjo Ahodo, Shaun R Coutts, Helen L Hicks, David Comont, Laura Crook, Richard Hull, Paul Neve, Dylan Z Childs, Robert P Freckleton, et al. The costs of human-induced evolution in an agricultural system. Nature sustainability, 3(1):63–71, 2020.spa
dc.relation.referencesAlessandro dos Santos Ferreira, Daniel Matte Freitas, Gercina Gonçalves da Silva, Hemerson Pistori, and Marcelo Theophilo Folhes. Weed detection in soybean crops using convnets. Computers and Electronics in Agriculture, 143:314–324, 2017.spa
dc.relation.referencesShirley A Briggs. Basic guide to pesticides: their characteristics and hazards. CRC Press, 2018.spa
dc.relation.referencesIsabelle Schuster, Henning Nordmeyer, and Thomas Rath. Comparison of vision-based and manual weed mapping in sugar beet. Biosystems engineering, 98(1):17–25, 2007.spa
dc.relation.referencesDavid Pimentel, Herbert Acquay, Michael Biltonen, P Rice, M Silva, J Nelson, V Lipner, S Giordano, A Horowitz, and M D'amore. Environmental and economic costs of pesticide use. BioScience, 42(10):750–760, 1992.spa
dc.relation.referencesK Neil Harker and John T O’Donovan. Recent weed control, weed management, and integrated weed management. Weed Technology, 27(1):1–11, 2013.spa
dc.relation.referencesMulham Fawakherji, Ali Youssef, Domenico D Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weed classification using pixel-wise segmentation on ground and aerial images. Int. J. Robot. Comput, 2(1):39–57, 2020.spa
dc.relation.referencesDavid R Shaw. Remote sensing and site-specific weed management. Frontiers in Ecology and the Environment, 3(10):526–532, 2005.spa
dc.relation.referencesColin Birch, Ian Cooper, Gurjeet Gill, Stephen Adkins, and Madan Gupta. Weed management in rainfed agricultural systems. In Rainfed Farming Systems, pages 215– 232. Springer, 2011.spa
dc.relation.referencesPhilipp Lottes, Jens Behley, Nived Chebrolu, Andres Milioto, and Cyrill Stachniss. Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming. Journal of Field Robotics, 37(1):20–34, 2020.spa
dc.relation.referencesInkyu Sa, Marija Popović, Raghav Khanna, Zetao Chen, Philipp Lottes, Frank Liebisch, Juan Nieto, Cyrill Stachniss, Achim Walter, and Roland Siegwart. Weedmap: a large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sensing, 10(9):1423, 2018.spa
dc.relation.referencesJorge Torres-Sánchez, José Manuel Peña, Ana Isabel de Castro, and Francisca López-Granados. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from uav. Computers and Electronics in Agriculture, 103:104–113, 2014.spa
dc.relation.referencesChang-chun Li, Guang-sheng Zhang, Tian-jie Lei, and A-du Gong. Quick image-processing method of uav without control points data in earthquake disaster area. Transactions of Nonferrous Metals Society of China, 21:s523–s528, 2011.spa
dc.relation.referencesAndreas Kamilaris and Francesc X Prenafeta-Boldú. Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147:70–90, 2018.spa
dc.relation.referencesKonstantinos G Liakos, Patrizia Busato, Dimitrios Moshou, Simon Pearson, and Dionysis Bochtis. Machine learning in agriculture: A review. Sensors, 18(8):2674, 2018.spa
dc.relation.referencesDimosthenis C Tsouros, Stamatia Bibi, and Panagiotis G Sarigiannidis. A review on uav-based applications for precision agriculture. Information, 10(11):349, 2019.spa
dc.relation.referencesHuasheng Huang, Yubin Lan, Aqing Yang, Yali Zhang, Sheng Wen, and Jizhong Deng. Deep learning versus object-based image analysis (obia) in weed mapping of uav imagery. International Journal of Remote Sensing, 41(9):3446–3479, 2020.spa
dc.relation.referencesInkyu Sa, Zetao Chen, Marija Popović, Raghav Khanna, Frank Liebisch, Juan Nieto, and Roland Siegwart. weednet: Dense semantic weed classification using multispectral images and mav for smart farming. IEEE Robotics and Automation Letters, 3(1):588–595, 2017.spa
dc.relation.referencesAlwaseela Abdalla, Haiyan Cen, Liang Wan, Reem Rashid, Haiyong Weng, Weijun Zhou, and Yong He. Fine-tuning convolutional neural network with transfer learning for semantic segmentation of ground-level oilseed rape images in a field with high weed pressure. Computers and Electronics in Agriculture, 167:105091, 2019.spa
dc.relation.referencesKavir Osorio, Andrés Puerto, Cesar Pedraza, David Jamaica, and Leonardo Rodríguez. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering, 2(3):471–488, 2020.spa
dc.relation.referencesSigurbjörn Jónsson. Rgb and multispectral uav image classification of agricultural fields using a machine learning algorithm. Student thesis series INES, 2019.spa
dc.relation.referencesASM Mahmudul Hasan, Ferdous Sohel, Dean Diepeveen, Hamid Laga, and Michael GK Jones. A survey of deep learning techniques for weed detection from images. Computers and Electronics in Agriculture, 184:106067, 2021.spa
dc.relation.referencesAurélien Géron. Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O'Reilly Media, Inc., 2019.spa
dc.relation.referencesCalvin Hung, Zhe Xu, and Salah Sukkarieh. Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a uav. Remote Sensing, 6(12):12037–12054, 2014.spa
dc.relation.referencesYann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. nature, 521(7553):436–444, 2015.spa
dc.relation.referencesAston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola. Dive into deep learning. arXiv preprint arXiv:2106.11342, 2021.spa
dc.relation.referencesHuasheng Huang, Jizhong Deng, Yubin Lan, Aqing Yang, Xiaoling Deng, and Lei Zhang. A fully convolutional network for weed mapping of unmanned aerial vehicle (uav) imagery. PloS one, 13(4):e0196302, 2018.spa
dc.relation.referencesJonathan Long, Evan Shelhamer, and Trevor Darrell. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 3431–3440, 2015.spa
dc.relation.referencesSebastian Haug and Jörn Ostermann. A crop/weed field image dataset for the evaluation of computer vision based precision agriculture tasks. In Lourdes Agapito, Michael M. Bronstein, and Carsten Rother, editors, Computer Vision - ECCV 2014 Workshops, pages 105–116, Cham, 2015. Springer International Publishing.spa
dc.relation.referencesSebastian Haug, Andreas Michaels, Peter Biber, and Jörn Ostermann. Plant classification system for crop/weed discrimination without segmentation. In IEEE winter conference on applications of computer vision, pages 1142–1149. IEEE, 2014.spa
dc.relation.referencesJosé Manuel Peña, Jorge Torres-Sánchez, Ana Isabel de Castro, Maggi Kelly, and Francisca López-Granados. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (uav) images. PLOS ONE, 8(10), 2013.spa
dc.relation.referencesMaría Pérez-Ortiz, JM Peña, Pedro Antonio Gutiérrez, Jorge Torres-Sánchez, César Hervás-Martínez, and Francisca López-Granados. A semi-supervised system for weed mapping in sunflower crops using unmanned aerial vehicles and a crop row detection method. Applied Soft Computing, 37:533–544, 2015.spa
dc.relation.referencesThomas K Alexandridis, Afroditi Alexandra Tamouridou, Xanthoula Eirini Pantazi, Anastasia L Lagopodi, Javid Kashefi, Georgios Ovakoglou, Vassilios Polychronos, and Dimitrios Moshou. Novelty detection classifiers in weed mapping: Silybum marianum detection on uav multispectral images. Sensors, 17(9):2007, 2017.spa
dc.relation.referencesPhilipp Lottes, Raghav Khanna, Johannes Pfeifer, Roland Siegwart, and Cyrill Stachniss. Uav-based crop and weed classification for smart farming. In 2017 IEEE International Conference on Robotics and Automation (ICRA), pages 3024–3031. IEEE, 2017.spa
dc.relation.referencesAnders Krogh Mortensen, Mads Dyrmann, Henrik Karstoft, R Nyholm Jørgensen, René Gislum, et al. Semantic segmentation of mixed crops using deep convolutional neural network. In CIGR-AgEng Conference, 26-29 June 2016, Aarhus, Denmark. Abstracts and Full papers, pages 1–6. Organising Committee, CIGR 2016, 2016.spa
dc.relation.referencesM. Dyrmann, R. N. Jørgensen, and H. S. Midtiby. Roboweedsupport - detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. Advances in Animal Biosciences, 8(2):842–847, 2017.spa
dc.relation.referencesVijay Badrinarayanan, Alex Kendall, and Roberto Cipolla. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE transactions on pattern analysis and machine intelligence, 39(12):2481–2495, 2017.spa
dc.relation.referencesMaurilio Di Cicco, Ciro Potena, Giorgio Grisetti, and Alberto Pretto. Automatic model based dataset generation for fast and accurate crop and weeds detection. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 5188–5195. IEEE, 2017.spa
dc.relation.referencesHuasheng Huang, Yubin Lan, Jizhong Deng, Aqing Yang, Xiaoling Deng, Lei Zhang, and Sheng Wen. A semantic labeling approach for accurate weed mapping of high resolution uav imagery. Sensors, 18(7):2113, 2018.spa
dc.relation.referencesSoren Skovsen, Mads Dyrmann, Anders K Mortensen, Morten S Laursen, René Gislum, Jørgen Eriksen, Sadaf Farkhani, Henrik Karstoft, and Rasmus N Jørgensen. The grassclover image dataset for semantic and hierarchical species understanding in agriculture. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, pages 0–0, 2019.spa
dc.relation.referencesYannik Rist, Iurii Shendryk, Foivos Diakogiannis, and Shaun Levick. Weed mapping using very high resolution satellite imagery and fully convolutional neural network. In IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, pages 9784–9787. IEEE, 2019.spa
dc.relation.referencesFoivos I Diakogiannis, François Waldner, Peter Caccetta, and Chen Wu. Resunet-a: A deep learning framework for semantic segmentation of remotely sensed data. ISPRS Journal of Photogrammetry and Remote Sensing, 162:94–114, 2020.spa
dc.relation.referencesShyam Prasad Adhikari, Heechan Yang, and Hyongsuk Kim. Learning semantic graph- ics using convolutional encoder–decoder network for autonomous weeding in paddy. Frontiers in plant science, 10:1404, 2019.spa
dc.relation.referencesOlaf Ronneberger, Philipp Fischer, and Thomas Brox. U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical image computing and computer-assisted intervention, pages 234–241. Springer, 2015.spa
dc.relation.referencesLiang-Chieh Chen, George Papandreou, Florian Schroff, and Hartwig Adam. Rethinking atrous convolution for semantic image segmentation. arXiv preprint arXiv:1706.05587, 2017.spa
dc.relation.referencesAnderson Brilhador, Matheus Gutoski, Leandro Takeshi Hattori, Andrei de Souza Inácio, André Eugênio Lazzaretti, and Heitor Silvério Lopes. Classification of weeds and crops at the pixel-level using convolutional neural networks and data augmentation. In 2019 IEEE Latin American Conference on Computational Intelligence (LA-CCI), pages 1–6. IEEE, 2019.spa
dc.relation.referencesMulham Fawakherji, Ali Youssef, Domenico Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weeds classification for precision agriculture using context-independent pixel-wise segmentation. In 2019 Third IEEE International Conference on Robotic Computing (IRC), pages 146–152. IEEE, 2019.spa
dc.relation.referencesKaren Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.spa
dc.relation.referencesMuhammad Hamza Asad and Abdul Bais. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Information Processing in Agriculture, 7(4):535–545, 2020.spa
dc.relation.referencesKaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 770–778, 2016.spa
dc.relation.referencesXu Ma, Xiangwu Deng, Long Qi, Yu Jiang, Hongwei Li, Yuwei Wang, and Xupo Xing. Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. PloS one, 14(4):e0215676, 2019.spa
dc.relation.referencesLukas Petrich, Georg Lohrmann, Matthias Neumann, Fabio Martin, Andreas Frey, Albert Stoll, and Volker Schmidt. Detection of colchicum autumnale in drone images, using a machine-learning approach. Precision Agriculture, 21(6):1291–1303, 2020.spa
dc.relation.referencesW Ramirez, P Achanccaray, LF Mendoza, and MAC Pacheco. Deep convolutional neural networks for weed detection in agricultural crops using optical aerial images. In 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), pages 133–137. IEEE, 2020.spa
dc.relation.referencesNavneet Dalal and Bill Triggs. Histograms of oriented gradients for human detection. In 2005 IEEE computer society conference on computer vision and pattern recognition (CVPR'05), volume 1, pages 886–893. IEEE, 2005.spa
dc.relation.referencesJoseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 779–788, 2016.spa
dc.relation.referencesKaiming He, Georgia Gkioxari, Piotr Dollár, and Ross Girshick. Mask r-cnn. In Proceedings of the IEEE international conference on computer vision, pages 2961–2969, 2017.spa
dc.relation.referencesKunlin Zou, Xin Chen, Fan Zhang, Hang Zhou, and Chunlong Zhang. A field weed density evaluation method based on uav imaging and modified u-net. Remote Sensing, 13(2):310, 2021.spa
dc.relation.referencesPetra Bosilj, Erchan Aptoula, Tom Duckett, and Grzegorz Cielniak. Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture. Journal of Field Robotics, 37(1):7–19, 2020.spa
dc.relation.referencesS Umamaheswari and Ashvini V Jain. Encoder–decoder architecture for crop-weed classification using pixel-wise labelling. In 2020 International Conference on Artificial Intelligence and Signal Processing (AISP), pages 1–6. IEEE, 2020.spa
dc.relation.referencesAichen Wang, Yifei Xu, Xinhua Wei, and Bingbo Cui. Semantic segmentation of crop and weed using an encoder-decoder network and image enhancement method under uncontrolled outdoor illumination. IEEE Access, 8:81724–81734, 2020.spa
dc.relation.referencesYuzhen Lu and Sierra Young. A survey of public datasets for computer vision tasks in precision agriculture. Computers and Electronics in Agriculture, 178:105760, 2020.spa
dc.relation.referencesZhangnan Wu, Yajun Chen, Bo Zhao, Xiaobing Kang, and Yuanyuan Ding. Review of weed detection methods based on computer vision. Sensors, 21(11):3647, 2021.spa
dc.relation.referencesMerima Kulin, Tarik Kazaz, Eli De Poorter, and Ingrid Moerman. A survey on machine learning-based performance improvement of wireless networks: Phy, mac and network layer. Electronics, 10(3):318, 2021.spa
dc.relation.referencesPanqu Wang, Pengfei Chen, Ye Yuan, Ding Liu, Zehua Huang, Xiaodi Hou, and Gar- rison Cottrell. Understanding convolution for semantic segmentation. In 2018 IEEE winter conference on applications of computer vision (WACV), pages 1451–1460. IEEE, 2018.spa
dc.relation.referencesPeng Liu, Hui Zhang, and Kie B Eom. Active deep learning for classification of hyperspectral images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10(2):712–724, 2016.spa
dc.relation.referencesNguyen Thanh Toan and Nguyen Thanh Tam. Early bushfire detection with 3d cnn from streams of satellite images.spa
dc.relation.referencesNikhil Jangamreddy. A survey on specialised hardware for machine learning. 2019.spa
dc.relation.referencesE-C Oerke. Crop losses to pests. The Journal of Agricultural Science, 144(1):31–43, 2006.spa
dc.relation.referencesCraig D Osteen and Jorge Fernandez-Cornejo. Herbicide use trends: a backgrounder. Choices, 31(4):1–7, 2016.spa
dc.relation.referencesJohn Peterson Myers, Michael N Antoniou, Bruce Blumberg, Lynn Carroll, Theo Colborn, Lorne G Everett, Michael Hansen, Philip J Landrigan, Bruce P Lanphear, Robin Mesnage, et al. Concerns over use of glyphosate-based herbicides and risks associated with exposures: a consensus statement. Environmental Health, 15(1):1–13, 2016.spa
dc.relation.referencesKevis-Kokitsi Maninis, Jordi Pont-Tuset, Pablo Arbeláez, and Luc Van Gool. Deep retinal image understanding. In International conference on medical image computing and computer-assisted intervention, pages 140–148. Springer, 2016.spa
dc.relation.referencesLiam Li and Ameet Talwalkar. Random search and reproducibility for neural architec- ture search. In Uncertainty in artificial intelligence, pages 367–377. PMLR, 2020.spa
dc.rights.accessrightsinfo:eu-repo/semantics/openAccessspa
dc.rights.licenseReconocimiento 4.0 Internacionalspa
dc.rights.urihttp://creativecommons.org/licenses/by/4.0/spa
dc.subject.ddc630 - Agricultura y tecnologías relacionadas::631 - Técnicas específicas, aparatos, equipos, materialesspa
dc.subject.ddc632 - Lesiones, enfermedades, plagas vegetalesspa
dc.subject.ddc000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadoresspa
dc.subject.lembControl de malezaspa
dc.subject.lembWeed control - researcheng
dc.subject.lembControl de maleza - investigacionesspa
dc.subject.lembWeed controleng
dc.subject.proposalMapeo de Malezaspa
dc.subject.proposalSegmentación Semánticaspa
dc.subject.proposalImágenes Multiespectralesspa
dc.subject.proposalAprendizaje Profundospa
dc.subject.proposalVehículo Aéreo No Tripuladospa
dc.subject.proposalClasificación Por Píxelesspa
dc.subject.proposalAprendizaje Automático En Producciónspa
dc.subject.proposalRedes Neuronales Convolucionalesspa
dc.titleAprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por dronesspa
dc.title.translatedDeep learning for weed mapping using multispectral drone-acquired imageryeng
dc.typeTrabajo de grado - Maestríaspa
dc.type.coarhttp://purl.org/coar/resource_type/c_bdccspa
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aaspa
dc.type.contentTextspa
dc.type.driverinfo:eu-repo/semantics/masterThesisspa
dc.type.redcolhttp://purl.org/redcol/resource_type/TMspa
dc.type.versioninfo:eu-repo/semantics/acceptedVersionspa
dcterms.audience.professionaldevelopmentEstudiantesspa
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2spa
oaire.awardtitleAprendizaje Profundo en Imágenes de Cultivos para la Detección Automática de Enfermedadesspa
oaire.fundernameColcienciasspa

Files

Original bundle

Name:
1090457208.2022.pdf
Size:
14.83 MB
Format:
Adobe Portable Document Format
Description:
Tesis de Maestría en Ingeniería - Ingeniería de Sistemas y Computación

License bundle

Name:
license.txt
Size:
5.74 KB
Format:
Item-specific license agreed upon to submission
Description: