Show simple item record

dc.rights.licenseReconocimiento 4.0 Internacional
dc.contributor.advisorGonzalez Osorio, Fabio Augusto
dc.contributor.authorArias Vanegas, Victor Alfonso
dc.date.accessioned2022-10-04T13:13:35Z
dc.date.available2022-10-04T13:13:35Z
dc.date.issued2022-10-02
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/82351
dc.descriptionilustraciones, fotografías a color, gráficas
dc.description.abstractLa maleza o mala hierba se define como una planta que crece de forma silvestre en un lugar indeseable para la actividad agrícola. Esto se debe a que compite por los recursos limitados disponibles en un sector previamente destinado y acondicionado para la producción de alimentos u otras actividades específicas, disminuyendo su rendimiento. Tradicionalmente los granjeros aplican la escarda o eliminación de malas hierbas con herramientas manuales, lo que hace de este un proceso lento y costoso debido a la gran cantidad de mano de obra necesaria. Con el fin de reducir el número de trabajadores en la labor, se usan agentes químicos de acción selectiva directamente sobre el cultivo para matar la planta invasora; sin embargo, en grandes extensiones de terreno es difícil conocer previamente la distribución espacial de la maleza, por lo que la aplicación del agente se hace de manera uniforme en toda la plantación, llevando a un mayor desperdicio del producto y, por ende, a un incremento en los costos. En este documento se propone una estrategia para la detección automática de la distribución espacial de la maleza en un terreno cultivado usando algoritmos de aprendizaje profundo (DL) sobre imágenes multiespectrales. Para probar el desempeño de la estrategia se utilizó una base de datos de imágenes recolectada por un vehículo aéreo no tripulado (VANT). Las bases de datos empleadas proporcionan las imágenes multiespectrales y su respectiva máscara; esta última representa la información semántica de cada uno de los píxeles de la imagen y se constituye a partir de tres colores, cada uno de ellos perteneciente a una clase de interés: el rojo representa la maleza, el verde representa el cultivo y el negro representa el fondo, es decir, todo aquello que no es vegetal en el mapa. Adicionalmente, el problema se abordó como un problema de segmentación semántica y la estrategia de solución fue un algoritmo de DL.
Al aplicar la solución a las imágenes se evidencia una mejora en las diferentes métricas usadas en la literatura para estas bases de datos, tales como el AUC y el F1-score; además, se evidencian excelentes resultados en las máscaras predichas para los datos de prueba. Por último, se analiza el aporte de los diferentes canales multiespectrales y de técnicas clásicas de preprocesamiento de imágenes a las métricas del modelo, además de la capacidad de este para generar buenas representaciones semánticas del terreno captado por el sensor. (Texto tomado de la fuente)
dc.description.abstractA weed is defined as a plant that grows wild in a place undesirable for agricultural crops. This is because it competes for the limited resources available in a sector previously destined and conditioned for food production or other specific activities, decreasing its yield. Traditionally, farmers apply weeding, or weed removal, with hand tools, making this a slow and costly process due to the large amount of labor required. To reduce the number of workers involved, selective-action chemical agents are applied directly on the crop to kill the invasive plant; however, over large areas of land it is difficult to know the spatial distribution of the weeds beforehand, so the agent is applied uniformly throughout the plantation, leading to greater waste of the product and therefore an increase in costs. This thesis presents a strategy for automatic detection of the spatial distribution of weeds in a cultivated field using deep learning (DL) algorithms on multispectral images. An image database collected by an unmanned aerial vehicle (UAV) was used to test the performance of the strategy. The databases used provide the multispectral images and their respective masks; each mask represents the semantic information of every pixel of the image using three colors, each belonging to a class of interest: red represents the weeds, green represents the crop, and black represents the background, i.e., everything that is not vegetation on the map. Additionally, the problem was approached as a semantic segmentation problem and the solution strategy was a DL algorithm. Applying the solution to the images yields an improvement in the different metrics used in the literature for these databases, such as AUC and F1-score, as well as excellent results in the predicted masks for the test data.
Finally, the contribution of the different multispectral channels and classical image preprocessing techniques to the model metrics is analyzed, as well as the model’s ability to generate good semantic representations of the terrain captured by the sensor.
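The abstract above describes ground-truth masks that encode per-pixel classes with three colors (red = weed, green = crop, black = background) and evaluation via pixel-wise metrics such as F1-score. As a minimal illustrative sketch only (not the thesis code; the exact RGB values for each class are an assumption), decoding such a mask and scoring one class could look like:

```python
# Illustrative sketch, not the author's implementation.
# Assumed color coding (from the abstract): black = background,
# green = crop, red = weed. Exact RGB triples are hypothetical.
COLOR_TO_CLASS = {
    (0, 0, 0): 0,      # black  -> background
    (0, 255, 0): 1,    # green  -> crop
    (255, 0, 0): 2,    # red    -> weed
}

def decode_mask(mask):
    """Map an H x W grid of RGB triples to integer class labels."""
    return [[COLOR_TO_CLASS[tuple(px)] for px in row] for row in mask]

def f1_per_class(y_true, y_pred, cls):
    """Pixel-wise F1 score for one class over two label grids."""
    t = [p for row in y_true for p in row]  # flatten ground truth
    p = [q for row in y_pred for q in row]  # flatten prediction
    tp = sum(1 for a, b in zip(t, p) if a == cls and b == cls)
    fp = sum(1 for a, b in zip(t, p) if a != cls and b == cls)
    fn = sum(1 for a, b in zip(t, p) if a == cls and b != cls)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

if __name__ == "__main__":
    truth = [[(255, 0, 0), (0, 255, 0)], [(0, 0, 0), (255, 0, 0)]]
    pred = [[(255, 0, 0), (0, 255, 0)], [(255, 0, 0), (255, 0, 0)]]
    yt, yp = decode_mask(truth), decode_mask(pred)
    print(f1_per_class(yt, yp, 2))  # F1 for the weed class
```

In practice a segmentation model outputs one label per pixel directly, and libraries such as scikit-learn provide equivalent F1 computations; this sketch only makes the color-to-class mapping described in the abstract concrete.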
dc.description.sponsorshipColciencias
dc.format.extentxii, 45 páginas
dc.format.mimetypeapplication/pdf
dc.language.isospa
dc.publisherUniversidad Nacional de Colombia
dc.rights.urihttp://creativecommons.org/licenses/by/4.0/
dc.subject.ddc630 - Agricultura y tecnologías relacionadas::631 - Técnicas específicas, aparatos, equipos, materiales
dc.subject.ddc632 - Lesiones, enfermedades, plagas vegetales
dc.subject.ddc000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
dc.titleAprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
dc.typeTrabajo de grado - Maestría
dc.type.driverinfo:eu-repo/semantics/masterThesis
dc.type.versioninfo:eu-repo/semantics/acceptedVersion
dc.publisher.programBogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación
dc.contributor.researchgroupMachine Learning Perception and Discovery Lab (MindLab)
dc.description.degreelevelMaestría
dc.description.degreenameMagíster en Ingeniería - Ingeniería de Sistemas y Computación
dc.description.researchareaProcesamiento digital de imágenes.
dc.publisher.facultyFacultad de Ingeniería
dc.publisher.placeBogotá, Colombia
dc.publisher.branchUniversidad Nacional de Colombia - Sede Bogotá
dc.relation.indexedBireme
dc.relation.indexedRedCol
dc.relation.referencesStephen O Duke. Perspectives on transgenic, herbicide-resistant crops in the United States almost 20 years after introduction. Pest Management Science, 71(5):652–657, 2015.
dc.relation.referencesAlexa Varah, Kwadjo Ahodo, Shaun R Coutts, Helen L Hicks, David Comont, Laura Crook, Richard Hull, Paul Neve, Dylan Z Childs, Robert P Freckleton, et al. The costs of human-induced evolution in an agricultural system. Nature sustainability, 3(1):63–71, 2020.
dc.relation.referencesAlessandro dos Santos Ferreira, Daniel Matte Freitas, Gercina Gonçalves da Silva, Hemerson Pistori, and Marcelo Theophilo Folhes. Weed detection in soybean crops using ConvNets. Computers and Electronics in Agriculture, 143:314–324, 2017.
dc.relation.referencesShirley A Briggs. Basic guide to pesticides: their characteristics and hazards. CRC Press, 2018.
dc.relation.referencesIsabelle Schuster, Henning Nordmeyer, and Thomas Rath. Comparison of vision-based and manual weed mapping in sugar beet. Biosystems engineering, 98(1):17–25, 2007.
dc.relation.referencesDavid Pimentel, Herbert Acquay, Michael Biltonen, P Rice, M Silva, J Nelson, V Lipner, S Giordano, A Horowitz, and M D’amore. Environmental and economic costs of pesticide use. BioScience, 42(10):750–760, 1992.
dc.relation.referencesK Neil Harker and John T O’Donovan. Recent weed control, weed management, and integrated weed management. Weed Technology, 27(1):1–11, 2013.
dc.relation.referencesMulham Fawakherji, Ali Youssef, Domenico D Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weed classification using pixel-wise segmentation on ground and aerial images. Int. J. Robot. Comput, 2(1):39–57, 2020.
dc.relation.referencesDavid R Shaw. Remote sensing and site-specific weed management. Frontiers in Ecology and the Environment, 3(10):526–532, 2005.
dc.relation.referencesColin Birch, Ian Cooper, Gurjeet Gill, Stephen Adkins, and Madan Gupta. Weed management in rainfed agricultural systems. In Rainfed Farming Systems, pages 215–232. Springer, 2011.
dc.relation.referencesPhilipp Lottes, Jens Behley, Nived Chebrolu, Andres Milioto, and Cyrill Stachniss. Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming. Journal of Field Robotics, 37(1):20–34, 2020.
dc.relation.referencesInkyu Sa, Marija Popović, Raghav Khanna, Zetao Chen, Philipp Lottes, Frank Liebisch, Juan Nieto, Cyrill Stachniss, Achim Walter, and Roland Siegwart. WeedMap: a large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sensing, 10(9):1423, 2018.
dc.relation.referencesJorge Torres-Sánchez, José Manuel Peña, Ana Isabel de Castro, and Francisca López-Granados. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Computers and Electronics in Agriculture, 103:104–113, 2014.
dc.relation.referencesChang-chun Li, Guang-sheng Zhang, Tian-jie Lei, and A-du Gong. Quick image-processing method of UAV without control points data in earthquake disaster area. Transactions of Nonferrous Metals Society of China, 21:s523–s528, 2011.
dc.relation.referencesAndreas Kamilaris and Francesc X Prenafeta-Boldú. Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147:70–90, 2018.
dc.relation.referencesKonstantinos G Liakos, Patrizia Busato, Dimitrios Moshou, Simon Pearson, and Dionysis Bochtis. Machine learning in agriculture: A review. Sensors, 18(8):2674, 2018.
dc.relation.referencesDimosthenis C Tsouros, Stamatia Bibi, and Panagiotis G Sarigiannidis. A review on UAV-based applications for precision agriculture. Information, 10(11):349, 2019.
dc.relation.referencesHuasheng Huang, Yubin Lan, Aqing Yang, Yali Zhang, Sheng Wen, and Jizhong Deng. Deep learning versus object-based image analysis (OBIA) in weed mapping of UAV imagery. International Journal of Remote Sensing, 41(9):3446–3479, 2020.
dc.relation.referencesInkyu Sa, Zetao Chen, Marija Popović, Raghav Khanna, Frank Liebisch, Juan Nieto, and Roland Siegwart. WeedNet: dense semantic weed classification using multispectral images and MAV for smart farming. IEEE Robotics and Automation Letters, 3(1):588–595, 2017.
dc.relation.referencesAlwaseela Abdalla, Haiyan Cen, Liang Wan, Reem Rashid, Haiyong Weng, Weijun Zhou, and Yong He. Fine-tuning convolutional neural network with transfer learning for semantic segmentation of ground-level oilseed rape images in a field with high weed pressure. Computers and Electronics in Agriculture, 167:105091, 2019.
dc.relation.referencesKavir Osorio, Andrés Puerto, Cesar Pedraza, David Jamaica, and Leonardo Rodríguez. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering, 2(3):471–488, 2020.
dc.relation.referencesSigurbjörn Jónsson. RGB and multispectral UAV image classification of agricultural fields using a machine learning algorithm. Student thesis series INES, 2019.
dc.relation.referencesASM Mahmudul Hasan, Ferdous Sohel, Dean Diepeveen, Hamid Laga, and Michael GK Jones. A survey of deep learning techniques for weed detection from images. Computers and Electronics in Agriculture, 184:106067, 2021.
dc.relation.referencesAurélien Géron. Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O’Reilly Media, Inc., 2019.
dc.relation.referencesCalvin Hung, Zhe Xu, and Salah Sukkarieh. Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a UAV. Remote Sensing, 6(12):12037–12054, 2014.
dc.relation.referencesYann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, 2015.
dc.relation.referencesAston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola. Dive into deep learning. arXiv preprint arXiv:2106.11342, 2021.
dc.relation.referencesHuasheng Huang, Jizhong Deng, Yubin Lan, Aqing Yang, Xiaoling Deng, and Lei Zhang. A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery. PLoS ONE, 13(4):e0196302, 2018.
dc.relation.referencesJonathan Long, Evan Shelhamer, and Trevor Darrell. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 3431–3440, 2015.
dc.relation.referencesSebastian Haug and Jörn Ostermann. A crop/weed field image dataset for the evaluation of computer vision based precision agriculture tasks. In Lourdes Agapito, Michael M. Bronstein, and Carsten Rother, editors, Computer Vision - ECCV 2014 Workshops, pages 105–116, Cham, 2015. Springer International Publishing.
dc.relation.referencesSebastian Haug, Andreas Michaels, Peter Biber, and Jörn Ostermann. Plant classification system for crop/weed discrimination without segmentation. In IEEE Winter Conference on Applications of Computer Vision, pages 1142–1149. IEEE, 2014.
dc.relation.referencesJosé Manuel Peña, Jorge Torres-Sánchez, Ana Isabel de Castro, Maggi Kelly, and Francisca López-Granados. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS ONE, 8(10), 2013.
dc.relation.referencesMaría Pérez-Ortiz, JM Peña, Pedro Antonio Gutiérrez, Jorge Torres-Sánchez, César Hervás-Martínez, and Francisca López-Granados. A semi-supervised system for weed mapping in sunflower crops using unmanned aerial vehicles and a crop row detection method. Applied Soft Computing, 37:533–544, 2015.
dc.relation.referencesThomas K Alexandridis, Afroditi Alexandra Tamouridou, Xanthoula Eirini Pantazi, Anastasia L Lagopodi, Javid Kashefi, Georgios Ovakoglou, Vassilios Polychronos, and Dimitrios Moshou. Novelty detection classifiers in weed mapping: Silybum marianum detection on uav multispectral images. Sensors, 17(9):2007, 2017.
dc.relation.referencesPhilipp Lottes, Raghav Khanna, Johannes Pfeifer, Roland Siegwart, and Cyrill Stachniss. UAV-based crop and weed classification for smart farming. In 2017 IEEE International Conference on Robotics and Automation (ICRA), pages 3024–3031. IEEE, 2017.
dc.relation.referencesAnders Krogh Mortensen, Mads Dyrmann, Henrik Karstoft, R Nyholm Jørgensen, René Gislum, et al. Semantic segmentation of mixed crops using deep convolutional neural network. In CIGR-AgEng Conference, 26-29 June 2016, Aarhus, Denmark. Abstracts and Full papers, pages 1–6. Organising Committee, CIGR 2016, 2016.
dc.relation.referencesM. Dyrmann, R. N. Jørgensen, and H. S. Midtiby. RoboWeedSupport - detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. Advances in Animal Biosciences, 8(2):842–847, 2017.
dc.relation.referencesVijay Badrinarayanan, Alex Kendall, and Roberto Cipolla. SegNet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(12):2481–2495, 2017.
dc.relation.referencesMaurilio Di Cicco, Ciro Potena, Giorgio Grisetti, and Alberto Pretto. Automatic model based dataset generation for fast and accurate crop and weeds detection. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 5188–5195. IEEE, 2017.
dc.relation.referencesHuasheng Huang, Yubin Lan, Jizhong Deng, Aqing Yang, Xiaoling Deng, Lei Zhang, and Sheng Wen. A semantic labeling approach for accurate weed mapping of high resolution UAV imagery. Sensors, 18(7):2113, 2018.
dc.relation.referencesSøren Skovsen, Mads Dyrmann, Anders K Mortensen, Morten S Laursen, René Gislum, Jørgen Eriksen, Sadaf Farkhani, Henrik Karstoft, and Rasmus N Jørgensen. The GrassClover image dataset for semantic and hierarchical species understanding in agriculture. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 2019.
dc.relation.referencesYannik Rist, Iurii Shendryk, Foivos Diakogiannis, and Shaun Levick. Weed mapping using very high resolution satellite imagery and fully convolutional neural network. In IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, pages 9784–9787. IEEE, 2019.
dc.relation.referencesFoivos I Diakogiannis, François Waldner, Peter Caccetta, and Chen Wu. ResUNet-a: A deep learning framework for semantic segmentation of remotely sensed data. ISPRS Journal of Photogrammetry and Remote Sensing, 162:94–114, 2020.
dc.relation.referencesShyam Prasad Adhikari, Heechan Yang, and Hyongsuk Kim. Learning semantic graphics using convolutional encoder–decoder network for autonomous weeding in paddy. Frontiers in Plant Science, 10:1404, 2019.
dc.relation.referencesOlaf Ronneberger, Philipp Fischer, and Thomas Brox. U-Net: Convolutional networks for biomedical image segmentation. In International Conference on Medical Image Computing and Computer-Assisted Intervention, pages 234–241. Springer, 2015.
dc.relation.referencesLiang-Chieh Chen, George Papandreou, Florian Schroff, and Hartwig Adam. Rethinking atrous convolution for semantic image segmentation. arXiv preprint arXiv:1706.05587, 2017.
dc.relation.referencesAnderson Brilhador, Matheus Gutoski, Leandro Takeshi Hattori, Andrei de Souza Inácio, André Eugênio Lazzaretti, and Heitor Silvério Lopes. Classification of weeds and crops at the pixel-level using convolutional neural networks and data augmentation. In 2019 IEEE Latin American Conference on Computational Intelligence (LA-CCI), pages 1–6. IEEE, 2019.
dc.relation.referencesMulham Fawakherji, Ali Youssef, Domenico Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weeds classification for precision agriculture using context-independent pixel-wise segmentation. In 2019 Third IEEE International Conference on Robotic Computing (IRC), pages 146–152. IEEE, 2019.
dc.relation.referencesKaren Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.
dc.relation.referencesMuhammad Hamza Asad and Abdul Bais. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Information Processing in Agriculture, 7(4):535–545, 2020.
dc.relation.referencesKaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 770–778, 2016.
dc.relation.referencesXu Ma, Xiangwu Deng, Long Qi, Yu Jiang, Hongwei Li, Yuwei Wang, and Xupo Xing. Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. PLoS ONE, 14(4):e0215676, 2019.
dc.relation.referencesLukas Petrich, Georg Lohrmann, Matthias Neumann, Fabio Martin, Andreas Frey, Albert Stoll, and Volker Schmidt. Detection of Colchicum autumnale in drone images, using a machine-learning approach. Precision Agriculture, 21(6):1291–1303, 2020.
dc.relation.referencesW Ramirez, P Achanccaray, LF Mendoza, and MAC Pacheco. Deep convolutional neural networks for weed detection in agricultural crops using optical aerial images. In 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), pages 133–137. IEEE, 2020.
dc.relation.referencesNavneet Dalal and Bill Triggs. Histograms of oriented gradients for human detection. In 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), volume 1, pages 886–893. IEEE, 2005.
dc.relation.referencesJoseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 779–788, 2016.
dc.relation.referencesKaiming He, Georgia Gkioxari, Piotr Dollár, and Ross Girshick. Mask R-CNN. In Proceedings of the IEEE International Conference on Computer Vision, pages 2961–2969, 2017.
dc.relation.referencesKunlin Zou, Xin Chen, Fan Zhang, Hang Zhou, and Chunlong Zhang. A field weed density evaluation method based on UAV imaging and modified U-Net. Remote Sensing, 13(2):310, 2021.
dc.relation.referencesPetra Bosilj, Erchan Aptoula, Tom Duckett, and Grzegorz Cielniak. Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture. Journal of Field Robotics, 37(1):7–19, 2020.
dc.relation.referencesS Umamaheswari and Ashvini V Jain. Encoder–decoder architecture for crop-weed classification using pixel-wise labelling. In 2020 International Conference on Artificial Intelligence and Signal Processing (AISP), pages 1–6. IEEE, 2020.
dc.relation.referencesAichen Wang, Yifei Xu, Xinhua Wei, and Bingbo Cui. Semantic segmentation of crop and weed using an encoder-decoder network and image enhancement method under uncontrolled outdoor illumination. IEEE Access, 8:81724–81734, 2020.
dc.relation.referencesYuzhen Lu and Sierra Young. A survey of public datasets for computer vision tasks in precision agriculture. Computers and Electronics in Agriculture, 178:105760, 2020.
dc.relation.referencesZhangnan Wu, Yajun Chen, Bo Zhao, Xiaobing Kang, and Yuanyuan Ding. Review of weed detection methods based on computer vision. Sensors, 21(11):3647, 2021.
dc.relation.referencesMerima Kulin, Tarik Kazaz, Eli De Poorter, and Ingrid Moerman. A survey on machine learning-based performance improvement of wireless networks: PHY, MAC and network layer. Electronics, 10(3):318, 2021.
dc.relation.referencesPanqu Wang, Pengfei Chen, Ye Yuan, Ding Liu, Zehua Huang, Xiaodi Hou, and Garrison Cottrell. Understanding convolution for semantic segmentation. In 2018 IEEE Winter Conference on Applications of Computer Vision (WACV), pages 1451–1460. IEEE, 2018.
dc.relation.referencesPeng Liu, Hui Zhang, and Kie B Eom. Active deep learning for classification of hyperspectral images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10(2):712–724, 2016.
dc.relation.referencesNguyen Thanh Toan and Nguyen Thanh Tam. Early bushfire detection with 3D CNN from streams of satellite images.
dc.relation.referencesNikhil Jangamreddy. A survey on specialised hardware for machine learning. 2019.
dc.relation.referencesE-C Oerke. Crop losses to pests. The Journal of Agricultural Science, 144(1):31–43, 2006.
dc.relation.referencesCraig D Osteen and Jorge Fernandez-Cornejo. Herbicide use trends: a backgrounder. Choices, 31(4):1–7, 2016.
dc.relation.referencesJohn Peterson Myers, Michael N Antoniou, Bruce Blumberg, Lynn Carroll, Theo Colborn, Lorne G Everett, Michael Hansen, Philip J Landrigan, Bruce P Lanphear, Robin Mesnage, et al. Concerns over use of glyphosate-based herbicides and risks associated with exposures: a consensus statement. Environmental Health, 15(1):1–13, 2016.
dc.relation.referencesKevis-Kokitsi Maninis, Jordi Pont-Tuset, Pablo Arbeláez, and Luc Van Gool. Deep retinal image understanding. In International Conference on Medical Image Computing and Computer-Assisted Intervention, pages 140–148. Springer, 2016.
dc.relation.referencesLiam Li and Ameet Talwalkar. Random search and reproducibility for neural architec- ture search. In Uncertainty in artificial intelligence, pages 367–377. PMLR, 2020.
dc.rights.accessrightsinfo:eu-repo/semantics/openAccess
dc.subject.lembControl de maleza
dc.subject.lembWeed control - research
dc.subject.lembControl de maleza - investigaciones
dc.subject.lembWeed control
dc.subject.proposalMapeo de Maleza
dc.subject.proposalSegmentación Semántica
dc.subject.proposalImágenes Multiespectrales
dc.subject.proposalAprendizaje Profundo
dc.subject.proposalVehículo Aéreo No Tripulado
dc.subject.proposalClasificación Por Píxeles
dc.subject.proposalAprendizaje Automático En Producción
dc.subject.proposalRedes Neuronales Convolucionales
dc.title.translatedDeep learning for weed mapping using multispectral drone-acquired imagery
dc.type.coarhttp://purl.org/coar/resource_type/c_bdcc
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aa
dc.type.contentText
dc.type.redcolhttp://purl.org/redcol/resource_type/TM
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2
oaire.awardtitleAprendizaje Profundo en Imágenes de Cultivos para la Detección Automática de Enfermedades
oaire.fundernameColciencias
dcterms.audience.professionaldevelopmentEstudiantes

