Metodología para identificar y caracterizar áreas sembradas en papa en el departamento de Cundinamarca a partir de la integración de imágenes ópticas y de radar

dc.contributor.advisorMartinez Martinez, Luis Joel
dc.contributor.authorLuque Sanabria, Nadia Yurany
dc.contributor.cvlacLuque Sanabria, Nadia [https://scienti.minciencias.gov.co/cvlac/visualizador/generarCurriculoCv.do?cod_rh=0000840661]spa
dc.contributor.orcidLuque Sanabria, Nadia [https://orcid.org/0000-0002-4108-3231]spa
dc.coverage.countryColombia
dc.coverage.regionCundinamarca
dc.date.accessioned2023-09-26T15:43:08Z
dc.date.available2023-09-26T15:43:08Z
dc.date.issued2023-09
dc.descriptionilustraciones, diagramas, fotografías, mapas a color, planosspa
dc.description.abstractIdentificar áreas cultivadas en papa en Colombia permitiría programar mejor los periodos de siembra, disminuyendo la sobreoferta que reduce el precio a los productores. Las imágenes ópticas permiten identificar áreas de cultivo, pero la presencia de nubes es un factor limitante que se puede contrarrestar a través del método de fusión con imágenes radar, que complementa la información faltante. Con base en lo anterior, se desarrolló una metodología para identificar y caracterizar áreas sembradas en papa en Cundinamarca, integrando imágenes Sentinel 1 y 2. Como datos de referencia, se georreferenciaron lotes de papa del primer semestre de 2022 en Villapinzón y Lenguazaque. Se procesaron imágenes de Sentinel 1, Sentinel 2 y Sentinel 2 más índices espectrales de vegetación. Se realizó la fusión por análisis de componentes principales (ACP) y se construyó una imagen multitemporal con los seis primeros componentes de las imágenes seleccionadas. Se aplicaron algoritmos de bosques aleatorios (BA) y máquinas de vector de soporte (MVS). La evaluación de exactitud mejoró en la fusión con ACP, pues el algoritmo MVS aumentó 4.7% y BA 7.3% con respecto a la exactitud global hallada con la imagen multitemporal de radar. Con respecto a lo encontrado con las imágenes multitemporales ópticas, sin incluir e incluyendo índices espectrales, no se encontró diferencia en la exactitud global de BA, que fue del 96% para las tres imágenes. Para la clase papa, la métrica de exactitud F1 estuvo entre 89 y 92%. El área obtenida para el cultivo de papa en el primer semestre de 2022, con la imagen multitemporal por fusión de ACP y el algoritmo BA, fue de 13.202 ha. Este valor es coherente y cercano a la realidad si se compara con lo obtenido en 2021 para el área de estudio (27.593 ha/año; aproximadamente 13.797 ha/semestre). (Texto tomado de la fuente)spa
dc.description.abstractIdentifying potato-cultivated areas in Colombia would allow better planning of planting periods, reducing the oversupply that lowers the price paid to producers. Optical images make it possible to identify cropping areas, but cloud cover is a limiting factor that can be counteracted through fusion with radar images, which supply the missing information. Based on this, a methodology was developed to identify and characterize areas planted with potato in Cundinamarca by integrating Sentinel 1 and 2 images. As reference data, potato plots from the first semester of 2022 in Villapinzón and Lenguazaque were georeferenced. Sentinel 1 images, Sentinel 2 images, and Sentinel 2 images plus spectral vegetation indices were processed. Fusion was performed by principal component analysis (PCA), and a multitemporal image was built with the first six components of the selected images. Random forest (RF) and support vector machine (SVM) algorithms were then applied. Accuracy improved with the PCA fusion: the SVM algorithm gained 4.7% and RF 7.3% with respect to the overall accuracy obtained with the multitemporal radar image. For the multitemporal optical images, with and without spectral indices, no difference was found in the overall accuracy of RF, which was 96% for the three images. For the potato class, the F1 accuracy metric was between 89 and 92%. The area obtained for potato cultivation in the first semester of 2022, using the multitemporal image from PCA fusion and the RF algorithm, was 13,202 ha. This value is consistent and close to reality when compared with the figure obtained in 2021 for the study area (27,593 ha/year; approximately 13,797 ha/semester).eng
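The processing chain described in the abstract (PCA fusion of co-registered Sentinel-1/Sentinel-2 multitemporal stacks, classification with random forests and support vector machines, accuracy assessment with overall accuracy and the potato-class F1, and conversion of classified pixels to area) can be illustrated with a minimal Python/scikit-learn sketch. This is not the thesis' implementation (its references point to tools such as Google Earth Engine, ranger and e1071); the array names s1_stack, s2_stack and labels, the 70/30 split, the classifier settings, and the 10 m pixel size used for the area estimate are illustrative assumptions.

# Minimal sketch, assuming preprocessed, co-registered multitemporal stacks
# s1_stack and s2_stack of shape (bands, rows, cols) and a reference raster
# labels of shape (rows, cols) with 0 = unlabeled, 1 = potato, 2..k = other classes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

def to_pixels(stack):
    # Reshape (bands, rows, cols) -> (pixels, bands) so each pixel is a feature vector.
    bands, rows, cols = stack.shape
    return stack.reshape(bands, rows * cols).T

# Radar and optical time series side by side, fused by PCA keeping the first six components.
features = np.hstack([to_pixels(s1_stack), to_pixels(s2_stack)])
fused = PCA(n_components=6).fit_transform(features)

# Split the georeferenced reference pixels into training and validation sets.
y = labels.ravel()
mask = y > 0
X_train, X_test, y_train, y_test = train_test_split(
    fused[mask], y[mask], test_size=0.3, stratify=y[mask], random_state=42)

# Train both classifiers and report overall accuracy and F1 for the potato class (label 1).
classifiers = {
    "RF": RandomForestClassifier(n_estimators=500, random_state=42),
    "SVM": SVC(kernel="rbf", C=1.0),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    oa = accuracy_score(y_test, pred)
    f1_potato = f1_score(y_test, pred, labels=[1], average="macro")
    print(f"{name}: overall accuracy = {oa:.3f}, potato F1 = {f1_potato:.3f}")

# Classify the full fused scene with RF and convert the potato pixel count to hectares,
# assuming 10 m x 10 m pixels (0.01 ha per pixel).
potato_pixels = np.count_nonzero(classifiers["RF"].predict(fused) == 1)
print(f"Estimated potato area: {potato_pixels * 0.01:.0f} ha")

The sketch only mirrors the shape of the computation reported in the abstract (e.g., the 13,202 ha obtained with the PCA-fused image and RF), not its data or preprocessing.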
dc.description.degreelevelMaestríaspa
dc.description.researchareaGeoinformación para el uso sostenible de los recursos naturalesspa
dc.format.extentxv, 83 páginasspa
dc.format.mimetypeapplication/pdfspa
dc.identifier.instnameUniversidad Nacional de Colombiaspa
dc.identifier.reponameRepositorio Institucional Universidad Nacional de Colombiaspa
dc.identifier.repourlhttps://repositorio.unal.edu.co/spa
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/84734
dc.publisherUniversidad Nacional de Colombiaspa
dc.publisher.branchUniversidad Nacional de Colombia - Sede Bogotáspa
dc.publisher.facultyFacultad de Ciencias Agrariasspa
dc.publisher.placeBogotá, Colombiaspa
dc.publisher.programBogotá - Ciencias Agrarias - Maestría en Geomáticaspa
dc.relation.referencesAgronet. (2021). Estadísticas: Área, producción, rendimiento y participación municipal en el departamento por cultivo. [Términos de búsqueda: Departamento: Cundinamarca; cultivo: papa; municipios: Carmen de Carupa, Ubaté, Sutatausa, Tausa, Suesca, Chocontá, Cucunubá, Villapinzón y Lenguazaque; año: 2021]. Agronet, Minagricultura. Recuperado de https://www.agronet.gov.co/estadistica/Paginas/home.aspx?cod=4spa
dc.relation.referencesAdrian, J., Sagan, V., & Maimaitijiang, M. (2021). Sentinel SAR-optical fusion for crop type mapping using deep learning and Google Earth Engine. ISPRS Journal of Photogrammetry and Remote Sensing, 175(December 2020), 215–235. https://doi.org/10.1016/j.isprsjprs.2021.02.018spa
dc.relation.referencesAshourloo, D., Shahrabi, H. S., Azadbakht, M., Rad, A. M., Aghighi, H., & Radiom, S. (2020). A novel method for automatic potato mapping using time series of Sentinel-2 images. Computers and Electronics in Agriculture, 175(May), 105583. https://doi.org/10.1016/j.compag.2020.105583spa
dc.relation.referencesBargiel, D. (2017). A new method for crop classification combining time series of radar images and crop phenology information. Remote Sensing of Environment, 198, 369–383. https://doi.org/10.1016/j.rse.2017.06.022spa
dc.relation.referencesBasukala, A. K., Oldenburg, C., Schellberg, J., Sultanov, M., & Dubovyk, O. (2017). Towards improved land use mapping of irrigated croplands: performance assessment of different image classification algorithms and approaches. European Journal of Remote Sensing, 50(1), 187–201. https://doi.org/10.1080/22797254.2017.1308235spa
dc.relation.referencesBelgiu, M., & Stein, A. (2019). Spatiotemporal image fusion in remote sensing. Remote Sensing, 11(7). https://doi.org/10.3390/rs11070818spa
dc.relation.referencesBhandari, A. K., Kumar, A., & Singh, G. K. (2012). Feature Extraction using Normalized Difference Vegetation Index (NDVI): A Case Study of Jabalpur City. Procedia Technology, 6, 612–621. https://doi.org/10.1016/j.protcy.2012.10.074spa
dc.relation.referencesBoswell, D. (2002). Introduction to Support Vector Machines. University of California, San Diego. https://doi.org/10.1016/B978-044451378-6/50001-6spa
dc.relation.referencesBreiman, L. (2001). Random Forests. Machine Learning, 45, 5–32.spa
dc.relation.referencesBurbidge, R., & Buxton, B. (2001). An introduction to support vector machines for data mining. Keynote Papers, Young OR12, 2–14. http://www.cc.gatech.edu/fac/Charles.Isbell/classes/2008/cs7641_spring/handouts/yor12-introsvm.pdfspa
dc.relation.referencesCampbell, N. A. (1993). Towards more quantitative extraction of information from remotely sensed data. Advanced Remote Sensing, Conference Proceedings, Held in Sydney, Australia, 2, 29–40.spa
dc.relation.referencesCCB. (2008). Ubaté. Caracterización económica y empresarial. Cámara de Comercio de Bogotá, 50. https://bibliotecadigital.ccb.org.co/bitstream/handle/11520/2889/6233_caracteriz_empresarial_ubate.pdf?sequence=1spa
dc.relation.referencesCarletto, C., Gourlay, S., & Winters, P. (2015). From guesstimates to GPStimates: Land area measurement and implications for agricultural analysis. Journal of African Economies, 24(5), 593–628. https://doi.org/10.1093/jae/ejv011spa
dc.relation.referencesCCB. (2008). Caracterización económica y empresarial de la Provincia Almeidas. Cámara de Comercio de Bogotá, 50. https://bibliotecadigital.ccb.org.co/bitstream/handle/11520/2874/6217_caracteriz_empresarial_almeidas.pdf?sequence=1&isAllowed=yspa
dc.relation.referencesChang, N. B., Bai, K., Imen, S., Chen, C. F., & Gao, W. (2016). Multi-sensor satellite image fusion and networking for all-weather environmental monitoring. IEEE Systems Journal, 12, 1341–1357. DOI: 10.1109/JSYST.2016.2565900.spa
dc.relation.referencesChellappa, R., Rosenfeld, A., & Meyers, R. A. (2003). Image Processing. In Encyclopedia of Physical Science and Technology (Third Edition) (pp. 595–630). Academic Press. https://doi.org/https://doi.org/10.1016/B0-12-227410-5/00841-3spa
dc.relation.referencesChen, Y., Hou, J., Huang, C., Zhang, Y., & Li, X. (2021). Mapping maize area in heterogeneous agricultural landscape with multi-temporal sentinel-1 and sentinel-2 images based on random forest. Remote Sensing, 13(15), 1–22. https://doi.org/10.3390/rs13152988spa
dc.relation.referencesChuvieco, E. (2016). Fundamentals of satellite remote sensing. In Fundamentals of Satellite Remote Sensing. https://doi.org/10.1201/b18954spa
dc.relation.referencesCIP. (2017). Hechos y cifras sobre la papa. Lima, Perú: Centro Internacional de la Papa (CIP). 2 p.spa
dc.relation.referencesDaughtry, C. S. T., Walthall, C. L., Kim, M. S., Brown de Colstoun, E., & McMurtrey, J. E. (2000). Estimating Corn Leaf Chlorophyll Concentration from Leaf and Canopy Reflectance. Remote Sensing of Environment, 79(2), 229–239. https://doi.org/10.3184/174751911X556684spa
dc.relation.referencesDraghici, S. (2020). Machine learning techniques. In Statistics and Data Analysis for Microarrays Using R and Bioconductor. https://doi.org/10.1201/b11566-34spa
dc.relation.referencesDong, J., Dafang, Z., Yaohuan, H., Jinying, F. (2011). Survey of multispectral image fusion techniques in remote sensing applications. In: Zheng Y (ed) Image fusion and its applications. Alcorn State University, USA. DOI: 10.5772/10548.spa
dc.relation.referencesDu, P., Liu, S., Xia, J., Zhao, Y. (2013). Information fusion techniques for change detection from multi-temporal remote sensing images. Inf Fusion 14(1), 19–27. https://doi.org/10.1016/j.inffus.2012.05.003spa
dc.relation.referencesEbrahimy, H., Mirbagheri, B., Matkan, A. A., & Azadbakht, M. (2021). Per-pixel land cover accuracy prediction: A random forest-based method with limited reference sample data. ISPRS Journal of Photogrammetry and Remote Sensing, 172(November 2020), 17–27. https://doi.org/10.1016/j.isprsjprs.2020.11.024spa
dc.relation.referencesEhlers, M., Klonus, S., Astrand, PJ. (2008). Quality assessment for multi-sensor multi-date image fusion. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences ISPRS, 499–506. https://www.isprs.org/proceedings/XXXVII/congress/4_pdf/89.pdf.spa
dc.relation.referencesESA. (2012). Sentinel-1: ESA’s radar observatory mission for GMES operational services. In ESA Special Publication (Vol. 1, Issue 1322). https://sentinel.esa.int/documents/247904/349449/S1_SP-1322_1.pdfspa
dc.relation.referencesESA. (2015). Sentinel-2 User Handbook. ESA Special Publication (Issue 2, pp. 1–64).spa
dc.relation.referencesFEDEPAPA. (2019). Boletín regional y nacional, 2020. In Boletín del consumidor (Vol. 3, Issue 7).spa
dc.relation.referencesFerentinos, K. P. (2018). Deep learning models for plant disease detection and diagnosis. Computers and Electronics in Agriculture, 145(February), 311–318. https://doi.org/10.1016/j.compag.2018.01.009spa
dc.relation.referencesFilipponi, F. (2019). Sentinel-1 GRD Preprocessing Workflow. Proceedings, 18(1), 11. https://doi.org/10.3390/ecrs-3-06201spa
dc.relation.referencesFonseca, L., Namikawa, L., Castejon, E., Carvalho, L., Pinho, C., & Pagamisse, A. (2011). Image Fusion for Remote Sensing Applications. Image Fusion and Its Applications, June. https://doi.org/10.5772/22899spa
dc.relation.referencesFoody, G. M. (2002). Status of land cover classification accuracy assessment. Remote Sensing of Environment, 80, 185–201. https://doi.org/https://doi.org/10.1016/S0034-4257(01)00295-4spa
dc.relation.referencesFoody, G. M., Mathur, A., Sanchez-Hernandez, C., & Boyd, D. S. (2006). Training set size requirements for the classification of a specific class. Remote Sensing of Environment, 104(1), 1–14. https://doi.org/10.1016/j.rse.2006.03.004spa
dc.relation.referencesGoel, E., & Abhilasha, E. (2017). Random forest: A review. International Journal of Advanced Research in Computer Science and Software Engineering, 7(1), 251–257. https://doi.org/10.23956/ijarcsse/V7I1/01113spa
dc.relation.referencesGómez, C., White, J. C., & Wulder, M. A. (2016). Optical remotely sensed time series data for land cover classification: A review. ISPRS Journal of Photogrammetry and Remote Sensing, 116, 55–72. https://doi.org/10.1016/j.isprsjprs.2016.03.008spa
dc.relation.referencesGoodman, R., & Carrara, W. (2005). Synthetic Aperture Radar Algorithms. In Handbook of Image and Video Processing (Second Edi). Elsevier Inc. https://doi.org/10.1016/B978-012119792-6/50127-3spa
dc.relation.referencesGorelick, N., Hancher, M., Dixon, M., Ilyushchenko, S., Thau, D., & Moore, R. (2017). Google Earth Engine: Planetary-scale geospatial analysis for everyone. Remote Sensing of Environment, 202, 18–27. https://doi.org/10.1016/j.rse.2017.06.031spa
dc.relation.referencesGovindaraju, V., Arni, S. R., & Rao, C. R. (2023). Handbook of Statistics 48. Deep learning. http://deeplearning.net/.spa
dc.relation.referencesHe, Y., Dong, J., Liao, X., Sun, L., Wang, Z., You, N., Li, Z., & Fu, P. (2021). Examining rice distribution and cropping intensity in a mixed single- and double-cropping region in South China using all available Sentinel 1/2 images. International Journal of Applied Earth Observation and Geoinformation, 101, 102351. https://doi.org/10.1016/j.jag.2021.102351spa
dc.relation.referencesHuete, A. R. (2004). Remote sensing for environmental monitoring. In Environmental Monitoring and Characterization. Elsevier Inc. https://doi.org/10.1016/B978-0-12-064477-3.50013-8spa
dc.relation.referencesIenco, D., Interdonato, R., Gaetano, R., & Ho Tong Minh, D. (2019). Combining Sentinel-1 and Sentinel-2 Satellite Image Time Series for land cover mapping via a multi-source deep learning architecture. ISPRS Journal of Photogrammetry and Remote Sensing, 158(September), 11–22. https://doi.org/10.1016/j.isprsjprs.2019.09.016spa
dc.relation.referencesIshihara, K., Ogawa, T., & Haseyama, M. (2017). Helicobacter Pylori infection detection from gastric X-ray images based on feature fusion and decision fusion. Computers in Biology and Medicine, 84(September 2016), 69–78. https://doi.org/10.1016/j.compbiomed.2017.03.007spa
dc.relation.referencesJensen, J. R. (2015). Digital image processing: A remote sensing perspective.spa
dc.relation.referencesJohannsen, C. J., & Daughtry, C. S. T. (2009). Chapter 17: Surface reference data collection. In T. A. Warner, M. D. Nellis, & G. M. Foody (Eds.), The SAGE Handbook of Remote Sensing. SAGE Publications. https://doi.org/10.4135/9780857021052spa
dc.relation.referencesJoshi, N., Baumann, M., Ehammer, A., Fensholt, R., Grogan, K., Hostert, P., Jepsen, M. R., Kuemmerle, T., Meyfroidt, P., Mitchard, E. T. A., Reiche, J., Ryan, C. M., & Waske, B. (2016). A review of the application of optical and radar remote sensing data fusion to land use mapping and monitoring. Remote Sensing, 8(1), 1–23. https://doi.org/10.3390/rs8010070spa
dc.relation.referencesKarim, S., Tong, G., Li, J., Qadir, A., Farooq, U., & Yu, Y. (2023). Current advances and future perspectives of image fusion: A comprehensive review. Information Fusion, 90(July 2022), 185–217. https://doi.org/10.1016/j.inffus.2022.09.019spa
dc.relation.referencesKulkarni, S. C., & Rege, P. P. (2020). Pixel level fusion techniques for SAR and optical images: A review. Information Fusion, 59(November 2019), 13–29. https://doi.org/10.1016/j.inffus.2020.01.003spa
dc.relation.referencesKumar, U., Mukhopadhyay, C., Ramachandra. (2009). Fusion of multisensor data: review and comparative analysis. In: 2009 WRI Global Congress on Intelligent Systems. vol 2, 418–422. DOI:10.1109/GCIS.2009.457spa
dc.relation.referencesLeslie, C. R., Serbina, L. O., & Miller, H. M. (2017). Landsat and agriculture—Case studies on the uses and benefits of Landsat imagery in agricultural monitoring and production. Open-File Report, 27. https://pubs.er.usgs.gov/publication/ofr20171034spa
dc.relation.referencesLi, D., Song, Z., Quan, C., Xu, X., & Liu, C. (2021). Recent advances in image fusion technology in agriculture. Computers and Electronics in Agriculture, 191(November 2020), 106491. https://doi.org/10.1016/j.compag.2021.106491spa
dc.relation.referencesLi, J., 2022. spm: Spatial Predictive Modelling. R package version 1.2.2 (2022-05-06) https://search.r-project.org/CRAN/refmans/spm/html/00Index.htmlspa
dc.relation.referencesLiu, C. an, Chen, Z. xin, Shao, Y., Chen, J. song, Hasi, T., & Pan, H. zhu. (2019). Research advances of SAR remote sensing for agriculture applications: A review. Journal of Integrative Agriculture, 18(3), 506–525. https://doi.org/10.1016/S2095-3119(18)62016-7spa
dc.relation.referencesLiu, L., & Lei, B. (2018). Can SAR images and optical images transfer with each other? International Geoscience and Remote Sensing Symposium (IGARSS), 2018-July, 7019–7022. https://doi.org/10.1109/IGARSS.2018.8518921spa
dc.relation.referencesLozano-Tello, A., Fernández-Sellers, M., Quirós, E., Fragoso-Campón, L., García-Martín, A., Gutiérrez Gallego, J. A., Mateos, C., Trenado, R., & Muñoz, P. (2021). Crop identification by massive processing of multiannual satellite imagery for EU common agriculture policy subsidy control. European Journal of Remote Sensing, 54(1), 1–12. https://doi.org/10.1080/22797254.2020.1858723spa
dc.relation.referencesLuo, K., Lu, L., Xie, Y., Chen, F., Yin, F., & Li, Q. (2023). Crop type mapping in the central part of the North China Plain using Sentinel-2 time series and machine learning. Computers and Electronics in Agriculture, 205(July 2022), 107577. https://doi.org/10.1016/j.compag.2022.107577spa
dc.relation.referencesMADR. (2019). Estrategia de ordenamiento de la producción. Cadena productiva de la papa y su industria.spa
dc.relation.referencesMakode, P., & Khan, J. (2017). A review on multi-focus digital image pair fusion using multi-scale image wavelet decomposition, 3(1), 575–579.spa
dc.relation.referencesMansaray, L. R., Kabba, V. T. S., Zhang, L., & Bebeley, H. A. (2021). Optimal multi-temporal Sentinel-1A SAR imagery for paddy rice field discrimination; a recommendation for operational mapping initiatives. Remote Sensing Applications: Society and Environment, 22(April), 100533. https://doi.org/10.1016/j.rsase.2021.100533spa
dc.relation.referencesMarais Sicre, C., Fieuzal, R., & Baup, F. (2020). Contribution of multispectral (optical and radar) satellite images to the classification of agricultural surfaces. International Journal of Applied Earth Observation and Geoinformation, 84(September 2018), 101972. https://doi.org/10.1016/j.jag.2019.101972spa
dc.relation.referencesMcNairn, H., Champagne, C., Shang, J., Holmstrom, D., & Reichert, G. (2009). Integration of optical and Synthetic Aperture Radar (SAR) imagery for delivering operational annual crop inventories. ISPRS Journal of Photogrammetry and Remote Sensing, 64(5), 434–449. https://doi.org/10.1016/j.isprsjprs.2008.07.006spa
dc.relation.referencesMeyer, D., Dimitriadou, E., Hornik, K., Weingessel, A., & Leisch, F. (2023). Package ‘e1071.’spa
dc.relation.referencesMittal, M. (2015). Hybrid image fusion using curvelet and wavelet transform using PCA and SVM. Int J Sci Emerg Technol Latest Trends 22(1), 28–35. DOI: 10.5120/21607-4639.spa
dc.relation.referencesMohammadi, V., & Minaei, S. (2019). Artificial Intelligence in the Production Process. In Engineering Tools in the Beverage Industry. Elsevier Inc. https://doi.org/10.1016/b978-0-12-815258-4.00002-0spa
dc.relation.referencesMucsia, L., & Bui, D. H. (2023). Evaluating the performance of multi-temporal synthetic-aperture radar imagery in land-cover mapping using a forward stepwise selection approach. Remote Sensing Applications: Society and Environment. https://doi.org/10.1016/j.rsase.2023.100975spa
dc.relation.referencesNg, A. (2008). Part V Support Vector Machines. CS229 Lecture Notes. https://doi.org/10.1016/j.aca.2011.07.027spa
dc.relation.referencesPantazi, X. E., Moshou, D., & Bochtis, D. (2020). Artificial intelligence in agriculture. In Intelligent Data Mining and Fusion Systems in Agriculture. https://doi.org/10.1016/b978-0-12-814391-9.00002-9spa
dc.relation.referencesPanwar, SA., Malwadkar, S. (2015). A review: image fusion techniques for multisensor images. Int J Adv Res Electr, Electr Instrum Eng. 4(1), 406–410. Doi: 10.15662/ijareeie.2015.0401049.spa
dc.relation.referencesParamanandham, N., Rajendiran, K. (2018). Infrared and visible image fusion using discrete cosine transform and swarm intelligence for surveillance applications. Infrared Phys Technol 88, 13–22.spa
dc.relation.referencesPiella, G. (2003). A general framework for multiresolution image fusion: From pixels to regions. Information Fusion, 4(4), 259–280. https://doi.org/10.1016/S1566-2535(03)00046-0spa
dc.relation.referencesPohl, C., & Van Genderen, J. L. (1998). Review article Multisensor image fusion in remote sensing: Concepts, methods and applications. In International Journal of Remote Sensing (Vol. 19, Issue 5). https://doi.org/10.1080/014311698215748spa
dc.relation.referencesPott, L. P., Amado, T. J. C., Schwalbert, R. A., Corassa, G. M., & Ciampitti, I. A. (2021). Satellite-based data fusion crop type classification and mapping in Rio Grande do Sul, Brazil. ISPRS Journal of Photogrammetry and Remote Sensing, 176(April), 196–210. https://doi.org/10.1016/j.isprsjprs.2021.04.015spa
dc.relation.referencesQi, J., Marsett, R., Heilman, P., Biedenbender, S., Moran, S., Goodrich, D., & Weltz, M. (2002). RANGES improves satellite-based information and land cover assessments in Southwest United States. Eos, 83(51). https://doi.org/10.1029/2002EO000411spa
dc.relation.referencesQian, Y., Zhou, W., Yan, J., Li, W., & Han, L. (2015). Comparing machine learning classifiers for object-based land cover classification using very high resolution imagery. Remote Sensing, 7(1), 153–168. https://doi.org/10.3390/rs70100153spa
dc.relation.referencesRad, A. M., Ashourloo, D., Shahrabi, H. S., & Nematollahi, H. (2019). Developing an Automatic Phenology-Based Algorithm for Rice Detection Using Sentinel-2 Time-Series Data. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 12(5), 1471–1481. https://doi.org/10.1109/JSTARS.2019.2906684spa
dc.relation.referencesRajah, P., Odindi, J., & Mutanga, O. (2018). Feature level image fusion of optical imagery and Synthetic Aperture Radar (SAR) for invasive alien plant species detection and mapping. Remote Sensing Applications: Society and Environment, 10(November 2017), 198–208. https://doi.org/10.1016/j.rsase.2018.04.007spa
dc.relation.referencesRevathy, R., Setia, R., Jain, S., Das, S., Gupta, S., Pateriya, B. (2023). Classification of Potato in Indian Punjab Using Time-Series Sentinel-2 Images. In: Kumar, S., Setia, R., Singh, K. (eds) Artificial Intelligence and Machine Learning in Satellite Data Processing and Services. Lecture Notes in Electrical Engineering, vol 970. Springer, Singapore. https://doi.org/10.1007/978-981-19-7698-8_20spa
dc.relation.referencesRoy, K., Kar, S., & Das, R. N. (2015). Selected Statistical Methods in QSAR. In Understanding the Basics of QSAR for Applications in Pharmaceutical Sciences and Risk Assessment. https://doi.org/10.1016/b978-0-12-801505-6.00006-5spa
dc.relation.referencesSacks, W. J., Deryng, D., Foley, J. A., & Ramankutty, N. (2010). Crop planting dates: An analysis of global patterns. Global Ecology and Biogeography, 19(5), 607–620. https://doi.org/10.1111/j.1466-8238.2010.00551.xspa
dc.relation.referencesSan, B. T., & Süzen, M. L. (2010). Evaluation of different atmospheric correction algorithms for EO-1 Hyperion imagery. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Science, XXXVIII, 392–397.spa
dc.relation.referencesSchowengerdt, R. A. (2006). Chapter 8 - Image Registration and Fusion. Remote Sensing. https://doi.org/10.1016/B978-0-12-369407-2.50011-5spa
dc.relation.referencesSchumann, G. J. P., & Moller, D. K. (2015). Microwave remote sensing of flood inundation. Physics and Chemistry of the Earth, 83–84, 84–95. https://doi.org/10.1016/j.pce.2015.05.002spa
dc.relation.referencesSingha, C., Swain, K. C., & Jayasuriya, H. (2022). Growth and yield monitoring of potato crop using Sentinel-1 data through cloud computing. Arabian Journal of Geosciences, 15(19). https://doi.org/10.1007/s12517-022-10844-6spa
dc.relation.referencesSingh, Sanjay, & Tiwari, K. C. (2021). Exploring the optimal combination of image fusion and classification techniques. Remote Sensing Applications: Society and Environment, 24(September), 100642. https://doi.org/10.1016/j.rsase.2021.100642spa
dc.relation.referencesSingh, Simrandeep, Singh, H., Bueno, G., Deniz, O., Singh, S., Monga, H., Hrisheekesha, P. N., & Pedraza, A. (2023). A review of image fusion: Methods, applications and performance metrics. Digital Signal Processing: A Review Journal, 137, 104020. https://doi.org/10.1016/j.dsp.2023.104020spa
dc.relation.referencesSkakun, S., Wevers, J., Brockmann, C., Doxani, G., Aleksandrov, M., Batič, M., Frantz, D., Gascon, F., Gómez-Chova, L., Hagolle, O., López-Puigdollers, D., Louis, J., Lubej, M., Mateo-García, G., Osman, J., Peressutti, D., Pflug, B., Puc, J., Richter, R., … Žust, L. (2022). Cloud Mask Intercomparison eXercise (CMIX): An evaluation of cloud masking algorithms for Landsat 8 and Sentinel-2. Remote Sensing of Environment, 274(September 2021). https://doi.org/10.1016/j.rse.2022.112990spa
dc.relation.referencesSmall, D. (2011). Flattening gamma: Radiometric terrain correction for SAR imagery. IEEE Transactions on Geoscience and Remote Sensing, 49(8), 3081–3093. https://doi.org/10.1109/TGRS.2011.2120616spa
dc.relation.referencesSonobe, R., Yamaya, Y., Tani, H., Wang, X., Kobayashi, N., & Mochizuki, K. ichiro. (2017). Assessing the suitability of data from Sentinel-1A and 2A for crop classification. GIScience and Remote Sensing, 54(6), 918–938. https://doi.org/10.1080/15481603.2017.1351149spa
dc.relation.referencesStankiewicz, K. A. (2006). The efficiency of crop recognition on ENVISAT ASAR images in two growing seasons. IEEE Transactions on Geoscience and Remote Sensing, 44(4), 806–814.spa
dc.relation.referencesStehman, S. V., & Czaplewski, R. L. (1998). Design and analysis for thematic map accuracy assessment: An application of satellite imagery. Remote Sensing of Environment, 64, 331–344. https://doi.org/10.1016/S0034-4257(98)00010-8spa
dc.relation.referencesStehman, S. V. (2009). Sampling designs for accuracy assessment of land cover. International Journal of Remote Sensing, 30(20), 5243–5272. https://doi.org/10.1080/01431160903131000spa
dc.relation.referencesSteinhausen, M. J., Wagner, P. D., Narasimhan, B., & Waske, B. (2018). Combining Sentinel-1 and Sentinel-2 data for improved land use and land cover mapping of monsoon regions. International Journal of Applied Earth Observation and Geoinformation, 73(April), 595–604. https://doi.org/10.1016/j.jag.2018.08.011spa
dc.relation.referencesSujud, L., Jaafar, H., Haj Hassan, M. A., & Zurayk, R. (2021). Cannabis detection from optical and RADAR data fusion: A comparative analysis of the SMILE machine learning algorithms in Google Earth Engine. Remote Sensing Applications: Society and Environment, 24, 100639. https://doi.org/10.1016/j.rsase.2021.100639spa
dc.relation.referencesSun, C., Bian, Y., Zhou, T., & Pan, J. (2019). Using of multi-source and multi-temporal remote sensing data improves crop-type mapping in the subtropical agriculture region. Sensors (Switzerland), 19(10), 1–23. https://doi.org/10.3390/s19102401spa
dc.relation.referencesSun, Y., Li, L., Zheng, L., Hu, J., Li, W., Jiang, Y., & Yan, C. (2019). Image classification base on PCA of multi-view deep representation. Journal of Visual Communication and Image Representation, 62, 253–258. https://doi.org/10.1016/j.jvcir.2019.05.016spa
dc.relation.referencesSuthaharan, S. (2016). A Cognitive Random Forest: An Intra- and Intercognitive Computing for Big Data Classification Under Cune Condition. In Handbook of Statistics (1st ed., Vol. 35). Elsevier B.V. https://doi.org/10.1016/bs.host.2016.07.006spa
dc.relation.referencesTharwat, A. (2018). Classification assessment methods. Applied Computing and Informatics, 17(1), 168–192. https://doi.org/10.1016/j.aci.2018.08.003spa
dc.relation.referencesTufail, R., Ahmad, A., Javed, M. A., & Ahmad, S. R. (2021). A machine learning approach for accurate crop type mapping using combined SAR and optical time series data. Advances in Space Research, xxxx. https://doi.org/10.1016/j.asr.2021.09.019spa
dc.relation.referencesUPRA. (2022). Análisis situacional de la cadena productiva de la papa en Colombia.spa
dc.relation.referencesVan Tricht, K., Gobin, A., Gilliams, S., & Piccard, I. (2018). Synergistic use of radar sentinel-1 and optical sentinel-2 imagery for crop mapping: A case study for Belgium. Remote Sensing, 10(10), 1–22. https://doi.org/10.3390/rs10101642spa
dc.relation.referencesVapnik, V. (1998). The support vector method of function estimation. In Nonlinear Modeling: Advanced Black-Box Techniques (pp. 55–85). Springer. https://doi.org/10.1007/978-1-4615-5703-6spa
dc.relation.referencesVeerabhadraswamy, N., Devagiri, G. M., & Khaple, A. K. (2021). Fusion of complementary information of SAR and optical data for forest cover mapping using random forest algorithm. Current Science, 120(1), 193–199. https://doi.org/10.18520/cs/v120/i1/193-199spa
dc.relation.referencesVenables, W. N., Smith, D. M., & Core Team, R. (2013). An Introduction to R. Practical Graph Mining with R, 3, 27–52. https://doi.org/10.1201/b15352-7spa
dc.relation.referencesVijayaraj, V., Younan, N. H., & O’Hara, C. G. (2006). Concepts of image fusion in remote sensing applications. International Geoscience and Remote Sensing Symposium (IGARSS), 3781–3784. https://doi.org/10.1109/IGARSS.2006.973spa
dc.relation.referencesWang, G., Li, P., Li, Z., Liang, C., & Wang, H. (2022). Coastal subsidence detection and characterization caused by brine mining over the Yellow River Delta using time series InSAR and PCA. International Journal of Applied Earth Observation and Geoinformation, 114(August), 103077. https://doi.org/10.1016/j.jag.2022.103077spa
dc.relation.referencesWang, L., Ma, H., Li, J., Gao, Y., Fan, L., Yang, Z., Yang, Y., & Wang, C. (2022). An automated extraction of small- and middle-sized rice fields under complex terrain based on SAR time series: A case study of Chongqing. Computers and Electronics in Agriculture, 200(November 2021), 107232. https://doi.org/10.1016/j.compag.2022.107232spa
dc.relation.referencesWang, Q., Shen, Y., & Jin, J. (2008). Performance evaluation of image fusion techniques. In Image Fusion. Elsevier Ltd. https://doi.org/10.1016/b978-0-12-372529-5.00017-2spa
dc.relation.referencesWeiss, M., Jacob, F., & Duveiller, G. (2020). Remote sensing for agricultural applications: A meta-review. Remote Sensing of Environment, 236, 0–39. https://doi.org/10.1016/j.rse.2019.111402spa
dc.relation.referencesWright, M. N., & Ziegler, A. (2017). Ranger: A fast implementation of random forests for high dimensional data in C++ and R. Journal of Statistical Software, 77(1). https://doi.org/10.18637/jss.v077.i01spa
dc.relation.referencesXun, L., Zhang, J., Cao, D., Yang, S., & Yao, F. (2021). A novel cotton mapping index combining Sentinel-1 SAR and Sentinel-2 multispectral imagery. ISPRS Journal of Photogrammetry and Remote Sensing, 181(August), 148–166. https://doi.org/10.1016/j.isprsjprs.2021.08.021spa
dc.relation.referencesYee, L. C., & Wei, Y. C. (2012). Current Modeling Methods Used in QSAR/QSPR. Statistical Modelling of Molecular Descriptors in QSAR/QSPR, 2, 1–31. https://doi.org/10.1002/9783527645121.ch1spa
dc.relation.referencesYésou, H., Besnus, Y., Rolet, J., Pion, J. C., & Aing, A. (1993). Merging Seasat and SPOT imagery for the study of geological structures in a temperate agricultural region. Remote Sensing of Environment, 43(3), 265–279. https://doi.org/10.1016/0034-4257(93)90070-Espa
dc.relation.referencesZhang, J., Li, M., Feng, Y., & Yang, C. (2020). Robotic grasp detection based on image processing and random forest. Multimedia Tools and Applications, 79(3–4), 2427–2446. https://doi.org/10.1007/s11042-019-08302-9spa
dc.relation.referencesZhang, H., Kang, J., Xu, X., & Zhang, L. (2020). Accessing the temporal and spectral features in crop type mapping using multi-temporal Sentinel-2 imagery: A case study of Yi’an County, Heilongjiang province, China. Computers and Electronics in Agriculture, 176(February), 105618. https://doi.org/10.1016/j.compag.2020.105618spa
dc.relation.referencesZhao, J., Yan, H., & Huang, L. (2023). A joint method of spatial–spectral features and BP neural network for hyperspectral image classification. Egyptian Journal of Remote Sensing and Space Science, 26(1), 107–115. https://doi.org/10.1016/j.ejrs.2022.12.012spa
dc.relation.referencesZheng, B., Myint, S., Thenkabail, P., & Aggarwal, R. (2015). A support vector machine to identify irrigated crop types using time-series Landsat NDVI data. International Journal of Applied Earth Observation and Geoinformation, 34, 103–112. https://doi.org/10.1016/j.jag.2014.07.002spa
dc.relation.referencesZitová, B., & Flusser, J. (2003). Image registration methods: A survey. Image and Vision Computing, 21(11), 977–1000. https://doi.org/10.1016/S0262-8856(03)00137-9spa
dc.rights.accessrightsinfo:eu-repo/semantics/openAccessspa
dc.rights.licenseAtribución-NoComercial-CompartirIgual 4.0 Internacionalspa
dc.rights.urihttp://creativecommons.org/licenses/by-nc-sa/4.0/spa
dc.subject.ddc630 - Agricultura y tecnologías relacionadas::633 - Cultivos de campo y de plantaciónspa
dc.subject.ddc000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadoresspa
dc.subject.lembIndustria de la papaspa
dc.subject.lembPotato industryeng
dc.subject.lembAgriculturaspa
dc.subject.lembAgricultureeng
dc.subject.lembTecnología agrícolaspa
dc.subject.lembAgricultural technologyeng
dc.subject.proposalFusiónspa
dc.subject.proposalImagen ópticaspa
dc.subject.proposalImagen radarspa
dc.subject.proposalBosques aleatoriosspa
dc.subject.proposalMáquina de vector de soportespa
dc.subject.proposalFusioneng
dc.subject.proposalOptical imageeng
dc.subject.proposalRadar imageeng
dc.subject.proposalRandom forestseng
dc.subject.proposalSupport vector machineeng
dc.titleMetodología para identificar y caracterizar áreas sembradas en papa en el departamento de Cundinamarca a partir de la integración de imágenes ópticas y de radar
dc.title.translatedMethodology to identify and characterize potato-planted areas in the department of Cundinamarca based on the integration of optical and radar images
dc.typeTrabajo de grado - Maestríaspa
dc.type.coarhttp://purl.org/coar/resource_type/c_bdccspa
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aaspa
dc.type.contentDatasetspa
dc.type.contentImagespa
dc.type.contentTextspa
dc.type.driverinfo:eu-repo/semantics/masterThesisspa
dc.type.redcolhttp://purl.org/redcol/resource_type/TMspa
dc.type.versioninfo:eu-repo/semantics/acceptedVersionspa
dcterms.audience.professionaldevelopmentEstudiantesspa
dcterms.audience.professionaldevelopmentInvestigadoresspa
dcterms.audience.professionaldevelopmentMaestrosspa
dcterms.audience.professionaldevelopmentPúblico generalspa
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2spa

Archivos

Bloque original
Nombre: 53115304.2023.pdf
Tamaño: 5.15 MB
Formato: Adobe Portable Document Format
Descripción: Tesis de Maestría en Geomática
