Implementación de un modelo de generación de imágenes basado en principios de la física cuántica.
| dc.contributor.advisor | Gonzalez Osorio, Fabio Augusto | |
| dc.contributor.advisor | Useche Reyes, Diego Hernan | |
| dc.contributor.author | Mora López, Andrea Carolina | |
| dc.contributor.researchgroup | Mindlab | |
| dc.date.accessioned | 2026-02-04T12:52:13Z | |
| dc.date.available | 2026-02-04T12:52:13Z | |
| dc.date.issued | 2026 | |
| dc.description | Ilustraciones, diagramas, fotografías, gráficos | spa |
| dc.description.abstract | La computación cuántica presenta ventajas frente a la computación clásica, en particular en tareas de optimización y generación de datos. Sin embargo, las limitaciones del hardware actual dificultan la implementación práctica de modelos generativos cuánticos a gran escala. Esto ha impulsado el desarrollo de enfoques híbridos, simulaciones clásicas y modelos inspirados en principios cuánticos. En este trabajo se proponen tres modelos generativos inspirados en principios de la computación cuántica, pero implementados sobre arquitecturas clásicas. Todos ellos comparten un método de estimación de densidad llamado Kernel Density Matrix (KDM), compuesto por tres elementos: un kernel, un conjunto de prototipos y los pesos asociados a dichos prototipos. Los modelos desarrollados son: un autoencoder entrenado de manera independiente de la capa KDM (AEKDMS), un autoencoder y una KDM entrenados conjuntamente mediante una función de pérdida combinada (AEKDM-FPC) y un autoencoder con una KDM estocástica integrada como capa intermedia en el espacio latente (AEKDM-CIS). La validación experimental se llevó a cabo utilizando los conjuntos de datos MNIST Digits, Fashion-MNIST y CELEB-A, evaluando el desempeño mediante las métricas Fréchet Inception Distance (FID), divergencia de Kullback-Leibler (KL) y log-verosimilitud (LL). Los resultados muestran que el modelo AEKDMS alcanzó el mejor desempeño global. En MNIST Digits obtuvo un FID de 21.52, una KL de 4.12 y una LL de 654.84, superando a modelos base como el VAE (FID 75.43, KL 10.76, LL 582.61) y la GAN (FID 35.24, KL 15.68, LL 419.56). En Fashion-MNIST, el mismo modelo logró un FID de 53.11 y una LL de 674.63, mientras que el VAE y la GAN alcanzaron FID de 149.67 y 167.62, respectivamente. En el conjunto CELEB-A obtuvo un FID de 182.51, frente a los modelos de línea base, que alcanzaron valores de 216.47 y 211.83 para la GAN y el VAE, respectivamente. El modelo entrenado conjuntamente AEKDM-FPC se destacó en log-verosimilitud, alcanzando el mayor valor en el conjunto MNIST Digits (LL = 745.57), aunque sacrificó calidad visual (FID = 31.74). Por su parte, el modelo AEKDM-CIS presentó un desempeño inferior en MNIST Digits y Fashion-MNIST, con FID más altos (76.79 y 194.04, respectivamente) y log-verosimilitudes más bajas, lo que indica inestabilidad en el entrenamiento. Sin embargo, para el conjunto de mayor dimensión, CELEB-A, obtuvo un mejor desempeño en métricas como la log-verosimilitud (LL = 8213.04). En conjunto, los resultados confirman que los modelos inspirados en principios cuánticos, en particular el AEKDMS, superan a las líneas base clásicas tanto en calidad visual como en consistencia estadística, además de requerir arquitecturas más ligeras (61k–68k parámetros frente a los 2.9M de la GAN). (Texto tomado de la fuente) | spa |
| dc.description.abstract | Quantum computing offers advantages over classical computing, particularly in optimization and data generation tasks. However, the limitations of current hardware hinder the practical implementation of large-scale quantum generative models. This has driven the development of hybrid approaches, classical simulations, and models inspired by quantum principles. In this work, three generative models inspired by quantum computing principles but implemented on classical architectures are proposed. All of them share a density estimation method called Kernel Density Matrix (KDM), composed of three elements: a kernel, a set of prototypes, and the weights associated with these prototypes. The developed models are: an autoencoder trained independently of the KDM layer (AEKDMS), an autoencoder and a KDM trained jointly through a combined loss function (AEKDM-FPC), and an autoencoder with a stochastic KDM integrated as an intermediate layer in the latent space (AEKDM-CIS). The experimental validation was carried out using the MNIST Digits, Fashion-MNIST, and CELEB-A datasets, evaluating performance through the Fréchet Inception Distance (FID), Kullback–Leibler (KL) divergence, and log-likelihood (LL) metrics. The results show that the AEKDMS model achieved the best overall performance. On MNIST Digits it obtained a FID of 21.52, a KL of 4.12, and an LL of 654.84, surpassing baseline models such as the VAE (FID 75.43, KL 10.76, LL 582.61) and the GAN (FID 35.24, KL 15.68, LL 419.56). On Fashion-MNIST, the same model achieved a FID of 53.11 and an LL of 674.63, while the VAE and the GAN reached FIDs of 149.67 and 167.62, respectively. On the CELEB-A dataset it obtained a FID of 182.51, compared with the baseline values of 216.47 and 211.83 for the GAN and the VAE, respectively. The jointly trained AEKDM-FPC model stood out in log-likelihood, achieving the highest value on the MNIST Digits dataset (LL = 745.57), although it sacrificed visual quality (FID = 31.74). In turn, the AEKDM-CIS model showed inferior performance on MNIST Digits and Fashion-MNIST, with higher FIDs (76.79 and 194.04, respectively) and lower log-likelihoods, indicating training instability. However, on the higher-dimensional CELEB-A dataset it performed better on metrics such as log-likelihood (LL = 8213.04). Overall, the results confirm that the models inspired by quantum principles, particularly AEKDMS, outperform the classical baselines in both visual quality and statistical consistency, while also requiring lighter architectures (61k–68k parameters versus 2.9M for the GAN). | eng |
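The abstract describes the shared density estimator through three components: a kernel, a set of prototypes, and weights over those prototypes. The following is a minimal, hypothetical sketch (Python/NumPy) of a Gaussian-kernel estimator with that structure, simplified to a weighted mixture over latent prototypes; the class and variable names, the fixed bandwidth `sigma`, and the decoder step mentioned in the comments are illustrative assumptions, not the thesis's KDM implementation or its density-matrix formulation.

```python
# Illustrative sketch only (assumed names and shapes), approximating the
# "kernel + prototypes + weights" density estimator described in the abstract
# as a Gaussian mixture over learned latent prototypes.

import numpy as np


class GaussianKDM:
    """Density estimate p(z) = sum_i w_i * N(z; prototype_i, sigma^2 I)."""

    def __init__(self, prototypes: np.ndarray, weights: np.ndarray, sigma: float = 0.5):
        self.prototypes = prototypes            # (n_prototypes, latent_dim)
        self.weights = weights / weights.sum()  # normalized weights per prototype
        self.sigma = sigma                      # shared Gaussian-kernel bandwidth

    def log_likelihood(self, z: np.ndarray) -> np.ndarray:
        # Squared distance from each latent sample to each prototype.
        d2 = ((z[:, None, :] - self.prototypes[None, :, :]) ** 2).sum(-1)
        dim = self.prototypes.shape[1]
        log_kernel = -0.5 * d2 / self.sigma**2 - 0.5 * dim * np.log(2 * np.pi * self.sigma**2)
        # Weighted log-sum-exp over prototypes for numerical stability.
        m = log_kernel.max(axis=1, keepdims=True)
        return m.squeeze(1) + np.log((self.weights * np.exp(log_kernel - m)).sum(axis=1))

    def sample(self, n: int, rng: np.random.Generator) -> np.ndarray:
        # Choose prototypes according to their weights, then add kernel noise.
        idx = rng.choice(len(self.prototypes), size=n, p=self.weights)
        return self.prototypes[idx] + self.sigma * rng.normal(size=(n, self.prototypes.shape[1]))


# Usage: in an AEKDMS-style pipeline, an encoder would map images to latent
# codes, the KDM would model their density, and a trained decoder would turn
# kdm.sample(n, rng) back into images.
rng = np.random.default_rng(0)
kdm = GaussianKDM(prototypes=rng.normal(size=(16, 8)), weights=np.ones(16), sigma=0.3)
z_new = kdm.sample(4, rng)          # latent samples for a (hypothetical) decoder
ll = kdm.log_likelihood(z_new)      # per-sample log-likelihood under the KDM
```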
| dc.description.degreelevel | Maestría | |
| dc.description.degreename | Magíster en ingeniería de sistemas y computación | |
| dc.description.researcharea | Aprendizaje de Máquina | |
| dc.format.extent | ix, 54 páginas | |
| dc.format.mimetype | application/pdf | |
| dc.identifier.instname | Universidad Nacional de Colombia | spa |
| dc.identifier.reponame | Repositorio Institucional Universidad Nacional de Colombia | spa |
| dc.identifier.repourl | https://repositorio.unal.edu.co/ | spa |
| dc.identifier.uri | https://repositorio.unal.edu.co/handle/unal/89385 | |
| dc.language.iso | spa | |
| dc.publisher | Universidad Nacional de Colombia | |
| dc.publisher.branch | Universidad Nacional de Colombia - Sede Bogotá | |
| dc.publisher.faculty | Facultad de Ingeniería | |
| dc.publisher.place | Bogotá, Colombia | |
| dc.publisher.program | Bogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación | |
| dc.relation.references | Agliardi, G. ; Prati, E.: Optimal Tuning of Quantum Generative Adversarial Networks for Multivariate Distribution Loading. En: Quantum Reports 4 (2022), p. 75–105 | |
| dc.relation.references | Ajagekar, A. ; You, F.: Molecular design with automated quantum computing-based deep learning and optimization. En: npj Computational Materials 9 (2023) | |
| dc.relation.references | Alemohammad, Sina ; Humayun, Ahmed I. ; Agarwal, Shruti ; Collomosse, John ; Baraniuk, Richard. Self-Improving Diffusion Models with Synthetic Data. 2024 | |
| dc.relation.references | Anand, A. ; Romero, J. ; Degroote, M. ; Aspuru-Guzik, A.: Noise Robustness and Experimental Demonstration of a Quantum Generative Adversarial Network for Continuous Distributions. En: Advanced Quantum Technologies 4 (2021) | |
| dc.relation.references | Arrasmith, Andrew ; Cerezo, M. ; Czarnik, Piotr ; Cincio, Lukasz ; Coles, Patrick J.: Effect of barren plateaus on gradient-free optimization. En: Quantum 5 (2021), oct, p. 558 | |
| dc.relation.references | Assouel, A. ; Jacquier, A. ; Kondratyev, A.: A quantum generative adversarial network for distributions. En: Quantum Machine Intelligence 4 (2022) | |
| dc.relation.references | Assouel, Amine ; Jacquier, Antoine ; Kondratyev, Alexei. A Quantum Generative Adversarial Network for distributions. 2021 | |
| dc.relation.references | Bartkiewicz, Karol ; Tulewicz, Patrycja ; Roik, Jan ; Lemr, Karel: Synergic quantum generative machine learning. En: Scientific Reports 13 (2023), August, Nr. 1. – ISSN 2045–2322 | |
| dc.relation.references | Bengio, Yoshua ; Yao, Li ; Alain, Guillaume ; Vincent, Pascal. Generalized Denoising Auto-Encoders as Generative Models. 2013 | |
| dc.relation.references | Bigdeli, Siavash A. ; Lin, Geng ; Portenier, Tiziano ; Dunbar, L. A. ; Zwicker, Matthias: Learning Generative Models using Denoising Density Estimators. En: CoRR abs/2001.02728 (2020) | |
| dc.relation.references | Brooks, Michael: What’s next for quantum computing. En: MIT Technology Review (2023) | |
| dc.relation.references | Cerezo, M. ; Arrasmith, Andrew ; Babbush, Ryan ; Benjamin, Simon C. ; Endo, Suguru ; Fujii, Keisuke ; McClean, Jarrod R. ; Mitarai, Kosuke ; Yuan, Xiao ; Cincio, Lukasz ; Coles, Patrick J.: Variational quantum algorithms. En: Nature Reviews Physics 3 (2021), August, Nr. 9, p. 625–644. – ISSN 2522–5820 | |
| dc.relation.references | Chakrabarti, S. ; Huang, Y. ; Li, T. ; Feizi, S. ; Wu, X.: Quantum Wasserstein GANs. En: Quantum Wasserstein GANs Vol. 32, 2019 | |
| dc.relation.references | Chaudhary, S. ; Huembeli, P. ; MacCormack, I. ; Patti, T.L. ; Kossaifi, J. ; Galda, A.: Towards a scalable discrete quantum generative adversarial neural network. En: Quantum Science and Technology 8 (2023) | |
| dc.relation.references | Gao, Xun ; Anschuetz, Eric R. ; Wang, Sheng-Tao ; Cirac, J. I. ; Lukin, Mikhail D.: Enhancing Generative Models via Quantum Correlations. En: Physical Review X 12 (2022), May, Nr. 2. – ISSN 2160–3308 | |
| dc.relation.references | Gircha, A.I. ; Boev, A.S. ; Avchaciov, K. ; Fedichev, P.O. ; Fedorov, A.K.: Hybrid quantum-classical machine learning for generative chemistry and drug design. En: Scientific Reports 13 (2023) | |
| dc.relation.references | González, Fabio A. ; Gallego, Alejandro ; Toledo-Cortés, Santiago ; Vargas-Calderón, Vladimir: Learning with density matrices and random features. En: Quantum Machine Intelligence 4 (2022), 12. – ISSN 25244914 | |
| dc.relation.references | González, Fabio A. ; Ramos-Pollán, Raúl ; Gallego-Mejia, Joseph A.: Kernel Density Matrices for Probabilistic Deep Learning. 2023 | |
| dc.relation.references | Goodfellow, Ian J. ; Pouget-Abadie, Jean ; Mirza, Mehdi ; Xu, Bing ; Warde-Farley, David ; Ozair, Sherjil ; Courville, Aaron ; Bengio, Yoshua. Generative Adversarial Networks. 2014 | |
| dc.relation.references | Griffiths, David J.: Introduction to Quantum Mechanics. 2nd. Upper Saddle River, NJ : Pearson Prentice Hall, 2005. – ISBN 978–0131118928 | |
| dc.relation.references | Heusel, Martin ; Ramsauer, Hubert ; Unterthiner, Thomas ; Nessler, Bernhard ; Hochreiter, Sepp. GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium. 2018 | |
| dc.relation.references | Jayasumana, Sadeep ; Ramalingam, Srikumar ; Veit, Andreas ; Glasner, Daniel ; Chakrabarti, Ayan ; Kumar, Sanjiv. Rethinking FID: Towards a Better Evaluation Metric for Image Generation. 2024 | |
| dc.relation.references | Kieferová, Mária ; Marrero, Carlos O. ; Wiebe, Nathan. Quantum Generative Training Using Rényi Divergences. 2021 | |
| dc.relation.references | Kingma, Diederik P. ; Welling, Max: An Introduction to Variational Autoencoders. En: Foundations and Trends® in Machine Learning 12 (2019), Nr. 4, p. 307–392 | |
| dc.relation.references | Kingma, Diederik P. ; Welling, Max. Auto-Encoding Variational Bayes. 2022 | |
| dc.relation.references | Lamb, Alex. A Brief Introduction to Generative Models. 2021 | |
| dc.relation.references | van Leeuwen, Steyn ; de Alba Ortíz, Alberto P. ; Dijkstra, Marjolein. A Boltzmann generator for the isobaric-isothermal ensemble. 2023 | |
| dc.relation.references | Liu, Weizhi ; Zhang, Yufei ; Deng, Zhiqiang [et al.]: A hybrid quantum-classical conditional generative adversarial network algorithm for human-centered paradigm in cloud. En: Journal of Wireless Communications and Networking 2021 (2021), Nr. 1, p. 37 | |
| dc.relation.references | Miranda, E.R. ; Shaji, H.: Generative Music with Partitioned Quantum Cellular Automata. En: Applied Sciences (Switzerland) 13 (2023) | |
| dc.relation.references | Miyahara, H. ; Roychowdhury, V.: Quantum advantage in variational Bayes inference. En: Proceedings of the National Academy of Sciences of the United States of America 120 (2023) | |
| dc.relation.references | Moussa, Charles ; Wang, Hao ; Araya-Polo, Mauricio ; Bäck, Thomas ; Dunjko, Vedran: Application of quantum-inspired generative models to small molecular datasets. En: 2023 IEEE International Conference on Quantum Computing and Engineering (QCE), IEEE, September 2023, p. 342–348 | |
| dc.relation.references | Ng, Joseph ; Abbott, Derek: Solid state quantum computers: a nanoscopic solution to the Moore’s law problem. En: Abbott, Derek (Ed.) ; Varadan, Vijay K. (Ed.) ; Boehringer, Karl F. (Ed.): Smart Electronics and MEMS II Vol. 4236 International Society for Optics and Photonics, SPIE, 2001, p. 89 – 98 | |
| dc.relation.references | Nielsen, Michael A. ; Chuang, Isaac L.: Quantum Computation and Quantum Information. 10th Anniversary Edition. Cambridge University Press, 2010 | |
| dc.relation.references | Paine, A.E. ; Elfving, V.E. ; Kyriienko, O.: Quantum Quantile Mechanics: Solving Stochastic Differential Equations for Generating Time-Series. En: Advanced Quantum Technologies (2023) | |
| dc.relation.references | Plesovskaya, Ekaterina ; Ivanov, Sergey: An Empirical Analysis of KDE-based Generative Models on Small Datasets. En: Procedia Computer Science 193 (2021), p. 442–452. – ISSN 1877–0509 | |
| dc.relation.references | Romero, J. ; Aspuru-Guzik, A.: Variational Quantum Generators: Generative Adversarial Quantum Machine Learning for Continuous Distributions. En: Advanced Quantum Technologies 4 (2021) | |
| dc.relation.references | Romero, Jonathan ; Aspuru-Guzik, Alan. Variational quantum generators: Generative adversarial quantum machine learning for continuous distributions. 2019 | |
| dc.relation.references | Gamble, S.: Quantum Computing: What It Is, Why We Want It, and How We’re Trying to Get It. En: Frontiers of Engineering: Reports on Leading-Edge Engineering from the 2018 Symposium. Washington (DC): National Academies Press (US), 2019 | |
| dc.relation.references | Salakhutdinov, Ruslan ; Hinton, Geoffrey E.: Deep Boltzmann Machines. En: Journal of Machine Learning Research - Proceedings Track 5 (2009), p. 448–455 | |
| dc.relation.references | Silverman, Bernard W.: Density Estimation for Statistics and Data Analysis. Chapman and Hall/CRC, 1998 (Chapman & Hall/CRC Monographs on Statistics & Applied Probability). – ISBN 9780412246203; 0412246201 | |
| dc.relation.references | Situ, H. ; He, Z. ; Wang, Y. ; Li, L. ; Zheng, S.: Quantum generative adversarial network for generating discrete distribution. En: Information Sciences 538 (2020), p. 193–208 | |
| dc.relation.references | Sleeman, J. ; Dorband, J. ; Halem, M.: A hybrid quantum enabled RBM advantage: Convolutional autoencoders for quantum image compression and generative learning. En: A hybrid quantum enabled RBM advantage: Convolutional autoencoders for quantum image compression and generative learning Vol. 11391, 2020. – ISBN 9781510635593 | |
| dc.relation.references | Stein, Samuel A. ; Baheri, Betis ; Chen, Daniel ; Mao, Ying ; Guan, Qiang ; Li, Ang ; Fang, Bo ; Xu, Shuai: QuGAN: A Quantum State Fidelity based Generative Adversarial Network. En: 2021 IEEE International Conference on Quantum Computing and Engineering (QCE), IEEE, October 2021, p. 71–81 | |
| dc.relation.references | Takida, Yuhta ; Imaizumi, Masaaki ; Shibuya, Takashi ; Lai, Chieh-Hsin ; Uesaka, Toshimitsu ; Murata, Naoki ; Mitsufuji, Yuki. SAN: Inducing Metrizability of GAN with Discriminative Normalized Linear Layer. 2024 | |
| dc.relation.references | Tan, Jing ; Yin, Sixing ; Zhao, Shuo: Generative Model with Kernel Density Estimation. En: Generative Model with Kernel Density Estimation, 2018, p. 304–308 | |
| dc.relation.references | Tian, Jinkai ; Sun, Xiaoyu ; Du, Yuxuan ; Zhao, Shanshan ; Liu, Qing ; Zhang, Kaining ; Yi, Wei ; Huang, Wanrong ; Wang, Chaoyue ; Wu, Xingyao ; Hsieh, Min-Hsiu ; Liu, Tongliang ; Yang, Wenjing ; Tao, Dacheng: Recent Advances for Quantum Neural Networks in Generative Learning. En: IEEE Transactions on Pattern Analysis and Machine Intelligence (2022), 6 | |
| dc.relation.references | Tran, Ba-Hien ; Rossi, Simone ; Milios, Dimitrios ; Michiardi, Pietro ; Bonilla, Edwin V. ; Filippone, Maurizio. Model Selection for Bayesian Autoencoders. 2021 | |
| dc.relation.references | Tychola, K.A. ; Kalampokas, T. ; Papakostas, G.A.: Quantum Machine Learning—An Overview. En: Electronics (Switzerland) 12 (2023) | |
| dc.relation.references | Vagenas, Elias C. ; Farag Ali, Ahmed ; Alshal, Hassan: GUP and the no-cloning theorem. En: The European Physical Journal C 79 (2019), Nr. 3, p. 276 | |
| dc.relation.references | Wall, Michael L. ; Abernathy, Matthew R. ; Quiroz, Gregory: Generative machine learning with tensor networks: Benchmarks on near-term quantum computers. En: Physical Review Research 3 (2021), April, Nr. 2. – ISSN 2643–1564 | |
| dc.relation.references | Wang, Qing ; Kulkarni, Sanjeev R. ; Verdú, Sergio: Divergence estimation for multidimensional densities via k-nearest-neighbor distances. En: IEEE Transactions on Information Theory 55 (2009), Nr. 5, p. 2392–2405 | |
| dc.relation.references | Yang, Ling ; Zhang, Zhilong ; Song, Yang ; Hong, Shenda ; Xu, Runsheng ; Zhao, Yue ; Zhang, Wentao ; Cui, Bin ; Yang, Ming-Hsuan. Diffusion Models: A Comprehensive Survey of Methods and Applications. 2023 | |
| dc.relation.references | Yu, Sihyun ; Kwak, Sangkyung ; Jang, Huiwon ; Jeong, Jongheon ; Huang, Jonathan ; Shin, Jinwoo ; Xie, Saining. Representation Alignment for Generation: Training Diffusion Transformers Is Easier Than You Think. 2025 | |
| dc.relation.references | Zhou, N.-R. ; Zhang, T.-F. ; Xie, X.-W. ; Wu, J.-Y.: Hybrid quantum–classical generative adversarial networks for image generation via learning discrete distribution. En: Signal Processing: Image Communication 110 (2023) | |
| dc.relation.references | Čepaitė, I. ; Coyle, B. ; Kashefi, E.: A continuous variable Born machine. En: Quantum Machine Intelligence 4 (2022) | |
| dc.rights.accessrights | info:eu-repo/semantics/openAccess | |
| dc.rights.license | Reconocimiento 4.0 Internacional | |
| dc.rights.uri | http://creativecommons.org/licenses/by/4.0/ | |
| dc.subject.blaa | Aprendizaje automático (Inteligencia artificial) | |
| dc.subject.ddc | 620 - Ingeniería y operaciones afines | |
| dc.subject.proposal | Modelos generativos | spa |
| dc.subject.proposal | Autoencoder | spa |
| dc.subject.proposal | Kernel Density Matrix | spa |
| dc.subject.proposal | Modelos inspirados en cuántica | spa |
| dc.subject.proposal | Aprendizaje automático | spa |
| dc.subject.proposal | Generación de imágenes | spa |
| dc.subject.proposal | Fréchet Inception Distance (FID) | spa |
| dc.subject.proposal | Variational Autoencoder (VAE) | spa |
| dc.subject.proposal | Redes generativas adversarias | spa |
| dc.subject.wikidata | Modelo generador | |
| dc.subject.wikidata | Generative model | |
| dc.subject.wikidata | Autocodificador | |
| dc.subject.wikidata | Autoencoder | |
| dc.subject.wikidata | Estado mixto | |
| dc.subject.wikidata | Density matrix | |
| dc.subject.wikidata | Variational auto-encoder | |
| dc.title | Implementación de un modelo de generación de imágenes basado en principios de la física cuántica. | spa |
| dc.title.translated | Implementation of an image generation model based on principles of quantum physics. | eng |
| dc.type | Trabajo de grado - Maestría | |
| dc.type.coar | http://purl.org/coar/resource_type/c_bdcc | |
| dc.type.coarversion | http://purl.org/coar/version/c_ab4af688f83e57aa | |
| dc.type.content | Text | |
| dc.type.driver | info:eu-repo/semantics/masterThesis | |
| dc.type.redcol | http://purl.org/redcol/resource_type/TM | |
| dc.type.version | info:eu-repo/semantics/acceptedVersion | |
| dcterms.audience.professionaldevelopment | Bibliotecarios | |
| dcterms.audience.professionaldevelopment | Investigadores | |
| dcterms.audience.professionaldevelopment | Maestros | |
| oaire.accessrights | http://purl.org/coar/access_right/c_abf2 |
Archivos
Bloque original
- Nombre: Tesis Final de Magíster en ingeniería de sistemas y computación.pdf
- Tamaño: 48.27 MB
- Formato: Adobe Portable Document Format
Bloque de licencias
- Nombre: license.txt
- Tamaño: 5.74 KB
- Formato: Item-specific license agreed upon to submission