Implementación de un modelo de generación de imágenes basado en principios de la física cuántica.

dc.contributor.advisorGonzalez Osorio, Fabio Augusto
dc.contributor.advisorUseche Reyes, Diego Hernan
dc.contributor.authorMora López, Andrea Carolina
dc.contributor.researchgroupMindlab
dc.date.accessioned2026-02-04T12:52:13Z
dc.date.available2026-02-04T12:52:13Z
dc.date.issued2026
dc.descriptionIlustraciones, diagramas, fotografías, gráficosspa
dc.description.abstractLa computación cuántica presenta ventajas frente a la computación clásica, en particular en tareas de optimización y generación de datos. Sin embargo, las limitaciones del hardware actual dificultan la implementación práctica de modelos generativos cuánticos a gran escala. Esto ha impulsado el desarrollo de enfoques híbridos, simulaciones clásicas y modelos inspirados en principios cuánticos. En este trabajo se proponen tres modelos generativos inspirados en principios de la computación cuántica, pero implementados sobre arquitecturas clásicas. Todos ellos comparten un método de estimación de densidad llamado Kernel Density Matrix (KDM), compuesto por tres elementos: un kernel, un conjunto de prototipos y los pesos asociados a dichos prototipos. Los modelos desarrollados son: un autoencoder entrenado de manera independiente de la capa KDM (AEKDMS), un autoencoder y una KDM entrenados conjuntamente mediante una función de pérdida combinada (AEKDM-FPC) y un autoencoder con una KDM estocástica integrada como capa intermedia en el espacio latente (AEKDM-CIS). La validación experimental se llevó a cabo utilizando los conjuntos de datos MNIST Digits, Fashion-MNIST y CELEB-A, evaluando el desempeño mediante las métricas Fréchet Inception Distance (FID), divergencia de Kullback–Leibler y log-verosimilitud (LL). Los resultados muestran que el modelo AEKDMS alcanzó el mejor desempeño global. En MNIST Digits obtuvo un FID de 21.52, una KL de 4.12 y una LL de 654.84, superando a modelos base como VAE (FID 75.43, KL 10.76, LL 582.61) y GAN (FID 35.24, KL 15.68, LL 419.56). En Fashion-MNIST, el mismo modelo logró un FID de 53.11 y una LL de 674.63, mientras que VAE y GAN alcanzaron FIDs de 149.67 y 167.62, respectivamente. Finalmente, para el conjunto CELEB-A obtuvo un FID de 182.51, comparado con los modelos de línea base, que alcanzaron valores de 216.47 y 211.83 para la GAN y el VAE, respectivamente. 
El modelo entrenado conjuntamente AEKDM-FPC se destacó en log-verosimilitud, alcanzando el mayor valor en el conjunto de MNIST Digits (LL = 745.57), aunque sacrificó calidad visual (FID = 31.74). Finalmente, el modelo AEKDM-CIS presentó un desempeño inferior en ambas bases de datos, con FID más altos (76.79 en dígitos y 194.04 en Fashion-MNIST) y log-verosimilitudes más bajas, lo que indica inestabilidad en el entrenamiento. Sin embargo, para el conjunto de mayor dimensión CELEB-A obtuvo un mejor desempeño en métricas como la log-verosimilitud (LL = 8213.04). En conjunto, los resultados confirman que los modelos inspirados en principios cuánticos, en particular el AEKDMS, superan a las líneas base clásicas tanto en calidad visual como en consistencia estadística, además de requerir arquitecturas más ligeras (61k–68k parámetros frente a los 2.9M del GAN). (Texto tomado de la fuente)spa
dc.description.abstractQuantum computing offers advantages over classical computing, particularly in optimization and data generation tasks. However, the limitations of current hardware hinder the practical implementation of large-scale quantum generative models. This has driven the development of hybrid approaches, classical simulations, and models inspired by quantum principles. In this work, three generative models inspired by principles of quantum computing are proposed, but implemented on classical architectures. All of them share a density estimation method called Kernel Density Matrix (KDM), composed of three elements: a kernel, a set of prototypes, and the weights associated with these prototypes. The developed models are: an autoencoder trained independently of the KDM layer (AEKDMS), an autoencoder and a KDM trained jointly through a combined loss function (AEKDM-FPC), and an autoencoder with a stochastic KDM integrated as an intermediate layer in the latent space (AEKDM-CIS). The experimental validation was carried out using the MNIST Digits, Fashion-MNIST, and CELEB-A datasets, evaluating performance through the Fréchet Inception Distance (FID), Kullback–Leibler divergence, and log-likelihood (LL) metrics. The results show that the AEKDMS model achieved the best overall performance. On MNIST Digits it obtained a FID of 21.52, a KL of 4.12, and an LL of 654.84, surpassing baseline models such as VAE (FID 75.43, KL 10.76, LL 582.61) and GAN (FID 35.24, KL 15.68, LL 419.56). On Fashion-MNIST, the same model achieved a FID of 53.11 and an LL of 674.63, while VAE and GAN reached FIDs of 149.67 and 167.62, respectively. Finally, for the CELEB-A dataset it obtained a FID of 182.51 compared to the baseline models, which reached values of 216.47 and 211.83 for GAN and VAE, respectively. 
The jointly trained AEKDM-FPC model stood out in log-likelihood, achieving the highest value on the MNIST Digits dataset (LL = 745.57), although it sacrificed visual quality (FID = 31.74). Finally, the AEKDM-CIS model showed inferior performance on both datasets, with higher FIDs (76.79 on digits and 194.04 on Fashion-MNIST) and lower log-likelihoods, indicating instability in training. However, for the higher-dimensional CELEB-A dataset it obtained better performance on metrics such as log-likelihood (LL = 8213.04). Overall, the results confirm that the models inspired by quantum principles, particularly AEKDMS, outperform classical baselines in both visual quality and statistical consistency, while also requiring lighter architectures (61k–68k parameters versus 2.9M for the GAN).eng
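The KDM ingredient described in the abstract (a kernel, a set of prototypes, and weights over those prototypes) can be sketched as a weighted kernel mixture. The code below is an illustrative reconstruction, not the thesis implementation: the function names `kdm_density` and `kdm_sample`, the RBF kernel choice, and the `gamma` parameter are assumptions for the sketch.

```python
import numpy as np

def kdm_density(x, prototypes, weights, gamma=1.0):
    """Unnormalized density of point x under a (kernel, prototypes, weights) triple.

    prototypes: (m, d) array of learned prototype vectors.
    weights:    (m,) nonnegative array summing to 1.
    """
    d2 = np.sum((prototypes - x) ** 2, axis=1)          # squared distance to each prototype
    return float(np.dot(weights, np.exp(-gamma * d2)))  # weighted RBF kernel responses

def kdm_sample(prototypes, weights, gamma=1.0, rng=None):
    """Ancestral sampling sketch: pick a prototype by its weight, add kernel-scale noise."""
    rng = np.random.default_rng() if rng is None else rng
    i = rng.choice(len(prototypes), p=weights)
    sigma = np.sqrt(0.5 / gamma)  # RBF width expressed as a Gaussian standard deviation
    return prototypes[i] + rng.normal(0.0, sigma, size=prototypes.shape[1])
```

In the models described above, a component like this would operate in the autoencoder's latent space: latent vectors drawn from the density are passed through the decoder to produce images.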
dc.description.degreelevelMaestría
dc.description.degreenameMagíster en ingeniería de sistemas y computación
dc.description.researchareaAprendizaje de Máquina
dc.format.extentix, 54 páginas
dc.format.mimetypeapplication/pdf
dc.identifier.instnameUniversidad Nacional de Colombiaspa
dc.identifier.reponameRepositorio Institucional Universidad Nacional de Colombiaspa
dc.identifier.repourlhttps://repositorio.unal.edu.co/spa
dc.identifier.urihttps://repositorio.unal.edu.co/handle/unal/89385
dc.language.isospa
dc.publisherUniversidad Nacional de Colombia
dc.publisher.branchUniversidad Nacional de Colombia - Sede Bogotá
dc.publisher.facultyFacultad de Ingeniería
dc.publisher.placeBogotá, Colombia
dc.publisher.programBogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación
dc.relation.referencesAgliardi, G. ; Prati, E.: Optimal Tuning of Quantum Generative Adversarial Networks for Multivariate Distribution Loading. En: Quantum Reports 4 (2022), p. 75–105
dc.relation.referencesAjagekar, A. ; You, F.: Molecular design with automated quantum computing-based deep learning and optimization. En: npj Computational Materials 9 (2023)
dc.relation.referencesAlemohammad, Sina ; Humayun, Ahmed I. ; Agarwal, Shruti ; Collomosse, John ; Baraniuk, Richard. Self-Improving Diffusion Models with Synthetic Data. 2024
dc.relation.referencesAnand, A. ; Romero, J. ; Degroote, M. ; Aspuru-Guzik, A.: Noise Robustness and Experimental Demonstration of a Quantum Generative Adversarial Network for Continuous Distributions. En: Advanced Quantum Technologies 4 (2021)
dc.relation.referencesArrasmith, Andrew ; Cerezo, M. ; Czarnik, Piotr ; Cincio, Lukasz ; Coles, Patrick J.: Effect of barren plateaus on gradient-free optimization. En: Quantum 5 (2021), oct, p. 558
dc.relation.referencesAssouel, A. ; Jacquier, A. ; Kondratyev, A.: A quantum generative adversarial network for distributions. En: Quantum Machine Intelligence 4 (2022)
dc.relation.referencesBartkiewicz, Karol ; Tulewicz, Patrycja ; Roik, Jan ; Lemr, Karel: Synergic quantum generative machine learning. En: Scientific Reports 13 (2023), August, Nr. 1. – ISSN 2045–2322
dc.relation.referencesBengio, Yoshua ; Yao, Li ; Alain, Guillaume ; Vincent, Pascal. Generalized Denoising Auto-Encoders as Generative Models. 2013
dc.relation.referencesBigdeli, Siavash A. ; Lin, Geng ; Portenier, Tiziano ; Dunbar, L. A. ; Zwicker, Matthias: Learning Generative Models using Denoising Density Estimators. En: CoRR abs/2001.02728 (2020)
dc.relation.referencesBrooks, Michael: What’s next for quantum computing. En: MIT Technology Review (2023)
dc.relation.referencesCerezo, M. ; Arrasmith, Andrew ; Babbush, Ryan ; Benjamin, Simon C. ; Endo, Suguru ; Fujii, Keisuke ; McClean, Jarrod R. ; Mitarai, Kosuke ; Yuan, Xiao ; Cincio, Lukasz ; Coles, Patrick J.: Variational quantum algorithms. En: Nature Reviews Physics 3 (2021), August, Nr. 9, p. 625–644. – ISSN 2522–5820
dc.relation.referencesChakrabarti, S. ; Huang, Y. ; Li, T. ; Feizi, S. ; Wu, X.: Quantum Wasserstein GANs. En: Advances in Neural Information Processing Systems Vol. 32, 2019
dc.relation.referencesChaudhary, S. ; Huembeli, P. ; MacCormack, I. ; Patti, T.L. ; Kossaifi, J. ; Galda, A.: Towards a scalable discrete quantum generative adversarial neural network. En: Quantum Science and Technology 8 (2023)
dc.relation.referencesGao, Xun ; Anschuetz, Eric R. ; Wang, Sheng-Tao ; Cirac, J. I. ; Lukin, Mikhail D.: Enhancing Generative Models via Quantum Correlations. En: Physical Review X 12 (2022), May, Nr. 2. – ISSN 2160–3308
dc.relation.referencesGircha, A.I. ; Boev, A.S. ; Avchaciov, K. ; Fedichev, P.O. ; Fedorov, A.K.: Hybrid quantum-classical machine learning for generative chemistry and drug design. En: Scientific Reports 13 (2023)
dc.relation.referencesGonzález, Fabio A. ; Gallego, Alejandro ; Toledo-Cortés, Santiago ; Vargas-Calderón, Vladimir: Learning with density matrices and random features. En: Quantum Machine Intelligence 4 (2022), 12. – ISSN 25244914
dc.relation.referencesGonzález, Fabio A. ; Ramos-Pollán, Raúl ; Gallego-Mejia, Joseph A.: Kernel Density Matrices for Probabilistic Deep Learning. 2023
dc.relation.referencesGoodfellow, Ian J. ; Pouget-Abadie, Jean ; Mirza, Mehdi ; Xu, Bing ; Warde-Farley, David ; Ozair, Sherjil ; Courville, Aaron ; Bengio, Yoshua. Generative Adversarial Networks. 2014
dc.relation.referencesGriffiths, David J.: Introduction to Quantum Mechanics. 2nd. Upper Saddle River, NJ : Pearson Prentice Hall, 2005. – ISBN 978–0131118928
dc.relation.referencesHeusel, Martin ; Ramsauer, Hubert ; Unterthiner, Thomas ; Nessler, Bernhard ; Hochreiter, Sepp. GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium. 2018
dc.relation.referencesJayasumana, Sadeep ; Ramalingam, Srikumar ; Veit, Andreas ; Glasner, Daniel ; Chakrabarti, Ayan ; Kumar, Sanjiv. Rethinking FID: Towards a Better Evaluation Metric for Image Generation. 2024
dc.relation.referencesKieferová, Mária ; Marrero, Carlos O. ; Wiebe, Nathan. Quantum Generative Training Using Rényi Divergences. 2021
dc.relation.referencesKingma, Diederik P. ; Welling, Max: An Introduction to Variational Autoencoders. En: Foundations and Trends® in Machine Learning 12 (2019), Nr. 4, p. 307–392
dc.relation.referencesKingma, Diederik P. ; Welling, Max. Auto-Encoding Variational Bayes. 2022
dc.relation.referencesLamb, Alex. A Brief Introduction to Generative Models. 2021
dc.relation.referencesvan Leeuwen, Steyn ; de Alba Ortíz, Alberto P. ; Dijkstra, Marjolein. A Boltzmann generator for the isobaric-isothermal ensemble. 2023
dc.relation.referencesLiu, Weizhi ; Zhang, Yufei ; Deng, Zhiqiang et al.: A hybrid quantum-classical conditional generative adversarial network algorithm for human-centered paradigm in cloud. En: EURASIP Journal on Wireless Communications and Networking 2021 (2021), Nr. 1, p. 37
dc.relation.referencesMiranda, E.R. ; Shaji, H.: Generative Music with Partitioned Quantum Cellular Automata. En: Applied Sciences (Switzerland) 13 (2023)
dc.relation.referencesMiyahara, H. ; Roychowdhury, V.: Quantum advantage in variational Bayes inference. En: Proceedings of the National Academy of Sciences of the United States of America 120 (2023)
dc.relation.referencesMoussa, Charles ; Wang, Hao ; Araya-Polo, Mauricio ; Bäck, Thomas ; Dunjko, Vedran: Application of quantum-inspired generative models to small molecular datasets. En: 2023 IEEE International Conference on Quantum Computing and Engineering (QCE), IEEE, September 2023, p. 342–348
dc.relation.referencesNg, Joseph ; Abbott, Derek: Solid state quantum computers: a nanoscopic solution to the Moore’s law problem. En: Abbott, Derek (Ed.) ; Varadan, Vijay K. (Ed.) ; Boehringer, Karl F. (Ed.): Smart Electronics and MEMS II Vol. 4236 International Society for Optics and Photonics, SPIE, 2001, p. 89 – 98
dc.relation.referencesNielsen, Michael A. ; Chuang, Isaac L.: Quantum Computation and Quantum Information. 10th Anniversary Edition. Cambridge University Press, 2010
dc.relation.referencesPaine, A.E. ; Elfving, V.E. ; Kyriienko, O.: Quantum Quantile Mechanics: Solving Stochastic Differential Equations for Generating Time-Series. En: Advanced Quantum Technologies (2023)
dc.relation.referencesPlesovskaya, Ekaterina ; Ivanov, Sergey: An Empirical Analysis of KDE-based Generative Models on Small Datasets. En: Procedia Computer Science 193 (2021), p. 442–452. – ISSN 1877–0509
dc.relation.referencesRomero, J. ; Aspuru-Guzik, A.: Variational Quantum Generators: Generative Adversarial Quantum Machine Learning for Continuous Distributions. En: Advanced Quantum Technologies 4 (2021)
dc.relation.referencesGamble, S.: Quantum Computing: What It Is, Why We Want It, and How We’re Trying to Get It. En: Frontiers of Engineering: Reports on Leading-Edge Engineering from the 2018 Symposium. Washington (DC): National Academies Press (US), 2019
dc.relation.referencesSalakhutdinov, Ruslan ; Hinton, Geoffrey E.: Deep Boltzmann Machines. En: Journal of Machine Learning Research - Proceedings Track 5 (2009), p. 448–455
dc.relation.referencesSilverman, Bernard W.: Density Estimation for Statistics and Data Analysis. Chapman and Hall/CRC, 1998 (Chapman & Hall/CRC Monographs on Statistics & Applied Probability). – ISBN 9780412246203; 0412246201
dc.relation.referencesSitu, H. ; He, Z. ; Wang, Y. ; Li, L. ; Zheng, S.: Quantum generative adversarial network for generating discrete distribution. En: Information Sciences 538 (2020), p. 193–208
dc.relation.referencesSleeman, J. ; Dorband, J. ; Halem, M.: A hybrid quantum enabled RBM advantage: Convolutional autoencoders for quantum image compression and generative learning. En: Proc. SPIE Vol. 11391, 2020. – ISBN 9781510635593
dc.relation.referencesStein, Samuel A. ; Baheri, Betis ; Chen, Daniel ; Mao, Ying ; Guan, Qiang ; Li, Ang ; Fang, Bo ; Xu, Shuai: QuGAN: A Quantum State Fidelity based Generative Adversarial Network. En: 2021 IEEE International Conference on Quantum Computing and Engineering (QCE), IEEE, October 2021, p. 71–81
dc.relation.referencesTakida, Yuhta ; Imaizumi, Masaaki ; Shibuya, Takashi ; Lai, Chieh-Hsin ; Uesaka, Toshimitsu ; Murata, Naoki ; Mitsufuji, Yuki. SAN: Inducing Metrizability of GAN with Discriminative Normalized Linear Layer. 2024
dc.relation.referencesTan, Jing ; Yin, Sixing ; Zhao, Shuo: Generative Model with Kernel Density Estimation. 2018, p. 304–308
dc.relation.referencesTian, Jinkai ; Sun, Xiaoyu ; Du, Yuxuan ; Zhao, Shanshan ; Liu, Qing ; Zhang, Kaining ; Yi, Wei ; Huang, Wanrong ; Wang, Chaoyue ; Wu, Xingyao ; Hsieh, Min-Hsiu ; Liu, Tongliang ; Yang, Wenjing ; Tao, Dacheng: Recent Advances for Quantum Neural Networks in Generative Learning. En: IEEE Transactions on Pattern Analysis and Machine Intelligence (2022), 6
dc.relation.referencesTran, Ba-Hien ; Rossi, Simone ; Milios, Dimitrios ; Michiardi, Pietro ; Bonilla, Edwin V. ; Filippone, Maurizio. Model Selection for Bayesian Autoencoders. 2021
dc.relation.referencesTychola, K.A. ; Kalampokas, T. ; Papakostas, G.A.: Quantum Machine Learning—An Overview. En: Electronics (Switzerland) 12 (2023)
dc.relation.referencesVagenas, Elias C. ; Farag Ali, Ahmed ; Alshal, Hassan: GUP and the no-cloning theorem. En: The European Physical Journal C 79 (2019), Nr. 3, p. 276
dc.relation.referencesWall, Michael L. ; Abernathy, Matthew R. ; Quiroz, Gregory: Generative machine learning with tensor networks: Benchmarks on near-term quantum computers. En: Physical Review Research 3 (2021), April, Nr. 2. – ISSN 2643–1564
dc.relation.referencesWang, Qing ; Kulkarni, Sanjeev R. ; Verdú, Sergio: Divergence estimation for multidimensional densities via k-nearest-neighbor distances. En: IEEE Transactions on Information Theory 55 (2009), Nr. 5, p. 2392–2405
dc.relation.referencesYang, Ling ; Zhang, Zhilong ; Song, Yang ; Hong, Shenda ; Xu, Runsheng ; Zhao, Yue ; Zhang, Wentao ; Cui, Bin ; Yang, Ming-Hsuan. Diffusion Models: A Comprehensive Survey of Methods and Applications. 2023
dc.relation.referencesYu, Sihyun ; Kwak, Sangkyung ; Jang, Huiwon ; Jeong, Jongheon ; Huang, Jonathan ; Shin, Jinwoo ; Xie, Saining. Representation Alignment for Generation: Training Diffusion Transformers Is Easier Than You Think. 2025
dc.relation.referencesZhou, N.-R. ; Zhang, T.-F. ; Xie, X.-W. ; Wu, J.-Y.: Hybrid quantum–classical generative adversarial networks for image generation via learning discrete distribution. En: Signal Processing: Image Communication 110 (2023)
dc.relation.referencesČepaitė, I. ; Coyle, B. ; Kashefi, E.: A continuous variable Born machine. En: Quantum Machine Intelligence 4 (2022)
dc.rights.accessrightsinfo:eu-repo/semantics/openAccess
dc.rights.licenseReconocimiento 4.0 Internacional
dc.rights.urihttp://creativecommons.org/licenses/by/4.0/
dc.subject.blaaAprendizaje automático (Inteligencia artificial)
dc.subject.ddc620 - Ingeniería y operaciones afines
dc.subject.proposalModelos generativosspa
dc.subject.proposalAutoencoderspa
dc.subject.proposalKernel Density Matrixspa
dc.subject.proposalModelos inspirados en cuánticaspa
dc.subject.proposalAprendizaje automáticospa
dc.subject.proposalGeneración de imágenesspa
dc.subject.proposalFréchet Inception Distance (FID)spa
dc.subject.proposalVariational Autoencoder (VAE)spa
dc.subject.proposalRedes generativas adversariasspa
dc.subject.wikidataModelo generador
dc.subject.wikidataGenerative model
dc.subject.wikidataAutocodificador
dc.subject.wikidataAutoencoder
dc.subject.wikidataEstado mixto
dc.subject.wikidataDensity matrix
dc.subject.wikidataVariational auto-encoder
dc.titleImplementación de un modelo de generación de imágenes basado en principios de la física cuántica.spa
dc.title.translatedImplementation of an image generation model based on principles of quantum physics.eng
dc.typeTrabajo de grado - Maestría
dc.type.coarhttp://purl.org/coar/resource_type/c_bdcc
dc.type.coarversionhttp://purl.org/coar/version/c_ab4af688f83e57aa
dc.type.contentText
dc.type.driverinfo:eu-repo/semantics/masterThesis
dc.type.redcolhttp://purl.org/redcol/resource_type/TM
dc.type.versioninfo:eu-repo/semantics/acceptedVersion
dcterms.audience.professionaldevelopmentBibliotecarios
dcterms.audience.professionaldevelopmentInvestigadores
dcterms.audience.professionaldevelopmentMaestros
oaire.accessrightshttp://purl.org/coar/access_right/c_abf2

Files

Original bundle

Name: Tesis Final de Magíster en ingeniería de sistemas y computación.pdf
Size: 48.27 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 5.74 KB
Format: Item-specific license agreed upon to submission