Enhanced interpretability using regression networks for assessing domain dependences in motor imagery
| dc.contributor.advisor | Castellanos Domínguez, César Germán | |
| dc.contributor.advisor | Álvarez Meza, Andrés Marino | |
| dc.contributor.author | Gómez Morales, Óscar Wladimir | |
| dc.contributor.orcid | Gómez Morales, Óscar Wladimir [0000-0003-4654-7231] | |
| dc.contributor.researchgroup | Grupo de Control y Procesamiento Digital de Señales | |
| dc.date.accessioned | 2026-01-22T20:40:01Z | |
| dc.date.available | 2026-01-22T20:40:01Z | |
| dc.date.issued | 2025 | |
| dc.description | gráficas | spa |
| dc.description.abstract | The growing demand for efficient and accessible brain-computer interface (BCI) systems has driven research into methods that enhance the interpretation of electroencephalographic (EEG) signals. In this context, improving the accuracy and transparency of computational models represents a crucial challenge for advancing applications such as neurorehabilitation and human–machine interaction. However, three fundamental problems persist in the development of BCI systems: (a) the non-stationarity of EEG signals, which compromises model stability; (b) the high intra- and inter-subject variability in EEG recordings, which limits generalizability; and (c) the low physiological interpretability of the models used, which hinders neuroscientific validation. To address these challenges, this thesis proposes a methodological framework based on machine learning models and functional connectivity techniques, aimed at optimizing the prediction, classification, and interpretation of EEG signals in BCI systems. Three strategies were developed, aligned with the aforementioned problems: (i) a regularized regression model to predict full-channel EEG from a reduced subset of electrodes, mitigating nonstationarity and improving classification performance; (ii) a supervised feature extraction strategy to predict musical emotions in MIDI format, applying domain alignment and neural clustering techniques to address intra- and inter-subject variability; and (iii) a functional connectivity analysis approach using the Weighted Phase Lag Index (wPLI) and graph-based representations, aimed at enhancing interpretability through the detection of pre-training brain desynchronization patterns. The results show that the predictive model based on Elastic Net regression achieved an average classification accuracy of 78.16%, outperforming traditional approaches using either reduced or full-channel data. 
In the second objective, the combination of domain alignment and latent encoding improved emotional prediction and musical synthesis in MIDI, preserving both temporal and tonal coherence. Lastly, the functional connectivity analysis enabled the identification of key brain regions involved in desynchronization, associated with different levels of proficiency in BCI paradigms. The most relevant conclusions include: (i) regularized regression can estimate high-fidelity signals from a small number of electrodes, promoting the development of more cost-effective and portable systems; (ii) shared representations between EEG and musical data enable more accurate emotional prediction, contributing to the design of personalized neurointeraction environments; and (iii) the use of functional connectivity metrics provides validated physiological explanations, enhancing confidence in classification models. As future work, the proposed methodology will be validated using broader and more diverse datasets, and adversarial generative architectures will be incorporated to strengthen transferability across subjects and experimental conditions. | eng |
| dc.description.abstract | La creciente demanda de sistemas cerebro-computadora (BCI) eficientes y accesibles ha impulsado la investigación sobre métodos que optimicen la interpretación de señales electroencefalográficas (EEG). En este contexto, mejorar la precisión y transparencia de los modelos computacionales representa un desafío crucial para avanzar en aplicaciones como la neurorehabilitación y la interacción humano–máquina. Sin embargo, persisten tres problemas fundamentales en el desarrollo de sistemas BCI: (a) la no estacionariedad de las señales EEG, que compromete la estabilidad de los modelos; (b) la alta variabilidad intra e intersujeto en los registros EEG, que limita la generalización; y (c) la baja interpretabilidad fisiológica de los modelos utilizados, dificultando su validación neurocientífica. Para abordar estos retos, esta tesis propone una arquitectura metodológica basada en modelos de aprendizaje automático y técnicas de conectividad funcional, orientada a optimizar la predicción, clasificación e interpretación de señales EEG en sistemas BCI. Se desarrollaron tres estrategias alineadas con los problemas mencionados: (i) un modelo de regresión regularizada para predecir canales completos a partir de un subconjunto reducido de electrodos, mitigando la no estacionariedad y mejorando el rendimiento en tareas de clasificación; (ii) una estrategia de extracción de características supervisada para predecir emociones musicales en formato MIDI, aplicando técnicas de alineación entre dominios y agrupamiento neuronal para enfrentar la variabilidad intra e intersujeto; y (iii) un enfoque de análisis de conectividad funcional mediante el índice ponderado de acoplamiento de fase (wPLI) y representaciones en grafos, dirigido a mejorar la interpretabilidad mediante la detección de patrones de desincronización cerebral previos al entrenamiento. 
Los resultados muestran que el modelo predictivo basado en regresión elástica logró una precisión promedio del 78.16 %, superando a enfoques tradicionales con datos reducidos o completos. En el segundo objetivo, se evidenció que la combinación de alineación de dominios y codificación latente mejoró la predicción emocional y la síntesis musical en MIDI, preservando la coherencia temporal y tonal. Por último, el análisis de conectividad funcional permitió identificar regiones clave de desincronización cerebral, asociadas a distintos niveles de habilidad en la práctica de paradigmas BCI. Las conclusiones más relevantes incluyen: (i) la regresión regularizada puede estimar señales de alta fidelidad a partir de pocos electrodos, favoreciendo sistemas más económicos y portables; (ii) la representación compartida entre EEG y datos musicales permite una predicción emocional más precisa, contribuyendo al diseño de entornos de neurointeracción personalizados; y (iii) el uso de métricas de conectividad funcional aporta explicaciones fisiológicas validadas, que mejoran la confianza en los modelos de clasificación. Como trabajo futuro, se plantea validar la metodología propuesta en bases de datos más amplias y diversas, e incorporar arquitecturas generativas adversariales para fortalecer las capacidades de transferencia entre sujetos y condiciones (Texto tomado de la fuente). | spa |
| dc.description.curriculararea | Eléctrica, Electrónica, Automatización y Telecomunicaciones. Sede Manizales | |
| dc.description.degreelevel | Doctorado | |
| dc.description.degreename | Doctor en Ingeniería - Automática | |
| dc.description.methods | Research methodology. This research adopts a quantitative, experimental, and computational methodology oriented toward the analysis, modeling, and interpretation of electroencephalographic (EEG) signals in Brain–Computer Interface (BCI) systems based on Motor Imagery (MI). The methodological approach integrates signal processing, machine learning, deep learning, and functional connectivity analysis techniques to address three fundamental problems: the non-stationarity of EEG signals, the high intra- and inter-subject variability, and the low physiological interpretability of the models. General methodological design: the methodology is structured along three experimental axes, aligned with the specific objectives of the thesis: (1) predictive modeling of EEG signals through regularized regression, aimed at estimating full-channel EEG from a reduced subset of electrodes; (2) supervised feature extraction for predicting music-induced emotions, using EEG representations and musical data in MIDI format; and (3) brain functional connectivity analysis based on the Weighted Phase Lag Index (wPLI) and graph representations, intended to improve neurophysiological interpretability. Each axis follows a common methodological flow: data acquisition, preprocessing, modeling, experimental validation, and analysis of results. Modeling and techniques employed: 1. EEG signal prediction and classification. An Elastic Net regression model was implemented to estimate multichannel EEG signals from a reduced set of central electrodes. The predicted signals were subsequently used for spatial feature extraction via Common Spatial Patterns (CSP) and for classifying motor imagery tasks with Support Vector Machines (SVM). 2. Emotion prediction and MIDI generation. A supervised deep learning framework was developed, including EEGNet-type convolutional neural networks for EEG feature extraction, deep autoencoders to generate latent representations, and EEG–MIDI domain alignment via Centered Kernel Alignment (CKA). 3. Functional connectivity analysis. To improve physiological interpretability, functional connectivity between EEG channels was computed using the Weighted Phase Lag Index (wPLI). The results were modeled as brain graphs, enabling the identification of pre-training desynchronization patterns and the analysis of differences between subjects with distinct levels of BCI performance. | |
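The first methodological axis — predicting full-channel EEG from a reduced electrode subset with regularized regression — can be sketched as follows. This is a minimal illustration using scikit-learn's `MultiTaskElasticNet` on synthetic data; the channel counts, regularization values (`alpha`, `l1_ratio`), and the simulated signals are assumptions for the example, not the configuration used in the thesis.

```python
import numpy as np
from sklearn.linear_model import MultiTaskElasticNet

rng = np.random.default_rng(0)

# Synthetic stand-in for band-passed EEG: 1000 samples,
# 8 "measured" central channels and 22 "target" full-montage channels.
n_samples, n_measured, n_full = 1000, 8, 22
X = rng.standard_normal((n_samples, n_measured))            # reduced electrode subset
W = 0.5 * rng.standard_normal((n_measured, n_full))         # hypothetical linear mixing
Y = X @ W + 0.1 * rng.standard_normal((n_samples, n_full))  # full-channel targets

# Elastic Net combines L1 and L2 penalties on least squares; the
# multi-task variant fits all target channels jointly, sharing one
# sparsity pattern across the predicted channels.
model = MultiTaskElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=5000)
model.fit(X, Y)

Y_hat = model.predict(X)        # reconstructed full-channel EEG
r2 = model.score(X, Y)          # in-sample coefficient of determination
print(f"coefficient matrix: {model.coef_.shape}, in-sample R^2 = {r2:.3f}")
```

In the thesis pipeline, the reconstructed channels would then feed CSP feature extraction and an SVM classifier; here the sketch stops at the regression step.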
| dc.description.researcharea | Inteligencia Artificial | |
| dc.description.technicalinfo | The document describes the design and implementation of a computational framework for the analysis, modeling, and interpretation of electroencephalographic (EEG) signals in motor imagery–based Brain–Computer Interface (BCI) systems. The methodology includes digital signal processing techniques (band-pass filtering in the μ and β frequency bands, temporal segmentation, and normalization), Elastic Net regularized regression models for multivariate EEG channel prediction, and spatial feature extraction using Common Spatial Patterns (CSP). Classification is performed using Support Vector Machines (SVM), while advanced analysis incorporates deep learning architectures such as EEGNet and autoencoders for supervised feature extraction. Additionally, functional brain connectivity analysis is implemented using the Weighted Phase Lag Index (wPLI) and graph-based representations. The experimental framework relies on computational tools for data analysis and machine learning, enabling reproducibility and facilitating integration of the proposed approach into other BCI systems. | eng |
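The connectivity metric named above, the Weighted Phase Lag Index (wPLI), can be illustrated with a short self-contained sketch. It estimates wPLI from the imaginary part of the cross-spectrum of Hilbert analytic signals; the 10 Hz (μ-band) test signals, sampling rate, and phase lag are invented for demonstration and do not reproduce the thesis pipeline.

```python
import numpy as np
from scipy.signal import hilbert

def wpli(x, y):
    """Weighted Phase Lag Index between two band-limited signals.

    Uses the imaginary part of the cross-spectrum of the analytic
    signals; values near 1 indicate a consistent phase lead/lag,
    while zero-lag (volume-conduction-like) coupling yields 0.
    """
    sx, sy = hilbert(x), hilbert(y)
    im_cross = np.imag(sx * np.conj(sy))   # imaginary cross-spectrum terms
    denom = np.mean(np.abs(im_cross))
    return 0.0 if denom == 0 else np.abs(np.mean(im_cross)) / denom

# Illustrative check with a fixed 45-degree phase lag at 10 Hz (mu band):
fs = 250
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
a = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * 10 * t - np.pi / 4) + 0.1 * rng.standard_normal(t.size)
print(f"wPLI(a, b) = {wpli(a, b):.3f}")   # close to 1 for a stable lag
```

Computing this index for every channel pair yields the adjacency matrix of the brain graphs mentioned above, on which graph-based measures can then be evaluated.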
| dc.format.extent | xiii, 130 páginas | |
| dc.format.mimetype | application/pdf | |
| dc.identifier.instname | Universidad Nacional de Colombia | spa |
| dc.identifier.reponame | Repositorio Institucional Universidad Nacional de Colombia | spa |
| dc.identifier.repourl | https://repositorio.unal.edu.co/ | spa |
| dc.identifier.uri | https://repositorio.unal.edu.co/handle/unal/89303 | |
| dc.language.iso | eng | |
| dc.publisher | Universidad Nacional de Colombia | |
| dc.publisher.branch | Universidad Nacional de Colombia - Sede Manizales | |
| dc.publisher.faculty | Facultad de Ingeniería y Arquitectura | |
| dc.publisher.place | Manizales, Colombia | |
| dc.publisher.program | Manizales - Ingeniería y Arquitectura - Doctorado en Ingeniería - Automática | |
| dc.relation.references | [Feng et al., 2024] Feng, Y.; Li, J. & Song, X.: , 2024; Testing conditional quantile independence with functional covariate; Biometrics; 80 (2): ujae036. | |
| dc.relation.references | [Fernando et al., 2024] Fernando, P.; Mahanama, T. V. & Wickramasinghe, M.: , 2024; Assessment of human emotional responses to ai–composed music: A systematic literature review; 2024 International Research Conference on Smart Computing and Systems Engineering (SCSE); 7: 1–6. | |
| dc.relation.references | [Gal & Ghahramani, 2016] Gal, Y. & Ghahramani, Z.: , 2016; Dropout as a bayesian approximation: Representing model uncertainty in deep learning; in Proceedings of The 33rd International Conference on Machine Learning; PMLR; pp. 1050–1059. | |
| dc.relation.references | [Gallo & Phung, 2022] Gallo, A. & Phung, M. D.: , 2022; Classification of eeg motor imagery using deep learning for brain-computer interface systems; arXiv preprint arXiv:2206.07655. | |
| dc.relation.references | [Gao et al., 2020] Gao, Y.; Gao, B.; Chen, Q.; Liu, J. & Zhang, Y.: , 2020; Deep convolutional neural network-based epileptic electroencephalogram (eeg) signal classification; Frontiers in neurology; 11: 375. | |
| dc.relation.references | [Garcia Murillo, 2024] Garcia Murillo, D. G.: , 2024; Regularized gaussian functional connectivity network with post-hoc interpretation for improved eeg-based motor imagery bci classification. | |
| dc.relation.references | [García-Murillo et al., 2021] García-Murillo, D. G.; Alvarez-Meza, A. & Castellanos-Dominguez, G.: , 2021; Single-trial kernel-based functional connectivity for enhanced feature extraction in motor-related tasks; Sensors; 21 (8): 2750. | |
| dc.relation.references | [García-Murillo et al., 2023] García-Murillo, D. G.; Álvarez-Meza, A. M. & Castellanos-Dominguez, C. G.: , 2023; Kcs-fcnet: Kernel cross-spectral functional connectivity network for eeg-based motor imagery classification; Diagnostics; 13 (6): 1122. | |
| dc.relation.references | [Gaur et al., 2021a] Gaur, P.; Gupta, H.; Chowdhury, A.; McCreadie, K.; Pachori, R. B. & Wang, H.: , 2021a; A sliding window common spatial pattern for enhancing motor imagery classification in eeg-bci; IEEE Transactions on Instrumentation and Measurement; 70: 1–9. | |
| dc.relation.references | [Gaur et al., 2021b] Gaur, P.; McCreadie, K.; Pachori, R. B.; Wang, H. & Prasad, G.: , 2021b; An automatic subject specific channel selection method for enhancing motor imagery classification in eeg-bci using correlation; Biomedical Signal Processing and Control; 68: 102574. | |
| dc.relation.references | [Geisser & Eddy, 1979] Geisser, S. & Eddy, W. F.: , 1979; A predictive approach to model selection; Journal of the American Statistical Association; 74 (365): 153–160. | |
| dc.relation.references | [Géron, 2022] Géron, A.: , 2022; Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow; O’Reilly Media, Inc. | |
| dc.relation.references | [Goldman, 2024] Goldman, A.: , 2024; Neuroscience in music research: Critical challenges and contributions; Music Perception: 1–18. | |
| dc.relation.references | [Gómez Marín et al., 2018] Gómez Marín, D. et al.: , 2018; Similarity and style in electronic dance music drum rhythms; Doctoral thesis; Universitat Pompeu Fabra. | |
| dc.relation.references | [Gonzalez & Vargas, 2022] Gonzalez, R. & Vargas, L.: , 2022; Deep learning-based mibci classification: A comparative study; Neural Computing and Applications; 34: 10147–10159. | |
| dc.relation.references | [Gordienko et al., 2021] Gordienko, Y.; Kostiukevych, K.; Gordienko, N.; Rokovyi, O.; Alienin, O. & Stirenko, S.: , 2021; Deep learning for grasp-and-lift movement forecasting based on electroencephalography by brain-computer interface; in Advances in Artificial Systems for Logistics Engineering; Springer; pp. 3–12. | |
| dc.relation.references | [Gour et al., 2023] Gour, N.; Hassan, T.; Owais, M.; Ganapathi, I. I.; Khanna, P.; Seghier, M. L. & Werghi, N.: , 2023; Transformers for autonomous recognition of psychiatric dysfunction via raw and imbalanced eeg signals; Brain informatics; 10 (1): 25. | |
| dc.relation.references | [Grajski et al., 1986] Grajski, K. A.; Breiman, L.; Di Prisco, G. V. & Freeman, W. J.: , 1986; Classification of eeg spatial patterns with a tree-structured methodology: Cart; IEEE transactions on biomedical engineering; (12): 1076–1086. | |
| dc.relation.references | [Grigorev et al., 2021] Grigorev, N. A.; Savosenkov, A. O.; Lukoyanov, M. V.; Udoratina, A.; Shusharina, N. N.; Kaplan, A. Y.; Hramov, A. E.; Kazantsev, V. B. & Gordleeva, S.: , 2021; A bci-based vibrotactile neurofeedback training improves motor cortical excitability during motor imagery; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 29: 1583–1592. | |
| dc.relation.references | [Gu et al., 2024a] Gu, X.; Jiang, L.; Chen, H.; Li, M. & Liu, C.: , 2024a; Exploring brain dynamics via eeg and steady-state activation map networks in music composition; Brain Sciences; 14 (3): 216; doi:10.3390/brainsci14030216; URL https://doi.org/10.3390/brainsci14030216. | |
| dc.relation.references | [Gu et al., 2024b] Gu, X.; Jiang, L.; Chen, H.; Li, M. & Liu, C.: , 2024b; Exploring brain dynamics via eeg and steady-state activation map networks in music composition; Brain Sciences; 14 (3). | |
| dc.relation.references | [Gundu & Panem, 2025] Gundu, S. R. & Panem, C. A.: , 2025; Chapter 14 - a review on contemporary brain–computer interface researches and limitations; in Brain-Computer Interfaces (edited by El-Baz, A. S. & Suri, J. S.); Advances in Neural Engineering; Academic Press; pp. 287–295. | |
| dc.relation.references | [Hameed et al., 2024] Hameed, A.; Fourati, R.; Ammar, B.; Ksibi, A.; Alluhaidan, A. S.; Ayed, M. B. & Khleaf, H. K.: , 2024; Temporal–spatial transformer based motor imagery classification for bci using independent component analysis; Biomedical Signal Processing and Control; 87: 105359. | |
| dc.relation.references | [Hammer et al., 2012] Hammer, E. M.; Halder, S.; Blankertz, B.; Sannelli, C.; Dickhaus, T.; Kleih, S.; Müller, K.-R. & Kübler, A.: , 2012; Psychological predictors of smr-bci performance; Biological psychology; 89 (1): 80–86. | |
| dc.relation.references | [Han et al., 2022] Han, D.; Kong, Y.; Han, J. & Wang, G.: , 2022; A survey of music emotion recognition; Frontiers of Computer Science; 16 (6): 166335. | |
| dc.relation.references | [He et al., 2023] He, C.; Chen, Y.-Y.; Phang, C.-R.; Stevenson, C.; Chen, I.-P.; Jung, T.-P. & Ko, L.-W.: , 2023; Diversity and suitability of the state-of-the-art wearable and wireless eeg systems review; IEEE Journal of Biomedical and Health Informatics; 27 (8): 3830–3843. | |
| dc.relation.references | [He et al., 2024] He, Y.; Liu, Z.; Chen, J.; Tian, Z.; Liu, H.; Chi, X. & Chen, Q.: , 2024; Llms meet multimodal generation and editing: A survey; arXiv preprint; arXiv:2405.19334; URL https://arxiv.org/abs/2405.19334. | |
| dc.relation.references | [Hossain et al., 2023] Hossain, K. M.; Islam, M. A.; Hossain, S.; Nijholt, A. & Ahad, M. A. R.: , 2023; Status of deep learning for eeg-based brain–computer interface applications; Frontiers in computational neuroscience; 16: 1006763. | |
| dc.relation.references | [Hu et al., 2022] Hu, H.; Pu, Z.; Li, H.; Liu, Z. & Wang, P.: , 2022; Learning optimal time-frequency-spatial features by the cissa-csp method for motor imagery eeg classification; Sensors; 22 (21): 8526. | |
| dc.relation.references | [Hu et al., 2024] Hu, M.; Ren, J.; Pan, Y.; Cheng, L.; Xu, X.; Tan, C. L.; Sun, H.; Shi, Y. & Yan, S.: , 2024; Scaled elastic hydrogel interfaces for brain electrophysiology; Advanced Functional Materials: 2407926. | |
| dc.relation.references | [Huang et al., 2021] Huang, D.; Chen, S.; Liu, C.; Zheng, L.; Tian, Z. & Jiang, D.: , 2021; Differences first in asymmetric brain: A bi-hemisphere discrepancy convolutional neural network for eeg emotion recognition; Neurocomputing; 448: 140–151. | |
| dc.relation.references | [Hurtado-Rincón et al., 2016] Hurtado-Rincón, J. V.; Martínez-Vargas, J. D.; Rojas-Jaramillo, S.; Giraldo, E. & Castellanos-Dominguez, G.: , 2016; Identification of relevant inter-channel eeg connectivity patterns: a kernel-based supervised approach; in Brain Informatics and Health: International Conference, BIH 2016, Omaha, NE, USA, October 13-16, 2016, Proceedings; Springer; pp. 14–23. | |
| dc.relation.references | [Ibrahim et al., 2015] Ibrahim, E. F.; Richardson, M. D. & Nestel, D.: , 2015; Mental imagery and learning: a qualitative study in orthopaedic trauma surgery; Medical education; 49 (9): 888–900. | |
| dc.relation.references | [Iyer et al., 2015] Iyer, P. M.; Egan, C.; Pinto-Grau, M.; Burke, T.; Elamin, M.; Nasseroleslami, B.; Pender, N.; Lalor, E. C. & Hardiman, O.: , 2015; Functional connectivity changes in resting-state eeg as potential biomarker for amyotrophic lateral sclerosis; PloS one; 10 (6): e0128682. | |
| dc.relation.references | [Jaipriya & Sriharipriya, 2024a] Jaipriya, D. & Sriharipriya, K.: , 2024a; Brain computer interface-based signal processing techniques for feature extraction and classification of motor imagery using eeg: A literature review; Biomedical Materials & Devices; 2 (2): 601–613. | |
| dc.relation.references | [Jaipriya & Sriharipriya, 2024b] Jaipriya, D. & Sriharipriya, K. C.: , 2024b; Brain computer interface-based signal processing techniques for feature extraction and classification of motor imagery using eeg: A literature review; Biomedical Materials & Devices; 2 (2): 601–613. | |
| dc.relation.references | [Jaiswal et al., 2023] Jaiswal, G.; Rani, R.; Mangotra, H. & Sharma, A.: , 2023; Integration of hyperspectral imaging and autoencoders: Benefits, applications, hyperparameter tunning and challenges; Computer Science Review; 50: 100584. | |
| dc.relation.references | [Jamshidi et al., 2024a] Jamshidi, F.; Pike, G.; Das, A. & Chapman, R.: , 2024a; Machine learning techniques in automatic music transcription: A systematic survey; ArXiv; abs/2406.15249. | |
| dc.relation.references | [Jamshidi et al., 2024b] Jamshidi, F.; Pike, G.; Das, A. & Chapman, R.: , 2024b; Machine learning techniques in automatic music transcription: A systematic survey; arXiv preprint arXiv:2406.15249. | |
| dc.relation.references | [Janapati et al., 2023] Janapati, R.; Dalal, V. & Sengupta, R.: , 2023; Advances in modern eeg-bci signal processing: A review; Materials Today: Proceedings; 80: 2563–2566. | |
| dc.relation.references | [Jeannerod, 2001a] Jeannerod, M.: , 2001a; Neural simulation of action: a unifying mechanism for motor cognition; Neuroimage; 14 (1): S103–S109. | |
| dc.relation.references | [Jeannerod, 2001b] Jeannerod, M.: , 2001b; Neural simulation of action: a unifying mechanism for motor cognition; Neuroimage; 14 (1): S103–S109. | |
| dc.relation.references | [Ji et al., 2020] Ji, S.; Luo, J. & Yang, X.: , 2020; A comprehensive survey on deep music generation: Multi-level representations, algorithms, evaluations, and future directions; ArXiv; abs/2011.06801. | |
| dc.relation.references | [Ji et al., 2023] Ji, S.; Yang, X. & Luo, J.: , 2023; A survey on deep learning for symbolic music generation: Representations, algorithms, evaluations, and challenges; ACM Computing Surveys; 56: 1 – 39. | |
| dc.relation.references | [Jiang et al., 2023] Jiang, H.; Shen, F.; Chen, L.; Peng, Y.; Guo, H. & Gao, H.: , 2023; Joint domain symmetry and predictive balance for cross-dataset eeg emotion recognition; Journal of Neuroscience Methods; 400: 109978. | |
| dc.relation.references | [Jiang et al., 2021a] Jiang, P.; Zhang, C.; Hou, Q.; Cheng, M. & Wei, Y.: , 2021a; Layercam: Exploring hierarchical class activation maps for localization; IEEE Transactions on Image Processing; 30: 5875–5888. | |
| dc.relation.references | [Jiang et al., 2021b] Jiang, P.-T.; Zhang, C.-B.; Hou, Q.; Cheng, M.-M. & Wei, Y.: , 2021b; Layercam: Exploring hierarchical class activation maps for localization; IEEE Transactions on Image Processing; 30: 5875–5888. | |
| dc.relation.references | [Jiang & Chen, 2019] Jiang, T. & Chen, H.: , 2019; Electromagnetic artifacts in eeg: Impact and removal strategies; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 27 (4): 789–798. | |
| dc.relation.references | [Jiang et al., 2024] Jiang, X.; Meng, L.; Chen, X.; Xu, Y. & Wu, D.: , 2024; Csp-net: Common spatial pattern empowered neural networks for eeg-based motor imagery classification; Knowledge-Based Systems; 305: 112668. | |
| dc.relation.references | [Jung & Oh, 2021] Jung, H. & Oh, Y.: , 2021; Towards better explanations of class activation mapping; in Proceedings of the IEEE/CVF International Conference on Computer Vision; Montreal, QC, Canada; pp. 1336–1344. | |
| dc.relation.references | [Kang & Herremans, 2024] Kang, J. & Herremans, D.: , 2024; Are we there yet? a brief survey of music emotion prediction datasets, models and outstanding challenges; ArXiv; abs/2406.08809. | |
| dc.relation.references | [Katona, 2022] Katona, J.: , 2022; Measuring cognition load using eye-tracking parameters based on algorithm description tools; Sensors; 22 (3): 912. | |
| dc.relation.references | [Katona & Kovari, 2018] Katona, J. & Kovari, A.: , 2018; The evaluation of bci and pebl-based attention tests; Acta Polytechnica Hungarica; 15 (3): 225–249. | |
| dc.relation.references | [Kim et al., 2023] Kim, H.; Luo, J.; Chu, S.; Cannard, C.; Hoffmann, S. & Miyakoshi, M.: , 2023; Ica’s bug: How ghost ics emerge from effective rank deficiency caused by eeg electrode interpolation and incorrect re-referencing; Frontiers in Signal Processing; 3: 1064138. | |
| dc.relation.references | [Kim et al., 2022] Kim, S.-J.; Lee, D.-H. & Lee, S.-W.: , 2022; Rethinking cnn architecture for enhancing decoding performance of motor imagery-based eeg signals; IEEE Access; 10: 96984–96996. | |
| dc.relation.references | [Kim et al., 2018] Kim, Y. K.; Park, E.; Lee, A.; Im, C.-H. & Kim, Y.-H.: , 2018; Changes in network connectivity during motor imagery and execution; PloS one; 13 (1): e0190715. | |
| dc.relation.references | [Ko et al., 2021] Ko, W.; Jeon, E.; Jeong, S.; Phyo, J. & Suk, H.-I.: , 2021; A survey on deep learning-based short/zero-calibration approaches for eeg-based brain–computer interfaces; Frontiers in Human Neuroscience; 15: 643386. | |
| dc.relation.references | [Koelstra et al., 2011] Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.-S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A. & Patras, I.: , 2011; Deap: A database for emotion analysis; using physiological signals; IEEE transactions on affective computing; 3 (1): 18–31. | |
| dc.relation.references | [Kong, 2025] Kong, X.: , 2025; Deep learning in music generation: A comprehensive investigation of models, challenges and future directions; in ITM Web of Conferences, vol. 70; EDP Sciences; p. 04027. | |
| dc.relation.references | [Korkan et al., 2021] Korkan, N.; Olmez, T. & Dokur, Z.: , 2021; Generating ten bci commands using four simple motor imageries; arXiv preprint arXiv:2105.14493. | |
| dc.relation.references | [Krumnikl & Maiwald, 2024] Krumnikl, M. & Maiwald, V.: , 2024; Facial emotion recognition for mobile devices: A practical review; IEEE Access. | |
| dc.relation.references | [Kulkarni & Patil, 2023] Kulkarni, S. & Patil, P.: , 2023; Eeg-based emotion recognition based on deap dataset with genetic algorithm augmented multi-layer perceptron; in 2023 OITS International Conference on Information Technology (OCIT); IEEE; pp. 687–692. | |
| dc.relation.references | [Kumar & Kumar, 2021] Kumar, A. & Kumar, A.: , 2021; Deepher: Human emotion recognition using an eeg-based deep learning network model; Engineering Proceedings; 10 (1): 32. | |
| dc.relation.references | [Kumar et al., 2019] Kumar, S.; Yadava, M. & Roy, P. P.: , 2019; Fusion of eeg response and sentiment analysis of products review to predict customer satisfaction; Information Fusion; 52: 41–52. | |
| dc.relation.references | [Kumar et al., 2024] Kumar, Y.; Kumar, J. & Sheoran, P.: , 2024; Integration of cloud computing in bci: A review; Biomedical Signal Processing and Control; 87: 105548. | |
| dc.relation.references | [Kutlu et al., 2024] Kutlu, İ. Ç.; Tashan, W.; Shayea, I. & Albatyrova, M.: , 2024; An introductory guide on creating a pandas-based eeg analysis and action prediction tool for bci systems; in 2024 IEEE 13th International Conference on Communication Systems and Network Technologies (CSNT); IEEE; pp. 1372–1378. | |
| dc.relation.references | [Ladda et al., 2021a] Ladda, A. M.; Lebon, F. & Lotze, M.: , 2021a; Using motor imagery practice for improving motor performance – a review; Brain and Cognition; 150; URL https://api.semanticscholar.org/CorpusID:232067009. | |
| dc.relation.references | [Ladda et al., 2021b] Ladda, A. M.; Lebon, F. & Lotze, M.: , 2021b; Using motor imagery practice for improving motor performance–a review; Brain and cognition; 150: 105705. | |
| dc.relation.references | [Lam et al., 2023] Lam, M. W. Y.; Tian, Q.; Li, T.; Yin, Z.; Feng, S.; Tu, M.; Ji, Y.; Xia, R.; Ma, M.; Song, X.; Chen, J.; Yuping, W. & Wang, Y.: , 2023; Efficient neural music generation; in Advances in Neural Information Processing Systems, vol. 36 (edited by Oh, A.; Naumann, T.; Globerson, A.; Saenko, K.; Hardt, M. & Levine, S.); Curran Associates, Inc.; pp. 17450–17463. | |
| dc.relation.references | [Latake, 2024] Latake, S. P.: , 2024; A comprehensive review on automatic music transcription: Survey of transcription techniques; Multimedia Research. | |
| dc.relation.references | [Lawhern et al., 2018] Lawhern, V.; Solon, A.; Waytowich, N.; Gordon, S.; Hung, C. & Lance, B.: , 2018; Eegnet: A compact convolutional neural network for eeg-based brain–computer interfaces; Journal of Neural Engineering; 15 (5): 056013. | |
| dc.relation.references | [Lee et al., 2022] Lee, M.; Yoon, J. & Lee, S.: , 2022; Predicting motor imagery performance from resting-state eeg using dynamic causal modeling; Inter- and Intra-subject Variability in Brain Imaging and Decoding; 14: 321098. | |
| dc.relation.references | [Li et al., 2021] Li, C.; Qin, C. & Fang, J.: , 2021; Motor-imagery classification model for brain-computer interface: a sparse group filter bank representation model; arXiv. | |
| dc.relation.references | [Li et al., 2025] Li, D.; Zang, Y. & Kong, Q.: , 2025; Piano transcription by hierarchical language modeling with pretrained roll-based encoders. | |
| dc.relation.references | [Li & Zhang, 2018] Li, H. & Zhang, T.: , 2018; Multi-kernel stein spatial patterns for eeg motor imagery classification; Pattern Recognition Letters; 112: 75–82. | |
| dc.relation.references | [Li et al., 2024] Li, H.; Zeng, Y.; Bai, Z.; Li, W.; Wu, K. & Zhou, J.: , 2024; Eegfnirs- based music emotion decoding and individualized music generation; en 2024 5th International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI); págs. 394–397. | |
| dc.relation.references | [Li et al., 2020] Li, M.; Wang, R. & Xu, D.: , 2020; An improved composite multiscale fuzzy entropy for feature extraction of mi-eeg; Entropy; 22 (12): 1356. | |
| dc.relation.references | [Li & Deng, 2021] Li, X. & Deng, C.: , 2021; Eeg-based seizure prediction via model uncertainty learning; in Proceedings of the 43rd Annual International Conference of the IEEE Engineering in Medicine Biology Society (EMBC); IEEE; pp. 2976–2979. | |
| dc.relation.references | [Liang, 2023] Liang, J.: , 2023; Harmonizing minds and machines: survey on transformative power of machine learning in music; Frontiers in Neurorobotics; 17. | |
| dc.relation.references | [Liang et al., 2006] Liang, N.-Y.; Saratchandran, P.; Huang, G.-B. & Sundararajan, N.: , 2006; Classification of mental tasks from eeg signals using extreme learning machine; International journal of neural systems; 16 (01): 29–38. | |
| dc.relation.references | [Liang et al., 2022] Liang, S.; Hang, W.; Lei, B.; Wang, J.; Qin, J.; Choi, K.-S. & Zhang, Y.: , 2022; Adaptive multimodel knowledge transfer matrix machine for eeg classification; IEEE Transactions on Neural Networks and Learning Systems. | |
| dc.relation.references | [Lin et al., 2024a] Lin, P.-J.; Li, W.; Zhai, X.; Li, Z.; Sun, J.; Xu, Q.; Pan, Y.; Ji, L. & Li, C.: , 2024a; Explainable deep-learning prediction for brain-computer interfaces supported lower extremity motor gains based on multi-state fusion; IEEE Transactions on Neural Systems and Rehabilitation Engineering. | |
| dc.relation.references | [Lin et al., 2023] Lin, X.; Chen, J.; Ma, W.; Tang, W. & Wang, Y.: , 2023; Eeg emotion recognition using improved graph neural network with channel selection; Computer Methods and Programs in Biomedicine; 231: 107380. | |
| dc.relation.references | [Lin et al., 2024b] Lin, Y.; Dai, Z. & Kong, Q.: , 2024b; Musicscore: A dataset for music score modeling and generation; ArXiv; abs/2406.11462. | |
| dc.relation.references | [Lin et al., 2025] Lin, Y.-X.; Lin, J.-C.; Wei, W.-L. & Wang, J.-C.: , 2025; Learnable counterfactual attention for music classification; IEEE Transactions on Audio, Speech and Language Processing. | |
| dc.relation.references | [Liu et al., 2021] Liu, H.; Zhang, Y.; Li, Y. & Kong, X.: , 2021; Review on emotion recognition based on electroencephalography; Frontiers in Computational Neuroscience; 15: 758212. | |
| dc.relation.references | [Liu et al., 2024] Liu, H.; Wei, P.; Wang, H.; Lv, X.; Duan, W.; Li, M.; Zhao, Y.; Wang, Q.; Chen, X.; Shi, G. et al.: , 2024; An eeg motor imagery dataset for brain computer interface in acute stroke patients; Scientific Data; 11 (1): 131. | |
| dc.relation.references | [Liuzzi et al., 2022] Liuzzi, P.; Grippo, A.; Campagnini, S.; Scarpino, M.; Draghi, F.; Romoli, A.; Hakiki, B.; Sterpu, R.; Maiorelli, A.; Macchi, C. et al.: , 2022; Merging clinical and eeg biomarkers in an elastic-net regression for disorder of consciousness prognosis prediction; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 30: 1504–1513. | |
| dc.relation.references | [Long et al., 2021] Long, S.; Ding, R.; Wang, J.; Yu, Y.; Lu, J. & Yao, D.: , 2021; Sleep quality and electroencephalogram delta power; Frontiers in Neuroscience; 15: 803507. | |
| dc.relation.references | [Lopez & Ramirez, 2021] Lopez, S. & Ramirez, A.: , 2021; Mitigating eeg noise: A review of artifact removal techniques; Biomedical Signal Processing; 58: 102129. | |
| dc.relation.references | [Lopez Duarte, 2024] Lopez Duarte, A. E.: , 2024; A progressive-adaptive music generator (pamg): An approach to interactive procedural music for videogames; in Proceedings of the 12th ACM SIGPLAN International Workshop on Functional Art, Music, Modelling, and Design; FARM 2024; Association for Computing Machinery, New York, NY, USA; pp. 65–72. | |
| dc.relation.references | [Lotte & Bougrain, 2018] Lotte, F. & Bougrain, L.: , 2018; A review of classification algorithms for eeg-based brain–computer interfaces; Journal of Neural Engineering; 15 (3): 031005. | |
| dc.relation.references | [Lotte et al., 2018] Lotte, F.; Bougrain, L.; Cichocki, A.; Clerc, M.; Congedo, M.; Rakotomamonjy, A. & Yger, F.: , 2018; A review of classification algorithms for eeg-based brain–computer interfaces: a 10 year update; Journal of neural engineering; 15 (3): 031005. | |
| dc.relation.references | [Luck, 2018] Luck, S. J.: , 2018; Event-related potentials and cognitive neuroscience; Annual Review of Psychology; 69: 447–479. | |
| dc.relation.references | [Luo et al., 2023] Luo, J.; Wang, Y.; Xia, S.; Lu, N.; Ren, X.; Shi, Z. & Hei, X.: , 2023; A shallow mirror transformer for subject-independent motor imagery bci; Computers in Biology and Medicine; 164: 107254. | |
| dc.relation.references | [Lustenberger et al., 2018] Lustenberger, C.; Patel, Y. A.; Alagapan, S.; Page, J. M.; Price, B.; Boyle, M. R. & Fröhlich, F.: , 2018; High-density eeg characterization of brain responses to auditory rhythmic stimuli during wakefulness and nrem sleep; Neuroimage; 169: 57–68. | |
| dc.relation.references | [Maher et al., 2023] Maher, O. N.; Haikal, A. Y.; Elhosseini, M. A. & Saafan, M.: , 2023; An optimized quadratic support vector machine for eeg based brain computer interface; International journal of electrical and computer engineering systems; 14 (1): 83–91. | |
| dc.relation.references | [Marchesotti et al., 2016] Marchesotti, S.; Bassolino, M.; Serino, A.; Bleuler, H. & Blanke, O.: , 2016; Quantifying the role of motor imagery in brain-machine interfaces; Scientific reports; 6 (1): 24076. | |
| dc.relation.references | [Matsuo et al., 2021] Matsuo, M.; Iso, N.; Fujiwara, K.; Moriuchi, T.; Matsuda, D.; Mitsunaga, W.; Nakashima, A. & Higashi, T.: , 2021; Comparison of cerebral activation between motor execution and motor imagery of self-feeding activity; Neural regeneration research; 16 (4): 778–782. | |
| dc.relation.references | [Miladinović et al., 2020] Miladinović, A.; Ajčević, M.; Jarmolowska, J.; Marusic, U.; Silveri, G.; Battaglini, P. P. & Accardo, A.: , 2020; Performance of eeg motor-imagery based spatial filtering methods: A bci study on stroke patients; Procedia Computer Science; 176: 2840–2848. | |
| dc.relation.references | [Milanés-Hermosilla et al., 2021] Milanés-Hermosilla, D.; Trujillo Codorniú, R.; López-Baracaldo, R.; Sagaró-Zamora, R.; Delisle-Rodriguez, D.; Villarejo-Mayor, J. J. & Núñez-Álvarez, J. R.: , 2021; Monte carlo dropout for uncertainty estimation and motor imagery classification; Sensors; 21 (21): 7241. | |
| dc.relation.references | [Mirowski et al., 2009] Mirowski, P.; Madhavan, D.; LeCun, Y. & Kuzniecky, R.: , 2009; Classification of patterns of eeg synchronization for seizure prediction; Clinical neurophysiology; 120 (11): 1927–1940. | |
| dc.relation.references | [Mohammadi & Mosavi, 2017] Mohammadi, M. & Mosavi, M.: , 2017; Improving the efficiency of an eeg-based brain computer interface using filter bank common spatial pattern: 0878–0882. | |
| dc.relation.references | [Molcho et al., 2024] Molcho, L.; Maimon, N. B.; Zeimer, T.; Chibotero, O.; Rabinowicz, S.; Armoni, V.; On, N. B.; Intrator, N. & Sasson, A.: , 2024; Evaluating cognitive decline detection in aging populations with single-channel eeg features: Insights from studies and meta-analysis. | |
| dc.relation.references | [Molina-Giraldo et al., 2015] Molina-Giraldo, S.; Alfonso-Ospina, L.; Parra-Meza, C.; Lancheros-García, E. A.; Rojas-Arias, J. L. & Acuña-Osorio, E.: , 2015; Prevalencia de malformaciones congénitas diagnosticadas por ultrasonido: tres años de experiencia en una unidad de medicina materno fetal universitaria; Ginecología y obstetricia de México; 83 (11): 680–689. | |
| dc.relation.references | [Mou et al., 2021] Mou, L.; Li, J.; Li, J.; Gao, F.; Jain, R. C. & Yin, B.: , 2021; Memomusic: A personalized music recommendation framework based on emotion and memory; 2021 IEEE 4th International Conference on Multimedia Information Processing and Retrieval (MIPR): 341–347. | |
| dc.relation.references | [Moufassih et al., 2024] Moufassih, M.; Tarahi, O.; Hamou, S.; Agounad, S. & Idrissi Azami, H.: , 2024; Boosting motor imagery brain-computer interface classification using multiband and hybrid feature extraction; Multimedia Tools and Applications; 83 (16): 49441–49472. | |
| dc.relation.references | [Mukhamediev et al., 2022] Mukhamediev, R. I.; Popova, Y.; Kuchin, Y. I.; Zaitseva, E. N.; Kalimoldayev, A.; Symagulov, A.; Levashenko, V. G.; Abdoldina, F.; Gopejenko, V. I.; Yakunin, K.; Muhamedijeva, E. & Yelis, M.: , 2022; Review of artificial intelligence and machine learning technologies: Classification, restrictions, opportunities and challenges; Mathematics. | |
| dc.relation.references | [Murphy, 2022] Murphy, K.: , 2022; Probabilistic Machine Learning: An Introduction; MIT Press, Cambridge, MA, USA. | |
| dc.relation.references | [Musallam et al., 2021] Musallam, Y. K.; AlFassam, N. I.; Muhammad, G.; Amin, S. U.; Alsulaiman, M.; Abdul, W.; Altaheri, H.; Bencherif, M. A. & Algabri, M.: , 2021; Electroencephalography-based motor imagery classification using temporal convolutional network fusion; Biomedical Signal Processing and Control; 69: 102826. | |
| dc.relation.references | [Musha et al., 2022] Musha, A.; Al Mamun, A.; Tahabilder, A.; Hossen, M. J.; Jahan, B. & Ranjbari, S.: , 2022; A deep learning approach for covid-19 and pneumonia detection from chest x-ray images; International Journal of Electrical & Computer Engineering (2088-8708); 12 (4). | |
| dc.relation.references | [Mycka & Mańdziuk, 2024] Mycka, J. & Mańdziuk, J.: , 2024; Artificial intelligence in music: recent trends and challenges; Neural Computing and Applications. | |
| dc.relation.references | [Nguyen & Tsabary, 2024] Nguyen, P. & Tsabary, E.: , 2024; Towards deconstructivist music: Reconstruction paradoxes, neural networks, concatenative synthesis and automated orchestration in the creative process; Organised Sound; 29 (1): 79–90. | |
| dc.relation.references | [Nicolas-Alonso & Gomez-Gil, 2021] Nicolas-Alonso, L. F. & Gomez-Gil, J.: , 2021; Brain computer interfaces, a review; Sensors; 12: 1211–1279. | |
| dc.relation.references | [Noor & Ige, 2024] Noor, M. H. M. & Ige, A. O.: , 2024; A survey on state-of-the-art deep learning applications and challenges. | |
| dc.relation.references | [Pan et al., 2023] Pan, B.; Hirota, K.; Jia, Z. & Dai, Y.: , 2023; A review of multimodal emotion recognition from datasets, preprocessing, features, and fusion methods; Neurocomputing; 561: 126866. | |
| dc.relation.references | [Pan et al., 2024] Pan, H.; Ding, P.; Wang, F.; Li, T.; Zhao, L.; Nan, W.; Fu, Y. & Gong, A.: , 2024; Comprehensive evaluation methods for translating bci into practical applications: usability, user satisfaction and usage of online bci systems; Frontiers in Human Neuroscience; 18: 1429130. | |
| dc.relation.references | [Peksa & Mamchur, 2023] Peksa, J. & Mamchur, D.: , 2023; State-of-the-art on brain-computer interface technology; Sensors; 23 (13): 6001. | |
| dc.relation.references | [Phadikar et al., 2023] Phadikar, S.; Sinha, N. & Ghosh, R.: , 2023; Unsupervised feature extraction with autoencoders for eeg based multiclass motor imagery bci; Expert Systems with Applications; 213: 118901. | |
| dc.relation.references | [Pichiorri et al., 2020] Pichiorri, F.; Morone, G.; Patanè, F.; Toppi, J.; Molinari, M.; Astolfi, L. & Cincotti, F.: , 2020; Brain-computer interface boosts motor imagery practice during stroke recovery; Annals of Neurology; 87 (5): 751–764. | |
| dc.relation.references | [Pirasteh et al., 2024] Pirasteh, A.; Shamseini Ghiyasvand, M. & Pouladian, M.: , 2024; Eeg-based brain-computer interface methods with the aim of rehabilitating advanced stage als patients; Disability and Rehabilitation: Assistive Technology: 1–11. | |
| dc.relation.references | [Pulgarin-Giraldo et al., 2017] Pulgarin-Giraldo, J. D.; Alvarez-Meza, A.; Insuasti-Ceballos, D.; Bouwmans, T. & Castellanos-Dominguez, G.: , 2017; Gmm background modeling using divergence-based weight updating; in Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications: 21st Iberoamerican Congress, CIARP 2016, Lima, Peru, November 8–11, 2016, Proceedings 21; Springer; pp. 282–290. | |
| dc.relation.references | [Qendro et al., 2021] Qendro, L.; van Gerven, M. A. J. & Heskes, T.: , 2021; Early exit for uncertainty estimation in deep learning-based eeg classification; Journal of neural engineering; 18 (4): 0460d1. | |
| dc.relation.references | [Raffel & Ellis, 2014] Raffel, C. & Ellis, D. P.: , 2014; Intuitive analysis, creation and manipulation of midi data with pretty_midi; in 15th international society for music information retrieval conference late breaking and demo papers; pp. 84–93. | |
| dc.relation.references | [Rahimi et al., 2024] Rahimi, N.; Kumar, C.; McLinden, J.; Hosni, S. I.; Borgheai, S. B.; Shahriari, Y. & Shao, M.: , 2024; Topology-aware multimodal fusion for neural dynamics representation learning and classification; IEEE Sensors Journal. | |
| dc.relation.references | [Rahman & Liu, 2023] Rahman, M. & Liu, X.: , 2023; Subject-independent mi classification: Challenges and solutions; Frontiers in Neuroscience; 17: 117134. | |
| dc.relation.references | [Rajalakshmi & Sridhar, 2024] Rajalakshmi, A. & Sridhar, S.: , 2024; Classification of yoga, meditation, combined yoga–meditation eeg signals using l-svm, knn, and mlp classifiers; Soft Computing; 28 (5): 4607–4619. | |
| dc.relation.references | [Rakshasbhuvankar et al., 2020] Rakshasbhuvankar, A. A.; Nagarajan, L.; Zhelev, Z. & Rao, S. C.: , 2020; Amplitude-integrated electroencephalography compared with conventional video-electroencephalography for detection of neonatal seizures; The Cochrane Database of Systematic Reviews; 2020 (3): CD013546. | |
| dc.relation.references | [Ramaswamy et al., 2024] Ramaswamy, M.; Philip, J. L.; Priya, V.; Priyadarshini, S.; Ramasamy, M.; Jeevitha, G.; Mathkor, D. M.; Haque, S.; Dabaghzadeh, F.; Bhattacharya, P. & Ahmad, F.: , 2024; Therapeutic use of music in neurological disorders: A concise narrative review; Heliyon; 10 (16): e35564. | |
| dc.relation.references | [Ramirez-Aristizabal & Kello, 2022] Ramirez-Aristizabal, A. G. & Kello, C.: , 2022; Eeg2mel: Reconstructing sound from brain responses to music. | |
| dc.relation.references | [Ramkumar & Paulraj, 2024] Ramkumar, E. & Paulraj, M.: , 2024; Optimized ffnn with multichannel csp-ica framework of eeg signal for bci; Computer Methods in Biomechanics and Biomedical Engineering: 1–18. | |
| dc.relation.references | [Ramoser et al., 1999] Ramoser, H.; Müller-Gerking, J. & Pfurtscheller, G.: , 1999; Optimal spatial filtering of single trial eeg during imagined hand movement; IEEE Transactions on Rehabilitation Engineering; 8: 441–446. | |
| dc.relation.references | [Ran et al., 2024] Ran, S.; Zhong, W.; Ma, L.; Duan, D.; Ye, L. & Zhang, Q.: , 2024; Mind to music: An eeg signal-driven real-time emotional music generation system; International Journal of Intelligent Systems. | |
| dc.relation.references | [Raudaschl et al., 2017] Raudaschl, P. F.; Zaffino, P.; Sharp, G. C.; Spadea, M. F.; Chen, A.; Dawant, B. M.; Albrecht, T.; Gass, T.; Langguth, C.; Lüthi, M. et al.: , 2017; Evaluation of segmentation methods on head and neck ct: autosegmentation challenge 2015; Medical physics; 44 (5): 2020–2036. | |
| dc.relation.references | [Rodrigues et al., 2019] Rodrigues, P. G.; Filho, C. A. S.; Attux, R.; Castellano, G. & Soriano, D. C.: , 2019; Space-time recurrences for functional connectivity evaluation and feature extraction in motor imagery brain-computer interfaces; Medical & biological engineering & computing; 57: 1709–1725. | |
| dc.relation.references | [Roy et al., 2019a] Roy, Y.; Banville, H.; Albuquerque, I.; Gramfort, A.; Falk, T. H. & Faubert, J.: , 2019a; Chrononet: A deep recurrent convolutional network for modeling multimodal physiological time series; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 27 (10): 2185–2199. | |
| dc.relation.references | [Roy et al., 2019b] Roy, Y.; Banville, H.; Albuquerque, I.; Gramfort, A.; Falk, T. H. & Faubert, J.: , 2019b; Deep learning-based electroencephalography analysis: a systematic review; Journal of neural engineering; 16 (5): 051001. | |
| dc.relation.references | [Russell, 1980] Russell, J. A.: , 1980; A circumplex model of affect.; Journal of personality and social psychology; 39 (6): 1161. | |
| dc.relation.references | [Rutkowski et al., 2024] Rutkowski, T. M.; Komendziński, T. & Otake-Matsuura, M.: , 2024; Mild cognitive impairment prediction and cognitive score regression in the elderly using eeg topological data analysis and machine learning with awareness assessed in affective reminiscent paradigm; Frontiers in Aging Neuroscience; 15: 1294139. | |
| dc.relation.references | [Saha et al., 2021] Saha, P. K.; Rahman, M. A.; Alam, M. K.; Ferdowsi, A. & Mollah, M. N.: , 2021; Common spatial pattern in frequency domain for feature extraction and classification of multichannel eeg signals; SN Computer Science; 2: 1–11. | |
| dc.relation.references | [Saha & Baumert, 2020] Saha, S. & Baumert, M.: , 2020; Intra- and inter-subject variability in eeg-based sensorimotor brain computer interface: a review; Frontiers in computational neuroscience; 13: 87. | |
| dc.relation.references | [Saibene et al., 2024] Saibene, A.; Ghaemi, H. & Dagdevir, E.: , 2024; Deep learning in motor imagery eeg signal decoding: A systematic review; Neurocomputing; 610: 128577. | |
| dc.relation.references | [Sairamya et al., 2021] Sairamya, N.; Subathra, M.; Suviseshamuthu, E. S. & George, S. T.: , 2021; A new approach for automatic detection of focal eeg signals using wavelet packet decomposition and quad binary pattern method; Biomedical Signal Processing and Control; 63: 102096. | |
| dc.relation.references | [Sakkalis, 2011] Sakkalis, V.: , 2011; Review of advanced techniques for the estimation of brain connectivity measured with eeg/meg; Computers in biology and medicine; 41 (12): 1110–1117. | |
| dc.relation.references | [Sannelli et al., 2016] Sannelli, C.; Vidaurre, C.; Müller, K.-R. & Blankertz, B.: , 2016; Ensembles of adaptive spatial filters increase bci performance: an online evaluation; Journal of neural engineering; 13 (4): 046003. | |
| dc.relation.references | [Sareen et al., 2020] Sareen, E.; Singh, L.; Gupta, A.; Verma, R.; Achary, G. K. & Varkey, B.: , 2020; Functional brain connectivity analysis in intellectual developmental disorder during music perception; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 28 (11): 2420–2430. | |
| dc.relation.references | [Sarker et al., 2024] Sarker, S.; Sarker, P.; Stone, G.; Gorman, R.; Tavakkoli, A.; Bebis, G. & Sattarvand, J.: , 2024; A comprehensive overview of deep learning techniques for 3d point cloud classification and semantic segmentation; Mach. Vis. Appl.; 35: 67. | |
| dc.relation.references | [Schirrmeister et al., 2017] Schirrmeister, R. T.; Springenberg, J. T.; Fiederer, L. D. J.; Glasstetter, M.; Eggensperger, K.; Tangermann, M.; Hutter, F.; Burgard, W. & Ball, T.: , 2017; Deep learning with convolutional neural networks for eeg decoding and visualization; Human brain mapping; 38 (11): 5391–5420. | |
| dc.relation.references | [Schmidt & Lee, 2023] Schmidt, R. & Lee, J.: , 2023; Multimodal fusion for bci: Eeg and physiological signals combined; Frontiers in Human Neuroscience; 17: 115683. | |
| dc.relation.references | [Shao & Qin, 2024] Shao, J. & Qin, P.: , 2024; Brain-computer interface: The design of self-healing music for emotion management; in 2024 9th International Conference on Intelligent Computing and Signal Processing (ICSP); pp. 1060–1064. | |
| dc.relation.references | [Singh et al., 2021a] Singh, A.; Hussain, A. A.; Lal, S. & Guesgen, H. W.: , 2021a; A comprehensive review on critical issues and possible solutions of motor imagery based electroencephalography brain-computer interface; Sensors; 21 (6): 2173. | |
| dc.relation.references | [Singh et al., 2021b] Singh, A.; Hussain, A. A.; Lal, S. & Guesgen, H. W.: , 2021b; A comprehensive review on critical issues and possible solutions of motor imagery based electroencephalography brain-computer interface; Sensors; 21 (6): 2173. | |
| dc.relation.references | [Sinisalo et al., 2024] Sinisalo, H.; Rissanen, I.; Kahilakoski, O.-P.; Souza, V. H.; Tommila, T.; Laine, M.; Nyrhinen, M.; Ukharova, E.; Granö, I.; Soto, A. M. et al.: , 2024; Modulating brain networks in space and time: Multi-locus transcranial magnetic stimulation; Clinical Neurophysiology; 158: 218–224. | |
| dc.relation.references | [Škola et al., 2019] Škola, F.; Tinková, S. & Liarokapis, F.: , 2019; Progressive training for motor imagery brain-computer interfaces using gamification and virtual reality embodiment; Frontiers in human neuroscience; 13: 329. | |
| dc.relation.references | [Smith & Patel, 2020] Smith, R. & Patel, N.: , 2020; Understanding volume conduction effects in eeg signal processing; Journal of Neuroscience Methods; 343: 108859. | |
| dc.relation.references | [Sporns, 2018] Sporns, O.: , 2018; Graph theory methods: applications in brain networks; Dialogues in clinical neuroscience; 20 (2): 111–121. | |
| dc.relation.references | [Sturm et al., 2016] Sturm, I.; Lapuschkin, S.; Samek, W. & Müller, K.-R.: , 2016; Interpretable deep neural networks for single-trial eeg classification; Journal of neuroscience methods; 274: 141–145. | |
| dc.relation.references | [Subramani et al., 2024] Subramani, K.; Smaragdis, P.; Higuchi, T. & Souden, M.: , 2024; Rethinking non-negative matrix factorization with implicit neural representations; arXiv preprint arXiv:2404.04439. | |
| dc.relation.references | [Sun et al., 2019] Sun, S.; Li, X.; Zhu, J.; Wang, Y.; La, R.; Zhang, X.; Wei, L. & Hu, B.: , 2019; Graph theory analysis of functional connectivity in major depression disorder with high-density resting state eeg data; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 27 (3): 429–439. | |
| dc.relation.references | [Sun et al., 2024] Sun, Y.; Kuo, M.; Wang, X.; Li, W. & Bai, Q.: , 2024; Emotion-conditioned musiclm: Enhancing emotional resonance in music generation; in 2024 IEEE Congress on Evolutionary Computation (CEC); pp. 1–8. | |
| dc.relation.references | [Swaminathan, 2024] Swaminathan, A.: , 2024; Current techniques and engineering opportunities for advancement and improvement in electroencephalographic acquisition and analyses; Journal of Experimental Neurology; 5 (4): 192–209. | |
| dc.relation.references | [Tabar & Halici, 2016] Tabar, Y. R. & Halici, U.: , 2016; A novel deep learning approach for classification of eeg motor imagery signals; Journal of neural engineering; 14 (1): 016003. | |
| dc.relation.references | [Tangermann et al., 2012] Tangermann, M.; Müller, K.-R.; Aertsen, A.; Birbaumer, N.; Braun, C.; Brunner, C.; Leeb, R.; Mehring, C.; Miller, K. J.; Müller-Putz, G. R. et al.: , 2012; Review of the bci competition iv; Frontiers in neuroscience; 6: 55. | |
| dc.relation.references | [Thanigaivelu et al., 2023] Thanigaivelu, P.; Sridhar, S. & Sulthana, S. F.: , 2023; Oisvm: Optimal incremental support vector machine-based eeg classification for brain-computer interface model; Cognitive Computation; 15 (3): 888–903. | |
| dc.relation.references | [Thundiyil et al., 2023] Thundiyil, S.; Shalamzari, S.; Picone, J. & McKenzie, S.: , 2023; Transformers for modeling long-term dependencies in time series data: A review; 2023 IEEE Signal Processing in Medicine and Biology Symposium (SPMB): 1–5. | |
| dc.relation.references | [Tibshirani, 1996] Tibshirani, R.: , 1996; Regression Shrinkage and Selection Via the Lasso; Journal of the Royal Statistical Society: Series B (Methodological); 58 (1): 267–288. | |
| dc.relation.references | [Tjoa & Guan, 2020] Tjoa, E. & Guan, C.: , 2020; A survey on explainable artificial intelligence (xai): Toward medical xai; IEEE transactions on neural networks and learning systems; 32 (11): 4793–4813. | |
| dc.relation.references | [Tobón-Henao et al., 2023] Tobón-Henao, M.; Álvarez Meza, A. & Castellanos-Dominguez, C.: , 2023; Kernel-based regularized eegnet using centered alignment and gaussian connectivity for motor imagery discrimination; Computers; 12: 145. | |
| dc.relation.references | [Úbeda et al., 2017] Úbeda, A.; Azorín, J. M.; Chavarriaga, R. & R. Millán, J. d.: , 2017; Classification of upper limb center-out reaching tasks by means of eeg-based continuous decoding techniques; Journal of neuroengineering and rehabilitation; 14: 1–14. | |
| dc.relation.references | [UNESCO, 2021] UNESCO: , 2021; Engineering for sustainable development report; UNESCO Reports. | |
| dc.relation.references | [Usman et al., 2017] Usman, S. M.; Usman, M. & Fong, S.: , 2017; Epileptic seizures prediction using machine learning methods; Computational and mathematical methods in medicine; 2017 (1): 9074759. | |
| dc.relation.references | [Vanutelli et al., 2023] Vanutelli, M. E.; Salvadore, M. & Lucchiari, C.: , 2023; Bci applications to creativity: Review and future directions, from little-c to c2; Brain Sciences; 13 (4): 665. | |
| dc.relation.references | [Värbu et al., 2022] Värbu, K.; Muhammad, N. & Muhammad, Y.: , 2022; Past, present, and future of eeg-based bci applications; Sensors; 22 (9): 3331. | |
| dc.relation.references | [Vasilyev et al., 2021] Vasilyev, A. N.; Nuzhdin, Y. O. & Kaplan, A. Y.: , 2021; Does real-time feedback affect sensorimotor eeg patterns in routine motor imagery practice?; Brain Sciences; 11 (9): 1234. | |
| dc.relation.references | [Vecchio et al., 2016] Vecchio, F.; Miraglia, F.; Quaranta, D.; Granata, G.; Romanello, R.; Marra, C.; Bramanti, P. & Rossini, P. M.: , 2016; Cortical connectivity and memory performance in cognitive decline: A study via graph theory from eeg data; Neuroscience; 316: 143–150. | |
| dc.relation.references | [Velasquez-Martinez et al., 2020a] Velasquez-Martinez, L.; Caicedo-Acosta, J.; Acosta-Medina, C.; Alvarez-Meza, A. & Castellanos-Dominguez, G.: , 2020a; Regression networks for neurophysiological indicator evaluation in practicing motor imagery tasks; Brain Sciences; 10 (10): 707. | |
| dc.relation.references | [Velasquez-Martinez et al., 2020b] Velasquez-Martinez, L.; Caicedo-Acosta, J. & Castellanos-Dominguez, G.: , 2020b; Entropy-based estimation of event-related de/synchronization in motor imagery using vector-quantized patterns; Entropy; 22 (6): 703. | |
| dc.relation.references | [Velasquez-Martinez et al., 2020c] Velasquez-Martinez, M. C.; Santos-Vera, B.; Velez- Hernandez, M. E.; Vazquez-Torres, R. & Jimenez-Rivera, C. A.: , 2020c; Alpha-1 adrenergic receptors modulate glutamate and gaba neurotransmission onto ventral tegmental dopamine neurons during cocaine sensitization; International Journal of Molecular Sciences; 21 (3): 790. | |
| dc.relation.references | [Vempati & Sharma, 2023a] Vempati, R. & Sharma, L.: , 2023a; Eeg rhythm based emotion recognition using multivariate decomposition and ensemble machine learning classifier; Journal of Neuroscience Methods; 393: 109879. | |
| dc.relation.references | [Vempati & Sharma, 2023b] Vempati, R. & Sharma, L. D.: , 2023b; Eeg rhythm based emotion recognition using multivariate decomposition and ensemble machine learning classifier; Journal of Neuroscience Methods; 393: 109879. | |
| dc.relation.references | [Verboom, 2023] Verboom, M.: , 2023; Electroencephalography monitoring in the critically ill. | |
| dc.relation.references | [Vieluf et al., 2023] Vieluf, S.; Hasija, T.; Kuschel, M.; Reinsberger, C. & Loddenkemper, T.: , 2023; Developing a deep canonical correlation-based technique for seizure prediction; Expert Systems with Applications; 234: 120986. | |
| dc.relation.references | [Wagh & Vasanth, 2022] Wagh, K. P. & Vasanth, K.: , 2022; Performance evaluation of multi-channel electroencephalogram signal (eeg) based time frequency analysis for human emotion recognition; Biomedical Signal Processing and Control; 78: 103966. | |
| dc.relation.references | [Walder, 2016] Walder, C.: , 2016; Modelling symbolic music: Beyond the piano roll; in Asian conference on machine learning; PMLR; pp. 174–189. | |
| dc.relation.references | [Wang & Hu, 2023] Wang, C. & Hu, Y.: , 2023; Transformer-based architectures for eeg signal classification; IEEE Access; 11: 92345–92359. | |
| dc.relation.references | [Wang et al., 2024] Wang, L.; Zhao, Z.; Liu, H.; Pang, J.; Qin, Y. & Wu, Q.: , 2024; A review of intelligent music generation systems; Neural Computing and Applications; 36 (12): 6381–6401. | |
| dc.relation.references | [Wang et al., 2018] Wang, P.; Jiang, A.; Liu, X.; Shang, J. & Zhang, L.: , 2018; Lstm-based eeg classification in motor imagery tasks; IEEE transactions on neural systems and rehabilitation engineering; 26 (11): 2086–2095. | |
| dc.relation.references | [Wang et al., 2022a] Wang, S.; Du, Y.; Guo, X.; Pan, B. & Zhao, L.: , 2022a; Controllable data generation by deep learning: A review; ACM Computing Surveys; 56: 1–38. | |
| dc.relation.references | [Wang et al., 2022b] Wang, W.; Liang, J.; Liu, R.; Song, Y. & Zhang, M.: , 2022b; A robust variable selection method for sparse online regression via the elastic net penalty; Mathematics; 10 (16): 2985. | |
| dc.relation.references | [Warrens, 2015] Warrens, M. J.: , 2015; Five ways to look at cohen’s kappa; Journal of Psychology & Psychotherapy; 5 (4): 1. | |
| dc.relation.references | [Wen & Ting, 2023] Wen, Y.-W. & Ting, C.-K.: , 2023; Recent advances of computational intelligence techniques for composing music; IEEE Transactions on Emerging Topics in Computational Intelligence; 7 (2): 578–597. | |
| dc.relation.references | [Williams et al., 2015] Williams, D.; Kirke, A.; Miranda, E. R.; Roesch, E.; Daly, I. & Nasuto, S.: , 2015; Investigating affect in algorithmic composition systems; Psychology of Music; 43 (6): 831–854. | |
| dc.relation.references | [Wu et al., 2019] Wu, H.; Niu, Y.; Li, F.; Li, Y.; Fu, B.; Shi, G. & Dong, M.: , 2019; A parallel multiscale filter bank convolutional neural networks for motor imagery eeg classification; Frontiers in neuroscience; 13: 1275. | |
| dc.relation.references | [Xiao et al., 2018] Xiao, R.; Shida-Tokeshi, J.; Vanderbilt, D. L. & Smith, B. A.: , 2018; Electroencephalography power and coherence changes with age and motor skill development across the first half year of life; PloS one; 13 (1): e0190276. | |
| dc.relation.references | [Xie et al., 2022] Xie, J.; Zhang, J.; Sun, J.; Ma, Z.; Qin, L.; Li, G.; Zhou, H. & Zhan, Y.: , 2022; A transformer-based approach combining deep learning network and spatial-temporal information for raw eeg classification; IEEE Transactions on Neural Systems and Rehabilitation Engineering; 30: 2126–2136. | |
| dc.relation.references | [Xu et al., 2022] Xu, S.; Zhu, L.; Kong, W.; Peng, Y.; Hu, H. & Cao, J.: , 2022; A novel classification method for eeg-based motor imagery with narrow band spatial filters and deep convolutional neural network; Cognitive Neurodynamics: 1–11. | |
| dc.relation.references | [Yan et al., 2024] Yan, Z. N.; Liu, P. R.; Zhou, H.; Zhang, J. Y.; Liu, S. X.; Xie, Y. & Ye, Z. W.: , 2024; Brain-computer interaction in the smart era; Current Medical Science: 1–9. | |
| dc.relation.references | [Yang et al., 2021] Yang, H.; Hu, Z.; Imai, F.; Yang, Y. & Ogawa, K.: , 2021; Effects of neurofeedback on the activities of motor-related areas by using motor execution and imagery; Neuroscience letters; 746: 135653. | |
| dc.relation.references | [Yang & Lerch, 2020] Yang, L.-C. & Lerch, A.: , 2020; On the evaluation of generative models in music; Neural Computing and Applications; 32 (9): 4773–4784. | |
| dc.relation.references | [Yang et al., 2018] Yang, Y.; Wu, Q.; Fu, Y. & Chen, X.: , 2018; Continuous convolutional neural network with 3d input for eeg-based emotion recognition; in Neural Information Processing: 25th International Conference, ICONIP 2018, Siem Reap, Cambodia, December 13–16, 2018, Proceedings, Part VII 25; Springer; pp. 433–443. | |
| dc.relation.references | [Ye, 2024] Ye, H.: , 2024; Research on the application of intelligent algorithms in the automation of music generation and composition; in 2024 International Conference on Computers, Information Processing and Advanced Education (CIPAE); pp. 668–673. | |
| dc.relation.references | [Yin et al., 2024] Yin, K.; Lim, E. Y. & Lee, S.-W.: , 2024; Gitgan: Generative intersubject transfer for eeg motor imagery analysis; Pattern Recognition; 146: 110015. | |
| dc.relation.references | [Zander et al., 2011] Zander, T. O.; Lehne, M.; Ihme, K.; Jatzev, S.; Correia, J.; Kothe, C.; Picht, B. & Nijboer, F.: , 2011; A dry eeg-system for scientific research and brain–computer interfaces; Frontiers in neuroscience; 5: 53. | |
| dc.relation.references | [Zapała et al., 2020] Zapała, D.; Iwanowicz, P.; Francuz, P. & Augustynowicz, P.: , 2020; Handedness effects on movement imagery during kinesthetic and visual-motor conditions. an eeg study. | |
| dc.relation.references | [Zhang et al., 2021a] Zhang, A.; Lipton, Z. C.; Li, M. & Smola, A. J.: , 2021a; Dive into deep learning; arXiv preprint arXiv:2106.11342. | |
| dc.relation.references | [Zhang et al., 2021b] Zhang, C.; Kim, Y.-K. & Eskandarian, A.: , 2021b; Eeg-inception: an accurate and robust end-to-end neural network for eeg-based motor imagery classification; Journal of Neural Engineering; 18 (4): 046014. | |
| dc.relation.references | [Zhang et al., 2016] Zhang, J.; Chen, M.; Zhao, S.; Hu, S.; Shi, Z. & Cao, Y.: , 2016; Relieff-based eeg sensor selection methods for emotion recognition; Sensors; 16 (10): 1558. | |
| dc.relation.references | [Zhang et al., 2013] Zhang, R.; Xu, P.; Guo, L. & Yao, D.: , 2013; Prediction of ssvep-based bci performance by the resting-state eeg network; Journal of Neural Engineering; 10 (6): 066017. | |
| dc.relation.references | [Zhang & Wang, 2020] Zhang, Y. & Wang, P.: , 2020; Sparse common spatial patterns for robust eeg signal classification; Neurocomputing; 403: 253–261. | |
| dc.relation.references | [Zhang et al., 2019] Zhang, Y.; Guo, D.; Li, Y.; Zhang, Y.; Li, P. & Zhang, Y.: , 2019; A novel hybrid deep learning scheme for four-class motor imagery classification; Journal of Neural Engineering; 16 (6): 066004. | |
| dc.relation.references | [Zhou et al., 2023] Zhou, A.; Zhang, L.; Yuan, X. & Li, C.: , 2023; A signal prediction-based method for motor imagery eeg classification; Biomedical Signal Processing and Control; 86: 105139. | |
| dc.relation.references | [Zhou, 2024] Zhou, Y.: , 2024; Music generation based on bidirectional gru model; Highlights in Science, Engineering and Technology. | |
| dc.relation.references | [Zhu et al., 2024] Zhu, H. Y.; Hieu, N. Q.; Hoang, D. T.; Nguyen, D. N. & Lin, C.-T.: , 2024; A human-centric metaverse enabled by brain-computer interface: A survey; IEEE Communications Surveys & Tutorials. | |
| dc.relation.references | [Zolfaghari et al., 2024] Zolfaghari, S.; Yousefi Rezaii, T. & Meshgini, S.: , 2024; Applying common spatial pattern and convolutional neural network to classify movements via eeg signals; Clinical EEG and Neuroscience: 15500594241234836. | |
| dc.relation.references | [Zou et al., 2011] Zou, D.; Gao, K. & Xia, J.: , 2011; Dark respiration in the light and in darkness of three marine macroalgal species grown under ambient and elevated co2 concentrations; Acta Oceanologica Sinica; 30: 106–112. | |
| dc.relation.references | [Álvarez Meza et al., 2023] Álvarez Meza, A. M.; Torres-Cardona, H. F.; Orozco-Alzate, M.; Pérez-Nastar, H. D. & Castellanos-Dominguez, G.: , 2023; Affective neural responses sonified through labeled correlation alignment; Sensors; 23 (12): 5574. | |
| dc.rights.accessrights | info:eu-repo/semantics/openAccess | |
| dc.rights.license | Atribución-NoComercial 4.0 Internacional | |
| dc.rights.uri | http://creativecommons.org/licenses/by-nc/4.0/ | |
| dc.subject.ddc | 000 - Ciencias de la computación, información y obras generales::006 - Métodos especiales de computación | |
| dc.subject.proposal | Interfaz cerebro-computadora (BCI) | spa |
| dc.subject.proposal | Aprendizaje profundo | spa |
| dc.subject.proposal | Imágenes motoras (IM) | spa |
| dc.subject.proposal | Electroencefalografía (EEG) | spa |
| dc.subject.proposal | Análisis de regularización | spa |
| dc.subject.proposal | Análisis de regresión múltiple | spa |
| dc.subject.proposal | Brain-computer interface | eng |
| dc.subject.proposal | Deep learning | eng |
| dc.subject.proposal | Motor imagery | eng |
| dc.subject.proposal | Electroencephalography | eng |
| dc.subject.proposal | Regularization analysis | eng |
| dc.subject.proposal | Multiple regression analysis | eng |
| dc.subject.unesco | Inteligencia artificial | |
| dc.subject.unesco | Artificial intelligence | |
| dc.subject.unesco | Aprendizaje | |
| dc.subject.unesco | Learning | |
| dc.title | Enhanced interpretability using regression networks for assessing domain dependences in motor imagery | eng |
| dc.title.translated | Aprendizaje automático y conectividad funcional para la interpretación de señales EEG en sistemas BCI | spa |
| dc.type | Trabajo de grado - Doctorado | |
| dc.type.coar | http://purl.org/coar/resource_type/c_db06 | |
| dc.type.coarversion | http://purl.org/coar/version/c_ab4af688f83e57aa | |
| dc.type.content | Text | |
| dc.type.content | Dataset | |
| dc.type.driver | info:eu-repo/semantics/doctoralThesis | |
| dc.type.version | info:eu-repo/semantics/acceptedVersion | |
| dcterms.audience.professionaldevelopment | Bibliotecarios | |
| dcterms.audience.professionaldevelopment | Estudiantes | |
| dcterms.audience.professionaldevelopment | Investigadores | |
| dcterms.audience.professionaldevelopment | Maestros | |
| dcterms.audience.professionaldevelopment | Público general | |
| oaire.accessrights | http://purl.org/coar/access_right/c_abf2 |
Files

Original bundle
- Name: Tesis de Doctorado en Ingeniería - Automática.pdf
- Size: 9.15 MB
- Format: Adobe Portable Document Format
- Description: Tesis de Doctorado en Ingeniería - Automática

License bundle
- Name: license.txt
- Size: 5.74 KB
- Format: Item-specific license agreed upon to submission