Software de comunicación ocular basado en Vocal Eyes para pacientes con esclerosis lateral amiotrófica

dc.contributor.advisor: Niño Vásquez, Luis Fernando
dc.contributor.author: Tovar Díaz, Dorian Abad
dc.contributor.researchgroup: Laboratorio de Investigación en Sistemas Inteligentes Lisi
dc.date.accessioned: 2023-10-10T22:15:02Z
dc.date.available: 2023-10-10T22:15:02Z
dc.date.issued: 2023
dc.description.abstract (spa): Los pacientes con esclerosis lateral amiotrófica (ELA) se enfrentan a problemas de comunicación debido a la pérdida de las capacidades del habla y la escritura. Los sistemas de comunicación ocular han surgido como una posible solución, pero su uso presenta retos y limitaciones para el usuario, como el uso de dispositivos de captura muy complejos y costosos. El objetivo de este trabajo es desarrollar un prototipo de software basado en técnicas de visión por computador para mejorar la comunicación de los pacientes con ELA mediante el seguimiento y la clasificación de sus movimientos oculares. Se utiliza la videooculografía para capturar las características oculares, mientras que para la clasificación de los movimientos se seleccionó el modelo de red neuronal convolucional Inception V3. Este modelo se entrenó con un conjunto de imágenes sintéticas generadas con la herramienta UnityEyes. El sistema de comunicación Vocal Eyes se utiliza para traducir los movimientos oculares en el mensaje del paciente. El prototipo logra una precisión del 99 % en la transmisión de cada mensaje, con una tasa de acierto del 99.3 % en los movimientos realizados. Sin embargo, se observan dificultades en la clasificación de los movimientos oculares de la mirada inferior. Este resultado representa un avance significativo en la mejora de la comunicación ocular para pacientes con ELA, respalda la viabilidad de la comunicación ocular de bajo costo y ofrece oportunidades para futuras investigaciones y mejoras en el sistema. (Texto tomado de la fuente)
dc.description.abstract (eng): Patients with amyotrophic lateral sclerosis (ALS) face communication challenges due to the loss of speech and writing ability. Eye communication systems have emerged as a potential solution, but their use still presents challenges and limitations, such as the use of highly complex and costly capture devices. The aim of this work is to develop a software prototype based on computer vision techniques to improve the communication of ALS patients by monitoring and classifying their eye movements. Video-oculography is used to capture ocular features, while the convolutional neural network model Inception V3 was selected for movement classification. This model was trained with a set of synthetic images generated by the UnityEyes tool. The Vocal Eyes communication system is used to translate the eye movements into the patient's message. The prototype achieves 99 % accuracy in the transmission of each message, with a 99.3 % success rate in the movements made. However, difficulties are observed in the classification of lower gaze eye movements. This result represents significant progress in improving eye communication for ALS patients, supports the feasibility of low-cost eye communication, and provides opportunities for further research and system improvements.
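
The abstracts describe a three-stage pipeline: video-oculography captures the eye region, an Inception V3 convolutional network trained on UnityEyes synthetic images classifies the eye movement, and the Vocal Eyes scheme turns classified movements into a message. Below is a minimal sketch of the classification stage, assuming a TensorFlow/Keras implementation with ImageNet transfer learning; the thesis code itself is not part of this record, and the class count, names, and preprocessing are illustrative assumptions only.

    # Sketch: Inception V3 with ImageNet weights as a frozen backbone and a
    # new softmax head for gaze-direction classes (transfer learning).
    # NUM_CLASSES and the preprocessing are assumptions, not the thesis's
    # exact configuration.
    import numpy as np
    import tensorflow as tf

    NUM_CLASSES = 9  # hypothetical: eight gaze directions plus "center"

    def build_gaze_classifier(num_classes: int = NUM_CLASSES) -> tf.keras.Model:
        base = tf.keras.applications.InceptionV3(
            include_top=False, weights="imagenet", input_shape=(299, 299, 3))
        base.trainable = False  # train only the new head first
        x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
        x = tf.keras.layers.Dropout(0.2)(x)
        out = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
        model = tf.keras.Model(base.input, out)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def predict_direction(model: tf.keras.Model, eye_image: np.ndarray) -> int:
        # eye_image: an H x W x 3 uint8 crop of the eye region (e.g. taken
        # from face/iris landmarks such as those produced by MediaPipe).
        img = tf.image.resize(tf.cast(eye_image, tf.float32), (299, 299))
        img = tf.keras.applications.inception_v3.preprocess_input(img)
        probs = model(img[tf.newaxis, ...], training=False)
        return int(tf.argmax(probs, axis=-1)[0])

Training would then fit the new head on direction-labelled UnityEyes renders before any fine-tuning of the backbone.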
dc.description.abstract (spa): ilustraciones, diagramas, fotografías
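
For the message-composition stage, the Vocal Eyes Becker system cited under dc.relation.references maps sequences of eye movements to letters. The following sketch, in the same assumed Python setting, shows a two-gesture group/letter decoder of the kind used by gaze spellers; the letter grouping and gesture order here are invented for illustration and do not reproduce the actual Vocal Eyes board.

    # Hypothetical two-gesture decoder: the first gaze direction picks a
    # letter group, the second picks the position inside that group. The
    # grouping below is invented for illustration only.
    DIRECTIONS = ["up", "up-right", "right", "down-right",
                  "down", "down-left", "left", "up-left"]
    GROUPS = {"up": "ABCDEFG", "right": "HIJKLMN",
              "down": "OPQRSTU", "left": "VWXYZ"}

    def decode_message(moves: list[str]) -> str:
        """Turn an even-length sequence of classified gaze directions
        into text, two moves per letter."""
        letters = []
        for group_move, index_move in zip(moves[0::2], moves[1::2]):
            group = GROUPS.get(group_move, "")
            idx = DIRECTIONS.index(index_move)
            if idx < len(group):
                letters.append(group[idx])
        return "".join(letters)

    # Example: "up" then "right" selects the 3rd letter of group "up" -> "C".
    assert decode_message(["up", "right"]) == "C"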
dc.description.degreelevel: Maestría
dc.description.degreename: Magíster en Ingeniería de Sistemas y Computación
dc.description.researcharea: Sistemas Inteligentes
dc.format.extent: xv, 63 páginas
dc.format.mimetype: application/pdf
dc.identifier.instname: Universidad Nacional de Colombia
dc.identifier.reponame: Repositorio Institucional Universidad Nacional de Colombia
dc.identifier.repourl: https://repositorio.unal.edu.co/
dc.identifier.uri: https://repositorio.unal.edu.co/handle/unal/84794
dc.language.iso: spa
dc.publisher: Universidad Nacional de Colombia
dc.publisher.branch: Universidad Nacional de Colombia - Sede Bogotá
dc.publisher.faculty: Facultad de Ingeniería
dc.publisher.place: Bogotá, Colombia
dc.publisher.program: Bogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación
dc.relation.references: D. Purves, G. J. Augustine, D. Fitzpatrick, L. C. Katz, A.-S. LaMantia, J. O. McNamara, and S. M. Williams, Neuroscience, 2nd ed. Sunderland: Sinauer Associates, 2001. [Online]. Available: https://www.ncbi.nlm.nih.gov/books/NBK10799/
dc.relation.references: S. Haykin, Neural Networks and Learning Machines, 3rd ed. New York: Pearson Prentice Hall, 2009.
dc.relation.references: A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet classification with deep convolutional neural networks,” Communications of the ACM, vol. 60, no. 6, pp. 84–90, 5 2017. [Online]. Available: https://dl.acm.org/doi/10.1145/3065386
dc.relation.references: C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, “Rethinking the Inception Architecture for Computer Vision,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 2016-December, pp. 2818–2826, 12 2015. [Online]. Available: https://arxiv.org/abs/1512.00567v3
dc.relation.references: K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 2016-December, pp. 770–778, 12 2016.
dc.relation.references: E. Wood, T. Baltrušaitis, L. P. Morency, P. Robinson, and A. Bulling, “Learning an appearance-based gaze estimator from one million synthesised images,” Eye Tracking Research and Applications Symposium (ETRA), vol. 14, pp. 131–138, 3 2016. [Online]. Available: https://dl.acm.org/doi/10.1145/2857491.2857492
dc.relation.references: J. Becker and G. Becker, “Vocal Eyes Becker Communication System,” 5 2017. [Online]. Available: https://patient-innovation.com/post/1705
dc.relation.references: I. Grishchenko, A. Ablavatski, Y. Kartynnik, K. Raveendran, and M. Grundmann, “Attention Mesh: High-fidelity Face Mesh Prediction in Real-time,” CVPR Workshop on Computer Vision for Augmented and Virtual Reality, 2020. [Online]. Available: https://arxiv.org/abs/2006.10962
dc.relation.references: M. C. Kiernan, S. Vucic, B. C. Cheah, M. R. Turner, A. Eisen, O. Hardiman, J. R. Burrell, and M. C. Zoing, “Amyotrophic lateral sclerosis,” The Lancet, vol. 377, no. 9769, pp. 942–955, 3 2011.
dc.relation.references: G. Bauer, F. Gerstenbrand, and E. Rumpl, “Varieties of the locked-in syndrome,” Journal of Neurology, vol. 221, no. 2, pp. 77–91, 8 1979. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/BF00313105
dc.relation.references: R. Pugliese, R. Sala, S. Regondi, B. Beltrami, and C. Lunetta, “Emerging technologies for management of patients with amyotrophic lateral sclerosis: from telehealth to assistive robotics and neural interfaces,” Journal of Neurology, vol. 269, no. 6, pp. 2910–2921, 6 2022. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s00415-022-10971-w
dc.relation.references: A. Londral, A. Pinto, S. Pinto, L. Azevedo, and M. De Carvalho, “Quality of life in amyotrophic lateral sclerosis patients and caregivers: Impact of assistive communication from early stages,” Muscle & Nerve, vol. 52, no. 6, pp. 933–941, 12 2015. [Online]. Available: https://onlinelibrary-wiley-com.ezproxy.unal.edu.co/doi/10.1002/mus.24659
dc.relation.references: Z. Hossain, M. M. H. Shuvo, and P. Sarker, “Hardware and software implementation of real time electrooculogram (EOG) acquisition system to control computer cursor with eyeball movement,” 4th International Conference on Advances in Electrical Engineering, ICAEE 2017, vol. 2018-January, pp. 132–137, 7 2017.
dc.relation.references: C. Zhang, R. Yao, and J. Cai, “Efficient eye typing with 9-direction gaze estimation,” Multimedia Tools and Applications, vol. 77, no. 15, pp. 19679–19696, 8 2018. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s11042-017-5426-y
dc.relation.references: Z. Al-Kassim and Q. A. Memon, “Designing a low-cost eyeball tracking keyboard for paralyzed people,” Computers & Electrical Engineering, vol. 58, pp. 20–29, 2 2017.
dc.relation.references: T. L. A. Valente, J. D. S. de Almeida, A. C. Silva, J. A. M. Teixeira, and M. Gattass, “Automatic diagnosis of strabismus in digital videos through cover test,” Computer Methods and Programs in Biomedicine, vol. 140, pp. 295–305, 3 2017.
dc.relation.references: H. Y. Lai, G. Saavedra-Pena, C. G. Sodini, V. Sze, and T. Heldt, “Measuring Saccade Latency Using Smartphone Cameras,” IEEE Journal of Biomedical and Health Informatics, vol. 24, no. 3, pp. 885–897, 3 2020.
dc.relation.references: T. K. Reddy, V. Gupta, and L. Behera, “Autoencoding Convolutional Representations for Real-Time Eye-Gaze Detection,” Advances in Intelligent Systems and Computing, vol. 799, pp. 229–238, 2019. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/chapter/10.1007/978-981-13-1135-2_18
dc.relation.references: Z. Wang, J. Chai, and S. Xia, “Realtime and Accurate 3D Eye Gaze Capture with DCNN-Based Iris and Pupil Segmentation,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 1, pp. 190–203, 1 2021.
dc.relation.references: G. Iannizzotto, A. Nucita, R. A. Fabio, T. Caprì, and L. L. Bello, “Remote Eye-Tracking for Cognitive Telerehabilitation and Interactive School Tasks in Times of COVID-19,” Information, vol. 11, no. 6, p. 296, 6 2020. [Online]. Available: https://www.mdpi.com/2078-2489/11/6/296/htm
dc.relation.references: I. S. Hwang, Y. Y. Tsai, B. H. Zeng, C. M. Lin, H. S. Shiue, and G. C. Chang, “Integration of eye tracking and lip motion for hands-free computer access,” Universal Access in the Information Society, vol. 20, no. 2, pp. 405–416, 6 2021. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s10209-020-00723-w
dc.relation.references: M. H. Lee, J. Williamson, D. O. Won, S. Fazli, and S. W. Lee, “A High Performance Spelling System based on EEG-EOG Signals with Visual Feedback,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 26, no. 7, pp. 1443–1459, 7 2018.
dc.relation.references: D. Chatterjee, R. D. Gavas, K. Chakravarty, A. Sinha, and U. Lahiri, “Eye movements - An early marker of cognitive dysfunctions,” Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, vol. 2018-July, pp. 4012–4016, 10 2018.
dc.relation.references: V. Rajanna and T. Hammond, “A gaze gesture-based paradigm for situational impairments, accessibility, and rich interactions,” Eye Tracking Research and Applications Symposium (ETRA), 6 2018. [Online]. Available: https://dl.acm.org/doi/10.1145/3204493.3208344
dc.relation.references: C. Froment Tilikete, “How to assess eye movements clinically,” Neurological Sciences, vol. 43, no. 5, pp. 2969–2981, 5 2022. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s10072-022-05981-5
dc.relation.references: A. Khasnobish, R. Gavas, D. Chatterjee, V. Raj, and S. Naitam, “EyeAssist: A communication aid through gaze tracking for patients with neuro-motor disabilities,” 2017 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2017, pp. 382–387, 5 2017.
dc.relation.references: A. López, F. Ferrero, and O. Postolache, “An Affordable Method for Evaluation of Ataxic Disorders Based on Electrooculography,” Sensors, vol. 19, no. 17, p. 3756, 8 2019. [Online]. Available: https://www.mdpi.com/1424-8220/19/17/3756
dc.relation.references: A. Tanwear, X. Liang, Y. Liu, A. Vuckovic, R. Ghannam, T. Bohnert, E. Paz, P. P. Freitas, R. Ferreira, and H. Heidari, “Spintronic Sensors Based on Magnetic Tunnel Junctions for Wireless Eye Movement Gesture Control,” IEEE Transactions on Biomedical Circuits and Systems, vol. 14, no. 6, pp. 1299–1310, 12 2020.
dc.relation.references: A. Sprenger, B. Neppert, S. Köster, S. Gais, D. Kömpf, C. Helmchen, and H. Kimmig, “Long-term eye movement recordings with a scleral search coil-eyelid protection device allows new applications,” Journal of Neuroscience Methods, vol. 170, no. 2, pp. 305–309, 5 2008.
dc.relation.references: D. Sliney, D. Aron-Rosa, F. Delori, F. Fankhauser, R. Landry, M. Mainster, J. Marshall, B. Rassow, B. Stuck, S. Trokel, T. M. West, and M. Wolffe, “Adjustment of guidelines for exposure of the eye to optical radiation from ocular instruments: statement from a task group of the International Commission on Non-Ionizing Radiation Protection (ICNIRP),” Applied Optics, vol. 44, no. 11, pp. 2162–2176, 4 2005. [Online]. Available: https://opg.optica.org/ao/abstract.cfm?uri=ao-44-11-2162
dc.relation.references: S. Zeng, J. Niu, J. Zhu, and X. Li, “A Study on Depression Detection Using Eye Tracking,” Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11354 LNCS, pp. 516–523, 3 2019. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/chapter/10.1007/978-3-030-15127-0_52
dc.relation.references: N. E. Krausz, D. Lamotte, I. Batzianoulis, L. J. Hargrove, S. Micera, and A. Billard, “Intent Prediction Based on Biomechanical Coordination of EMG and Vision-Filtered Gaze for End-Point Control of an Arm Prosthesis,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 28, no. 6, pp. 1471–1480, 6 2020.
dc.relation.references: A. M. Choudhari, P. Porwal, V. Jonnalagedda, and F. Mériaudeau, “An Electrooculography based Human Machine Interface for wheelchair control,” Biocybernetics and Biomedical Engineering, vol. 39, no. 3, pp. 673–685, 7 2019.
dc.relation.references: G. Pangestu, F. Utaminingrum, and F. A. Bachtiar, “Eye State Recognition Using Multiple Methods for Applied to Control Smart Wheelchair,” International Journal of Intelligent Engineering and Systems, vol. 12, no. 1, 2019.
dc.relation.references: P. Illavarason, J. Arokia Renjit, and P. Mohan Kumar, “Medical Diagnosis of Cerebral Palsy Rehabilitation Using Eye Images in Machine Learning Techniques,” Journal of Medical Systems, vol. 43, no. 8, pp. 1–24, 8 2019. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s10916-019-1410-6
dc.relation.references: S. He and Y. Li, “A single-channel EOG-based speller,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 11, pp. 1978–1987, 11 2017.
dc.relation.references: K. Sakurai, M. Yan, K. Tanno, and H. Tamura, “Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor,” Computational Intelligence and Neuroscience, vol. 2017, 2017.
dc.relation.references: R. K. Megalingam, V. Nandakumar, A. Athira, G. S. Gopika, and A. Krishna, “Orthotic arm control using EOG signals and GUI,” International Conference on Robotics and Automation for Humanitarian Applications, RAHA 2016 - Conference Proceedings, 5 2017.
dc.relation.references: M. Thilagaraj, B. Dwarakanath, S. Ramkumar, K. Karthikeyan, A. Prabhu, G. Saravanakumar, M. P. Rajasekaran, and N. Arunkumar, “Eye Movement Signal Classification for Developing Human-Computer Interface Using Electrooculogram,” Journal of Healthcare Engineering, vol. 2021, 2021.
dc.relation.references: K. Stingl, T. Peters, T. Strasser, C. Kelbsch, P. Richter, H. Wilhelm, and B. Wilhelm, “Pupillographic campimetry: An objective method to measure the visual field,” Biomedizinische Technik, vol. 63, no. 6, pp. 665–672, 12 2018. [Online]. Available: https://www.degruyter.com/document/doi/10.1515/bmt-2017-0029/html
dc.relation.references: D. Mittal, S. Rajalakshmi, and T. Shankar, “Demonstration of automatic wheelchair control by tracking eye movement and using IR sensors,” ARPN Journal of Engineering and Applied Sciences, vol. 13, no. 11, 2018. [Online]. Available: www.arpnjournals.com
dc.relation.references: K. P. Murphy, Machine Learning: A Probabilistic Perspective, 1st ed. The MIT Press, 8 2012. [Online]. Available: https://mitpress.mit.edu/9780262018029/
dc.relation.references: C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, “Going Deeper with Convolutions,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 07-12-June-2015, pp. 1–9, 9 2014. [Online]. Available: https://arxiv.org/abs/1409.4842v1
dc.relation.references: J. O. Wobbrock, J. Rubinstein, M. Sawyer, and A. T. Duchowski, “Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures,” The 3rd Conference on Communication by Gaze Interaction – COGAIN 2007: Gaze-based Creativity, 9 2007.
dc.relation.references: S. Tantisatirapong and M. Phothisonothai, “Design of User-Friendly Virtual Thai Keyboard Based on Eye-Tracking Controlled System,” ISCIT 2018 - 18th International Symposium on Communication and Information Technology, pp. 359–362, 12 2018.
dc.relation.references: H. Cecotti, Y. K. Meena, B. Bhushan, A. Dutta, and G. Prasad, “A multiscript gaze-based assistive virtual keyboard,” Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, pp. 1306–1309, 7 2019.
dc.relation.references: O. Tuisku, P. Majaranta, P. Isokoski, and K. J. Räihä, “Now Dasher! Dash away!: Longitudinal study of fast text entry by eye gaze,” Eye Tracking Research and Applications Symposium (ETRA), pp. 19–26, 2008. [Online]. Available: https://dl.acm.org/doi/10.1145/1344471.1344476
dc.relation.references: F. L. Darley, A. E. Aronson, and J. R. Brown, “Differential Diagnostic Patterns of Dysarthria,” Journal of Speech and Hearing Research, vol. 12, no. 2, pp. 246–269, 1969. [Online]. Available: https://pubs.asha.org/doi/10.1044/jshr.1202.246
dc.relation.references: J. J. Sidtis, J. S. Ahn, C. Gomez, and D. Sidtis, “Speech characteristics associated with three genotypes of ataxia,” Journal of Communication Disorders, vol. 44, no. 4, pp. 478–492, 7 2011.
dc.relation.references: E. Roos, D. Mariosa, C. Ingre, C. Lundholm, K. Wirdefeldt, P. M. Roos, and F. Fang, “Depression in amyotrophic lateral sclerosis,” Neurology, vol. 86, no. 24, pp. 2271–2277, 6 2016. [Online]. Available: https://n.neurology.org/content/86/24/2271
dc.relation.references: World Health Organization and World Bank, “World report on disability,” World Health Organization, Tech. Rep., 2011. [Online]. Available: https://apps.who.int/iris/handle/10665/44575
dc.relation.references: J. P. Van Den Berg, S. Kalmijn, E. Lindeman, J. H. Veldink, M. De Visser, M. M. Van Der Graaff, J. H. Wokke, and L. H. Van Den Berg, “Multidisciplinary ALS care improves quality of life in patients with ALS,” Neurology, vol. 65, no. 8, pp. 1264–1267, 10 2005. [Online]. Available: https://n.neurology.org/content/65/8/1264
dc.relation.references: S. Körner, M. Siniawski, K. Kollewe, K. J. Rath, K. Krampfl, A. Zapf, R. Dengler, and S. Petri, “Speech therapy and communication device: Impact on quality of life and mood in patients with amyotrophic lateral sclerosis,” Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, vol. 14, no. 1, pp. 20–25, 1 2013. [Online]. Available: https://www-tandfonline-com.ezproxy.unal.edu.co/doi/abs/10.3109/17482968.2012.692382
dc.relation.references: T. Prell, N. Gaur, B. Stubendorff, A. Rödiger, O. W. Witte, and J. Grosskreutz, “Disease progression impacts health-related quality of life in amyotrophic lateral sclerosis,” Journal of the Neurological Sciences, vol. 397, pp. 92–95, 2 2019.
dc.relation.references: L. García, R. Ron-Angevin, B. Loubière, L. Renault, G. Le Masson, V. Lespinet-Najib, and J. M. André, “A comparison of a brain-computer interface and an eye tracker: Is there a more appropriate technology for controlling a virtual keyboard in an ALS patient?,” Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 10306 LNCS, pp. 464–473, 2017. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/chapter/10.1007/978-3-319-59147-6_40
dc.relation.references: Raspberry Pi Foundation, “Raspberry Pi - About us.” [Online]. Available: https://www.raspberrypi.com/about/
dc.relation.references: Arduino, “About Arduino.” [Online]. Available: https://www.arduino.cc/en/about
dc.relation.references: OBS Project, “Open Broadcaster Software.” [Online]. Available: https://obsproject.com/
dc.relation.references: A. Vakunov and D. Lagun, “MediaPipe Iris: Real-time Iris Tracking & Depth Estimation,” Google AI Blog, 2020. [Online]. Available: https://ai.googleblog.com/2020/08/mediapipe-iris-real-time-iris-tracking.html
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.rights.license: Atribución-NoComercial 4.0 Internacional
dc.rights.uri: http://creativecommons.org/licenses/by-nc/4.0/
dc.subject.decs (spa): Esclerosis Amiotrófica Lateral
dc.subject.decs (eng): Amyotrophic Lateral Sclerosis
dc.subject.decs (spa): Métodos de Comunicación Total
dc.subject.decs (eng): Communication Methods, Total
dc.subject.decs (spa): Equipos de Comunicación para Personas con Discapacidad
dc.subject.decs (eng): Communication Aids for Disabled
dc.subject.proposal (spa): Esclerosis lateral amiotrófica (ELA)
dc.subject.proposal (spa): Comunicación ocular
dc.subject.proposal (spa): Video-oculografía
dc.subject.proposal (spa): Vocal Eyes
dc.subject.proposal (spa): Redes neuronales convolucionales
dc.subject.proposal (spa): Transmisión de mensajes
dc.subject.proposal (spa): Interfaz ocular
dc.subject.proposal (eng): Amyotrophic lateral sclerosis (ALS)
dc.subject.proposal (eng): Ocular communication
dc.subject.proposal (eng): Video-oculography
dc.subject.proposal (eng): Vocal Eyes
dc.subject.proposal (eng): Convolutional neural networks
dc.subject.proposal (eng): Message transmission
dc.subject.proposal (eng): Ocular interface
dc.title (spa): Software de comunicación ocular basado en Vocal Eyes para pacientes con esclerosis lateral amiotrófica
dc.title.translated (eng): Vocal Eyes-based eye communication software for amyotrophic lateral sclerosis patients
dc.type: Trabajo de grado - Maestría
dc.type.coar: http://purl.org/coar/resource_type/c_bdcc
dc.type.coarversion: http://purl.org/coar/version/c_ab4af688f83e57aa
dc.type.content: Text
dc.type.driver: info:eu-repo/semantics/masterThesis
dc.type.redcol: http://purl.org/redcol/resource_type/TM
dc.type.version: info:eu-repo/semantics/acceptedVersion
dcterms.audience.professionaldevelopment: Público general
oaire.accessrights: http://purl.org/coar/access_right/c_abf2

Archivos

Bloque original
Nombre: 1015450643.2023.pdf
Tamaño: 13.18 MB
Formato: Adobe Portable Document Format
Descripción: Tesis de Maestría en Ingeniería - Ingeniería de Sistemas y Computación

Bloque de licencias
Nombre: license.txt
Tamaño: 5.74 KB
Formato: Item-specific license agreed upon to submission