Software de comunicación ocular basado en Vocal Eyes para pacientes con esclerosis lateral amiotrófica
dc.contributor.advisor | Niño Vásquez, Luis Fernando | |
dc.contributor.author | Tovar Díaz, Dorian Abad | |
dc.contributor.researchgroup | Laboratorio de Investigación en Sistemas Inteligentes (LISI) | spa |
dc.date.accessioned | 2023-10-10T22:15:02Z | |
dc.date.available | 2023-10-10T22:15:02Z | |
dc.date.issued | 2023 | |
dc.description.abstract | Los pacientes con esclerosis lateral amiotrófica (ELA) se enfrentan a problemas de comunicación debido a la pérdida de las capacidades del habla y la escritura. Los sistemas de comunicación ocular han surgido como una posible solución, pero su uso presenta retos y limitaciones para el usuario, como el uso de dispositivos de captura muy complejos y costosos. El objetivo de este trabajo es desarrollar un prototipo de software basado en técnicas de visión por computador para mejorar la comunicación de los pacientes con ELA mediante el seguimiento y la clasificación de sus movimientos oculares. Se utiliza la videooculografía para capturar las características oculares, mientras que para la clasificación de los movimientos se seleccionó el modelo de red neuronal convolucional Inception V3. Este modelo se entrenó con un conjunto de imágenes sintéticas generadas con la herramienta UnityEyes. El sistema de comunicación Vocal Eyes se utiliza para traducir los movimientos oculares en el mensaje del paciente. El prototipo logra una precisión del 99 % en la transmisión de cada mensaje, con una tasa de acierto del 99.3 % en los movimientos realizados. Sin embargo, se observan dificultades en la clasificación de los movimientos oculares de la mirada inferior. Este resultado representa un avance significativo en la mejora de la comunicación ocular para pacientes con ELA, respalda la viabilidad de la comunicación ocular de bajo costo y ofrece oportunidades para futuras investigaciones y mejoras en el sistema. (Texto tomado de la fuente) | spa |
dc.description.abstract | Patients with amyotrophic lateral sclerosis (ALS) face communication challenges due to the loss of speech and writing ability. Eye communication systems have emerged as a potential solution, but their use still presents challenges and limitations, such as the need for highly complex and costly capture devices. The aim of this work is to develop a software prototype based on computer vision techniques to improve the communication of ALS patients by tracking and classifying their eye movements. Video-oculography is used to capture ocular features, while the convolutional neural network model Inception V3 was selected for movement classification. This model was trained on a set of synthetic images generated with the UnityEyes tool. The Vocal Eyes communication system is used to translate the eye movements into the patient’s message. The prototype achieves 99% accuracy in the transmission of each message, with a 99.3% success rate on the movements performed. However, difficulties are observed in classifying downward gaze movements. This result represents significant progress in improving eye communication for ALS patients, supports the feasibility of low-cost eye communication, and offers opportunities for further research and system improvements. | eng |
dc.description.abstract | ilustraciones, diagramas, fotografías | spa |
dc.description.degreelevel | Maestría | spa |
dc.description.degreename | Magíster en Ingeniería de Sistemas y Computación | spa |
dc.description.researcharea | Sistemas Inteligentes | spa |
dc.format.extent | xv, 63 páginas | spa |
dc.format.mimetype | application/pdf | spa |
dc.identifier.instname | Universidad Nacional de Colombia | spa |
dc.identifier.reponame | Repositorio Institucional Universidad Nacional de Colombia | spa |
dc.identifier.repourl | https://repositorio.unal.edu.co/ | spa |
dc.identifier.uri | https://repositorio.unal.edu.co/handle/unal/84794 | |
dc.language.iso | spa | spa |
dc.publisher | Universidad Nacional de Colombia | spa |
dc.publisher.branch | Universidad Nacional de Colombia - Sede Bogotá | spa |
dc.publisher.faculty | Facultad de Ingeniería | spa |
dc.publisher.place | Bogotá, Colombia | spa |
dc.publisher.program | Bogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación | spa |
dc.relation.references | D. Purves, G. J. Augustine, D. Fitzpatrick, L. C. Katz, A.-S. LaMantia, J. O. McNamara, and S. M. Williams, Neuroscience, 2nd ed. Sunderland: Sinauer Associates, 2001. [Online]. Available: https://www.ncbi.nlm.nih.gov/books/NBK10799/ | spa |
dc.relation.references | S. Haykin, Neural Networks and Learning Machines, 3rd ed. New York: Pearson Prentice Hall, 2009. | spa |
dc.relation.references | A. Krizhevsky, I. Sutskever, and G. E. Hinton, “ImageNet classification with deep convolutional neural networks,” Communications of the ACM, vol. 60, no. 6, pp. 84–90, 5 2017. [Online]. Available: https://dl.acm.org/doi/10.1145/3065386 | spa |
dc.relation.references | C. Szegedy, V. Vanhoucke, S. Ioffe, J. Shlens, and Z. Wojna, “Rethinking the Inception Architecture for Computer Vision,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 2016-December, pp. 2818–2826, 12 2015. [Online]. Available: https://arxiv.org/abs/1512.00567v3 | spa |
dc.relation.references | K. He, X. Zhang, S. Ren, and J. Sun, “Deep residual learning for image recognition,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 2016-December, pp. 770–778, 12 2016. | spa |
dc.relation.references | E. Wood, T. Baltrušaitis, L. P. Morency, P. Robinson, and A. Bulling, “Learning an appearance-based gaze estimator from one million synthesised images,” Eye Tracking Research and Applications Symposium (ETRA), vol. 14, pp. 131–138, 3 2016. [Online]. Available: https://dl.acm.org/doi/10.1145/2857491.2857492 | spa |
dc.relation.references | J. Becker and G. Becker, “Vocal Eyes Becker Communication System,” 5 2017. [Online]. Available: https://patient-innovation.com/post/1705 | spa |
dc.relation.references | I. Grishchenko, A. Ablavatski, Y. Kartynnik, K. Raveendran, and M. Grundmann, “Attention Mesh: High-fidelity Face Mesh Prediction in Real-time,” CVPR Workshop on Computer Vision for Augmented and Virtual Reality, 2020. [Online]. Available: https://arxiv.org/abs/2006.10962 | spa |
dc.relation.references | M. C. Kiernan, S. Vucic, B. C. Cheah, M. R. Turner, A. Eisen, O. Hardiman, J. R. Burrell, and M. C. Zoing, “Amyotrophic lateral sclerosis,” The Lancet, vol. 377, no. 9769, pp. 942–955, 3 2011. | spa |
dc.relation.references | G. Bauer, F. Gerstenbrand, and E. Rumpl, “Varieties of the locked-in syndrome”, Journal of Neurology, vol. 221, no. 2, pp. 77–91, 8 1979. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/BF00313105 | spa |
dc.relation.references | R. Pugliese, R. Sala, S. Regondi, B. Beltrami, and C. Lunetta, “Emerging technologies for management of patients with amyotrophic lateral sclerosis: from telehealth to assistive robotics and neural interfaces”, Journal of Neurology, vol. 269, no. 6, pp. 2910–2921, 6 2022. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s00415-022-10971-w | spa |
dc.relation.references | A. Londral, A. Pinto, S. Pinto, L. Azevedo, and M. De Carvalho, “Quality of life in amyotrophic lateral sclerosis patients and caregivers: Impact of assistive communication from early stages”, Muscle & Nerve, vol. 52, no. 6, pp. 933–941, 12 2015. [Online]. Available: https://onlinelibrary-wiley-com.ezproxy.unal.edu.co/doi/10.1002/mus.24659 | spa |
dc.relation.references | Z. Hossain, M. M. H. Shuvo, and P. Sarker, “Hardware and software implementation of real time electrooculogram (EOG) acquisition system to control computer cursor with eyeball movement”, 4th International Conference on Advances in Electrical Engineering, ICAEE 2017, vol. 2018-January, pp. 132–137, 7 2017. | spa |
dc.relation.references | C. Zhang, R. Yao, and J. Cai, “Efficient eye typing with 9-direction gaze estimation”, Multimedia Tools and Applications, vol. 77, no. 15, pp. 19 679–19 696, 8 2018. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s11042-017-5426-y | spa |
dc.relation.references | Z. Al-Kassim and Q. A. Memon, “Designing a low-cost eyeball tracking keyboard for paralyzed people”, Computers & Electrical Engineering, vol. 58, pp. 20–29, 2 2017. | spa |
dc.relation.references | T. L. A. Valente, J. D. S. de Almeida, A. C. Silva, J. A. M. Teixeira, and M. Gattass, “Automatic diagnosis of strabismus in digital videos through cover test”, Computer Methods and Programs in Biomedicine, vol. 140, pp. 295–305, 3 2017. | spa |
dc.relation.references | H. Y. Lai, G. Saavedra-Pena, C. G. Sodini, V. Sze, and T. Heldt, “Measuring Saccade Latency Using Smartphone Cameras”, IEEE Journal of Biomedical and Health Informatics, vol. 24, no. 3, pp. 885–897, 3 2020. | spa |
dc.relation.references | T. K. Reddy, V. Gupta, and L. Behera, “Autoencoding Convolutional Representations for Real-Time Eye-Gaze Detection”, Advances in Intelligent Systems and Computing, vol. 799, pp. 229–238, 2019. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/chapter/10.1007/978-981-13-1135-2_18 | spa |
dc.relation.references | Z. Wang, J. Chai, and S. Xia, “Realtime and Accurate 3D Eye Gaze Capture with DCNN-Based Iris and Pupil Segmentation”, IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 1, pp. 190–203, 1 2021 | spa |
dc.relation.references | G. Iannizzotto, A. Nucita, R. A. Fabio, T. Caprı̀, and L. L. Bello, “Remote Eye-Tracking for Cognitive Telerehabilitation and Interactive School Tasks in Times of COVID-19”, Information, vol. 11, no. 6, p. 296, 6 2020. [Online]. Available: https://www.mdpi.com/2078-2489/11/6/296/htm | spa |
dc.relation.references | I. S. Hwang, Y. Y. Tsai, B. H. Zeng, C. M. Lin, H. S. Shiue, and G. C. Chang, “Integration of eye tracking and lip motion for hands-free computer access”, Universal Access in the Information Society, vol. 20, no. 2, pp. 405–416, 6 2021. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s10209-020-00723-w | spa |
dc.relation.references | M. H. Lee, J. Williamson, D. O. Won, S. Fazli, and S. W. Lee, “A High Performance Spelling System based on EEG-EOG Signals with Visual Feedback”, IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 26, no. 7, pp. 1443–1459, 7 2018 | spa |
dc.relation.references | D. Chatterjee, R. D. Gavas, K. Chakravarty, A. Sinha, and U. Lahiri, “Eye movements - An early marker of cognitive dysfunctions”, Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, vol. 2018 July, pp. 4012–4016, 10 2018 | spa |
dc.relation.references | V. Rajanna and T. Hammond, “A gaze gesture-based paradigm for situational impairments, accessibility, and rich interactions”, Eye Tracking Research and Applications Symposium (ETRA), 6 2018. [Online]. Available: https://dl.acm.org/doi/10.1145/3204493.3208344 | spa |
dc.relation.references | C. Froment Tilikete, “How to assess eye movements clinically”, Neurological Sciences, vol. 43, no. 5, pp. 2969–2981, 5 2022. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s10072-022-05981-5 | spa |
dc.relation.references | A. Khasnobish, R. Gavas, D. Chatterjee, V. Raj, and S. Naitam, “EyeAssist: A communication aid through gaze tracking for patients with neuro-motor disabilities”, 2017 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2017, pp. 382–387, 5 2017 | spa |
dc.relation.references | A. López, F. Ferrero, and O. Postolache, “An Affordable Method for Evaluation of Ataxic Disorders Based on Electrooculography”, Sensors, vol. 19, no. 17, p. 3756, 8 2019. [Online]. Available: https://www.mdpi.com/1424-8220/19/17/3756 | spa |
dc.relation.references | A. Tanwear, X. Liang, Y. Liu, A. Vuckovic, R. Ghannam, T. Bohnert, E. Paz, P. P. Freitas, R. Ferreira, and H. Heidari, “Spintronic Sensors Based on Magnetic Tunnel Junctions for Wireless Eye Movement Gesture Control”, IEEE Transactions on Biomedical Circuits and Systems, vol. 14, no. 6, pp. 1299–1310, 12 2020 | spa |
dc.relation.references | A. Sprenger, B. Neppert, S. Köster, S. Gais, D. Kömpf, C. Helmchen, and H. Kimmig, “Long-term eye movement recordings with a scleral search coil-eyelid protection device allows new applications”, Journal of Neuroscience Methods, vol. 170, no. 2, pp. 305–309, 5 2008 | spa |
dc.relation.references | D. Sliney, D. Aron-Rosa, F. Delori, F. Fankhauser, R. Landry, M. Mainster, J. Marshall, B. Rassow, B. Stuck, S. Trokel, T. M. West, and M. Wolffe, “Adjustment of guidelines for exposure of the eye to optical radiation from ocular instruments: statement from a task group of the International Commission on Non-Ionizing Radiation Protection (ICNIRP)”, Applied Optics, vol. 44, no. 11, pp. 2162–2176, 4 2005. [Online]. Available: https://opg.optica.org/ao/abstract.cfm?uri=ao-44-11-2162 | spa |
dc.relation.references | S. Zeng, J. Niu, J. Zhu, and X. Li, “A Study on Depression Detection Using Eye Tracking”, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11354 LNCS, pp. 516–523, 3 2019. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/chapter/10.1007/978-3-030-15127-0_52 | spa |
dc.relation.references | N. E. Krausz, D. Lamotte, I. Batzianoulis, L. J. Hargrove, S. Micera, and A. Billard, “Intent Prediction Based on Biomechanical Coordination of EMG and Vision-Filtered Gaze for End-Point Control of an Arm Prosthesis”, IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 28, no. 6, pp. 1471–1480, 6 2020 | spa |
dc.relation.references | A. M. Choudhari, P. Porwal, V. Jonnalagedda, and F. Mériaudeau, “An Electrooculography based Human Machine Interface for wheelchair control”, Biocybernetics and Biomedical Engineering, vol. 39, no. 3, pp. 673–685, 7 2019 | spa |
dc.relation.references | G. Pangestu, F. Utaminingrum, and F. A. Bachtiar, “Eye State Recognition Using Multiple Methods for Applied to Control Smart Wheelchair”, International Journal of Intelligent Engineering and Systems, vol. 12, no. 1, 2019 | spa |
dc.relation.references | P. Illavarason, J. Arokia Renjit, and P. Mohan Kumar, “Medical Diagnosis of Cerebral Palsy Rehabilitation Using Eye Images in Machine Learning Techniques”, Journal of Medical Systems, vol. 43, no. 8, pp. 1–24, 8 2019. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/article/10.1007/s10916-019-1410-6 | spa |
dc.relation.references | S. He and Y. Li, “A single-channel EOG-based speller”, IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 11, pp. 1978–1987, 11 2017 | spa |
dc.relation.references | K. Sakurai, M. Yan, K. Tanno, and H. Tamura, “Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor”, Computational Intelligence and Neuroscience, vol. 2017, 2017 | spa |
dc.relation.references | R. K. Megalingam, V. Nandakumar, A. Athira, G. S. Gopika, and A. Krishna, “Orthotic arm control using EOG signals and GUI”, International Conference on Robotics and Automation for Humanitarian Applications, RAHA 2016 - Conference Proceedings, 5 2017 | spa |
dc.relation.references | M. Thilagaraj, B. Dwarakanath, S. Ramkumar, K. Karthikeyan, A. Prabhu, G. Saravanakumar, M. P. Rajasekaran, and N. Arunkumar, “Eye Movement Signal Classification for Developing Human-Computer Interface Using Electrooculogram”, Journal of Healthcare Engineering, vol. 2021, 2021 | spa |
dc.relation.references | K. Stingl, T. Peters, T. Strasser, C. Kelbsch, P. Richter, H. Wilhelm, and B. Wilhelm, “Pupillographic campimetry: An objective method to measure the visual field”, Biomedizinische Technik, vol. 63, no. 6, pp. 665–672, 12 2018. [Online]. Available: https://www.degruyter.com/document/doi/10.1515/bmt-2017-0029/html | spa |
dc.relation.references | D. Mittal, S. Rajalakshmi, and T. Shankar, “Demonstration of Automatic Wheelchair Control by Tracking Eye Movement and Using IR Sensors”, ARPN Journal of Engineering and Applied Sciences, vol. 13, no. 11, 2018. [Online]. Available: www.arpnjournals.com | spa |
dc.relation.references | K. P. Murphy, Machine Learning: A Probabilistic Perspective, 1st ed. The MIT Press, 8 2012. [Online]. Available: https://mitpress.mit.edu/9780262018029/ | spa |
dc.relation.references | C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke, and A. Rabinovich, “Going Deeper with Convolutions”, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 07-12-June-2015, pp. 1–9, 9 2014. [Online]. Available: https://arxiv.org/abs/1409.4842v1 | spa |
dc.relation.references | J. O. Wobbrock, J. Rubinstein, M. Sawyer, and A. T. Duchowski, “Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures”, The 3rd Conference on Communication by Gaze Interaction – COGAIN 2007: Gaze-based Creativity, 9 2007 | spa |
dc.relation.references | S. Tantisatirapong and M. Phothisonothai, “Design of User-Friendly Virtual Thai Keyboard Based on Eye-Tracking Controlled System”, ISCIT 2018 - 18th International Symposium on Communication and Information Technology, pp. 359–362, 12 2018 | spa |
dc.relation.references | H. Cecotti, Y. K. Meena, B. Bhushan, A. Dutta, and G. Prasad, “A multiscript gaze-based assistive virtual keyboard”, Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, pp. 1306–1309, 7 2019 | spa |
dc.relation.references | O. Tuisku, P. Majaranta, P. Isokoski, and K. J. Räihä, “Now Dasher! Dash away!: Longitudinal study of fast text entry by eye gaze”, Eye Tracking Research and Applications Symposium (ETRA), pp. 19–26, 2008. [Online]. Available: https://dl.acm.org/doi/10.1145/1344471.1344476 | spa |
dc.relation.references | F. L. Darley, A. E. Aronson, and J. R. Brown, “Differential Diagnostic Patterns of Dysarthria”, Journal of speech and hearing research, vol. 12, no. 2, pp. 246–269, 1969. [Online]. Available: https://pubs.asha.org/doi/10.1044/jshr.1202.246 | spa |
dc.relation.references | J. J. Sidtis, J. S. Ahn, C. Gomez, and D. Sidtis, “Speech characteristics associated with three genotypes of ataxia”, Journal of Communication Disorders, vol. 44, no. 4, pp. 478–492, 7 2011 | spa |
dc.relation.references | E. Roos, D. Mariosa, C. Ingre, C. Lundholm, K. Wirdefeldt, P. M. Roos, and F. Fang, “Depression in amyotrophic lateral sclerosis”, Neurology, vol. 86, no. 24, pp. 2271–2277, 6 2016. [Online]. Available: https://n.neurology.org/content/86/24/2271 | spa |
dc.relation.references | World Health Organization and World Bank, “World report on disability”, World Health Organization, Tech. Rep., 2011. [Online]. Available: https://apps.who.int/iris/handle/10665/44575 | spa |
dc.relation.references | J. P. Van Den Berg, S. Kalmijn, E. Lindeman, J. H. Veldink, M. De Visser, M. M. Van Der Graaff, J. H. Wokke, and L. H. Van Den Berg, “Multidisciplinary ALS care improves quality of life in patients with ALS”, Neurology, vol. 65, no. 8, pp. 1264–1267, 10 2005. [Online]. Available: https://n.neurology.org/content/65/8/1264 | spa |
dc.relation.references | S. Körner, M. Siniawski, K. Kollewe, K. J. Rath, K. Krampfl, A. Zapf, R. Dengler, and S. Petri, “Speech therapy and communication device: Impact on quality of life and mood in patients with amyotrophic lateral sclerosis”, Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, vol. 14, no. 1, pp. 20–25, 1 2013. [Online]. Available: https://www-tandfonline-com.ezproxy.unal.edu.co/doi/abs/10.3109/17482968.2012.692382 | spa |
dc.relation.references | T. Prell, N. Gaur, B. Stubendorff, A. Rödiger, O. W. Witte, and J. Grosskreutz, “Disease progression impacts health-related quality of life in amyotrophic lateral sclerosis”, Journal of the Neurological Sciences, vol. 397, pp. 92–95, 2 2019 | spa |
dc.relation.references | L. García, R. Ron-Angevin, B. Loubière, L. Renault, G. Le Masson, V. Lespinet-Najib, and J. M. André, “A comparison of a brain-computer interface and an eye tracker: Is there a more appropriate technology for controlling a virtual keyboard in an ALS patient?”, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 10306 LNCS, pp. 464–473, 2017. [Online]. Available: https://link-springer-com.ezproxy.unal.edu.co/chapter/10.1007/978-3-319-59147-6_40 | spa |
dc.relation.references | Raspberry Pi Foundation, “Raspberry Pi - About us.”, [Online]. Available: https://www.raspberrypi.com/about/ | spa |
dc.relation.references | Arduino, “About Arduino.”, [Online]. Available: https://www.arduino.cc/en/about | spa |
dc.relation.references | OBS Project, “Open Broadcaster Software.”, [Online]. Available: https://obsproject.com/ | spa |
dc.relation.references | A. Vakunov and D. Lagun, “MediaPipe Iris: Real-time Iris Tracking & Depth Estimation – Google AI Blog”, 2020. [Online]. Available: https://ai.googleblog.com/2020/08/mediapipe-iris-real-time-iris-tracking.html | spa |
dc.rights.accessrights | info:eu-repo/semantics/openAccess | spa |
dc.rights.license | Atribución-NoComercial 4.0 Internacional | spa |
dc.rights.uri | http://creativecommons.org/licenses/by-nc/4.0/ | spa |
dc.subject.decs | Esclerosis Amiotrófica Lateral | spa |
dc.subject.decs | Amyotrophic Lateral Sclerosis | eng |
dc.subject.decs | Métodos de Comunicación Total | spa |
dc.subject.decs | Communication Methods, Total | eng |
dc.subject.decs | Equipos de Comunicación para Personas con Discapacidad | spa |
dc.subject.decs | Communication Aids for Disabled | eng |
dc.subject.proposal | Esclerosis lateral amiotrófica (ELA) | spa |
dc.subject.proposal | Comunicación ocular | spa |
dc.subject.proposal | Video-oculografía | spa |
dc.subject.proposal | Vocal Eyes | spa |
dc.subject.proposal | Redes neuronales convolucionales | spa |
dc.subject.proposal | Transmisión de mensajes | spa |
dc.subject.proposal | Interfaz ocular | spa |
dc.subject.proposal | Amyotrophic lateral sclerosis (ALS) | eng |
dc.subject.proposal | Ocular communication | eng |
dc.subject.proposal | Video-oculography | eng |
dc.subject.proposal | Vocal Eyes | eng |
dc.subject.proposal | Convolutional neural networks | eng |
dc.subject.proposal | Message transmission | eng |
dc.subject.proposal | Ocular interface | eng |
dc.title | Software de comunicación ocular basado en Vocal Eyes para pacientes con esclerosis lateral amiotrófica | spa |
dc.title.translated | Vocal Eyes-based eye communication software for amyotrophic lateral sclerosis patients | eng |
dc.type | Trabajo de grado - Maestría | spa |
dc.type.coar | http://purl.org/coar/resource_type/c_bdcc | spa |
dc.type.coarversion | http://purl.org/coar/version/c_ab4af688f83e57aa | spa |
dc.type.content | Text | spa |
dc.type.driver | info:eu-repo/semantics/masterThesis | spa |
dc.type.redcol | http://purl.org/redcol/resource_type/TM | spa |
dc.type.version | info:eu-repo/semantics/acceptedVersion | spa |
dcterms.audience.professionaldevelopment | Público general | spa |
oaire.accessrights | http://purl.org/coar/access_right/c_abf2 | spa |
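The abstract describes a pipeline in which classified eye movements are fed to the Vocal Eyes system, which composes each letter from a pair of movements. The following is a minimal sketch of such a two-movement decoder; the six-direction set and the letter grid used here are hypothetical placeholders for illustration, not the actual Becker chart or the classifier output labels used in the thesis:

```python
# Sketch of a Vocal Eyes-style two-movement letter code.
# NOTE: the direction set and letter layout below are hypothetical
# placeholders; the real Becker chart may differ.

DIRECTIONS = ["up", "down", "left", "right", "up-left", "up-right"]

# 36 symbols in a 6x6 grid: the first movement of a pair selects a
# group (row), the second selects the symbol within that group (column).
SYMBOLS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
GRID = {
    (first, second): SYMBOLS[i * len(DIRECTIONS) + j]
    for i, first in enumerate(DIRECTIONS)
    for j, second in enumerate(DIRECTIONS)
}

def decode(movements):
    """Turn an even-length sequence of classified eye movements into text."""
    if len(movements) % 2:
        raise ValueError("movements must come in pairs")
    pairs = zip(movements[::2], movements[1::2])
    return "".join(GRID[pair] for pair in pairs)

# ('up','up') -> 'A', ('up','down') -> 'B'
print(decode(["up", "up", "up", "down"]))  # AB
```

In this layout the per-letter message accuracy depends directly on the per-movement classification rate, which is why the reported 99.3% movement accuracy and 99% message accuracy track each other closely: each symbol requires two correctly classified movements.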
Files

Original bundle
- Name: 1015450643.2023.pdf
- Size: 13.18 MB
- Format: Adobe Portable Document Format
- Description: Master's thesis (Maestría en Ingeniería - Ingeniería de Sistemas y Computación)

License bundle
- Name: license.txt
- Size: 5.74 KB
- Format: Item-specific license agreed upon to submission