Vocal Eyes-based eye communication software for amyotrophic lateral sclerosis patients
Patients with amyotrophic lateral sclerosis (ALS) face communication challenges due to the loss of speech and writing ability. Eye communication systems have emerged as a potential solution, but their use still presents challenges and limitations for the user, such as the need for highly complex and costly capture devices.
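As context for the "Vocal Eyes" component named in the title (and detailed in the description field below): the system turns a short sequence of classified eye movements into text. The following is a minimal sketch assuming a hypothetical six-direction, two-movements-per-character layout; the direction set and the pair-to-character table are illustrative assumptions, not the actual Becker layout.

```python
# Minimal sketch of a Vocal Eyes-style decoder. The direction set and the
# pair-to-character assignment below are hypothetical; the thesis may use a
# different layout.
from itertools import product

DIRECTIONS = ["up", "down", "left", "right", "up_left", "up_right"]  # assumed
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"  # 36 codes for 6 x 6 pairs

# Each character is encoded as an ordered pair of gaze directions.
CODEBOOK = dict(zip(product(DIRECTIONS, repeat=2), ALPHABET))

def decode(movements: list[str]) -> str:
    """Turn a flat sequence of classified gaze directions into text."""
    if len(movements) % 2:
        raise ValueError("two movements are needed per character")
    return "".join(CODEBOOK[pair] for pair in zip(movements[::2], movements[1::2]))

print(decode(["up", "left", "down", "up"]))  # -> "CG" under this layout
```

Under such a scheme, decoding is a pure table lookup per pair of movements, so the per-movement classification rate reported in the abstract directly bounds how accurately whole messages can be transmitted.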
- Authors: Tovar Díaz, Dorian Abad
- Resource type: Master's thesis (Trabajo de grado - Maestría)
- Publication date: 2023
- Institution: Universidad Nacional de Colombia
- Repository: Universidad Nacional de Colombia
- Language: spa
- OAI Identifier: oai:repositorio.unal.edu.co:unal/84794
- Keywords:
Esclerosis Amiotrófica Lateral
Métodos de Comunicación Total
Equipos de Comunicación para Personas con Discapacidad
Amyotrophic Lateral Sclerosis
Communication Methods, Total
Communication Aids for Disabled
Esclerosis lateral amiotrófica (ELA)
Comunicación ocular
Video-oculografía
Vocal Eyes
Redes neuronales convolucionales
Transmisión de mensajes
Interfaz ocular
Amyotrophic lateral sclerosis (ALS)
Ocular communication
Video-oculography
Vocal Eyes
Convolutional neural networks
Message transmission
Ocular interface
- Rights: openAccess
- License: Atribución-NoComercial 4.0 Internacional
| Field | Value |
|---|---|
| id | UNACIONAL2_2484ece29cd76d791e88b864552e2a36 |
| oai_identifier_str | oai:repositorio.unal.edu.co:unal/84794 |
| network_acronym_str | UNACIONAL2 |
| network_name_str | Universidad Nacional de Colombia |
| dc.title.spa.fl_str_mv | Software de comunicación ocular basado en vocal eyes para pacientes con esclerosis lateral amiotrófica |
| dc.title.translated.eng.fl_str_mv | Vocal Eyes-based eye communication software for amyotrophic lateral sclerosis patients |
| dc.creator.fl_str_mv | Tovar Díaz, Dorian Abad |
| dc.contributor.advisor.none.fl_str_mv | Niño Vásquez, Luis Fernando |
| dc.contributor.author.none.fl_str_mv | Tovar Díaz, Dorian Abad |
| dc.contributor.researchgroup.spa.fl_str_mv | Laboratorio de Investigación en Sistemas Inteligentes (LISI) |
| dc.subject.decs.spa.fl_str_mv | Esclerosis Amiotrófica Lateral; Métodos de Comunicación Total; Equipos de Comunicación para Personas con Discapacidad |
| dc.subject.decs.eng.fl_str_mv | Amyotrophic Lateral Sclerosis; Communication Methods, Total; Communication Aids for Disabled |
| dc.subject.proposal.spa.fl_str_mv | Esclerosis lateral amiotrófica (ELA); Comunicación ocular; Video-oculografía; Vocal Eyes; Redes neuronales convolucionales; Transmisión de mensajes; Interfaz ocular |
| dc.subject.proposal.eng.fl_str_mv | Amyotrophic lateral sclerosis (ALS); Ocular communication; Video-oculography; Vocal Eyes; Convolutional neural networks; Message transmission; Ocular interface |
description

Patients with amyotrophic lateral sclerosis (ALS) face communication challenges due to the loss of speech and writing ability. Eye communication systems have emerged as a potential solution, but their use still presents challenges and limitations for the user, such as the need for highly complex and costly capture devices. The aim of this work is to develop a software prototype based on computer vision techniques to improve the communication of ALS patients by tracking and classifying their eye movements. Video-oculography is used to capture ocular features, while the convolutional neural network model Inception V3 was selected for movement classification. This model was trained on a set of synthetic images generated with the UnityEyes tool. The Vocal Eyes communication system is used to translate the eye movements into the patient's message. The prototype achieves 99% accuracy in the transmission of each message, with a 99.3% success rate on the movements performed. However, difficulties are observed in classifying lower-gaze eye movements. This result represents significant progress in improving eye communication for ALS patients, supports the feasibility of low-cost eye communication, and offers opportunities for further research and system improvements. (Text taken from the source.)
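As a complement to the description above, here is a minimal sketch of the kind of Inception V3 transfer-learning classifier it outlines. The number of classes, the directory layout, and the hyperparameters are illustrative assumptions, not the thesis configuration.

```python
# Sketch: adapting Inception V3 to classify eye-movement direction from eye
# images (e.g., synthetic UnityEyes renders). Class count, paths and
# hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 6  # assumed: one class per gaze direction

base = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet", input_shape=(299, 299, 3))
base.trainable = False  # reuse ImageNet features; unfreeze later to fine-tune

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),  # Inception V3 expects [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset layout: one subfolder per direction class, e.g.
# unityeyes/{up,down,left,right,up_left,up_right}/*.png
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "unityeyes", image_size=(299, 299), batch_size=32)
# model.fit(train_ds, epochs=10)
```

Freezing the pretrained backbone and training only the new classification head is the usual first stage when the training data are synthetic and the target classes are few; a later fine-tuning pass over the upper Inception blocks is optional.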
| Field | Value |
|---|---|
| publishDate | 2023 |
| dc.date.accessioned.none.fl_str_mv | 2023-10-10T22:15:02Z |
| dc.date.available.none.fl_str_mv | 2023-10-10T22:15:02Z |
| dc.date.issued.none.fl_str_mv | 2023 |
| dc.type.spa.fl_str_mv | Trabajo de grado - Maestría |
| dc.type.driver.spa.fl_str_mv | info:eu-repo/semantics/masterThesis |
| dc.type.version.spa.fl_str_mv | info:eu-repo/semantics/acceptedVersion |
| dc.type.content.spa.fl_str_mv | Text |
| dc.type.redcol.spa.fl_str_mv | http://purl.org/redcol/resource_type/TM |
| dc.identifier.uri.none.fl_str_mv | https://repositorio.unal.edu.co/handle/unal/84794 |
| dc.identifier.instname.spa.fl_str_mv | Universidad Nacional de Colombia |
| dc.identifier.reponame.spa.fl_str_mv | Repositorio Institucional Universidad Nacional de Colombia |
| dc.identifier.repourl.spa.fl_str_mv | https://repositorio.unal.edu.co/ |
| dc.language.iso.spa.fl_str_mv | spa |
| dc.rights.coar.fl_str_mv | http://purl.org/coar/access_right/c_abf2 |
| dc.rights.license.spa.fl_str_mv | Atribución-NoComercial 4.0 Internacional |
| dc.rights.uri.spa.fl_str_mv | http://creativecommons.org/licenses/by-nc/4.0/ |
| dc.rights.accessrights.spa.fl_str_mv | info:eu-repo/semantics/openAccess |
| dc.format.extent.spa.fl_str_mv | xv, 63 páginas |
| dc.format.mimetype.spa.fl_str_mv | application/pdf |
| dc.publisher.spa.fl_str_mv | Universidad Nacional de Colombia |
| dc.publisher.program.spa.fl_str_mv | Bogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación |
| dc.publisher.faculty.spa.fl_str_mv | Facultad de Ingeniería |
| dc.publisher.place.spa.fl_str_mv | Bogotá, Colombia |
| dc.publisher.branch.spa.fl_str_mv | Universidad Nacional de Colombia - Sede Bogotá |
| bitstream.url.fl_str_mv | https://repositorio.unal.edu.co/bitstream/unal/84794/1/license.txt; https://repositorio.unal.edu.co/bitstream/unal/84794/2/1015450643.2023.pdf; https://repositorio.unal.edu.co/bitstream/unal/84794/3/1015450643.2023.pdf.jpg |
| bitstream.checksum.fl_str_mv | eb34b1cf90b7e1103fc9dfd26be24b4a; c54dabed1eb3dc04a7d05aa02998937f; 379d4c8f29bf8ce5af66decebefd94a1 |
| bitstream.checksumAlgorithm.fl_str_mv | MD5; MD5; MD5 |
| repository.name.fl_str_mv | Repositorio Institucional Universidad Nacional de Colombia |
| repository.mail.fl_str_mv | repositorio_nal@unal.edu.co |
| _version_ | 1814089802296852480 |