EyeNav: a novel accessibility-driven system for interaction and automated test generation using eyetracking and natural language processing
EyeNav is a novel system that combines eye tracking and natural language processing (NLP) to enhance accessibility and enable automated test generation. This research demonstrates the integration of these technologies for intuitive web interaction, enabling pointer control via gaze and natural language processing for interpreting user intentions. Additionally, it presents a record-and-replay module for generating automated test scripts. Preliminary user evaluations yielded positive results in terms of usability. The ultimate goal is to demonstrate that eye tracking combined with NLP can be effectively used not only as a possible assistive technology but also as an innovative approach to software testing.
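As a purely illustrative sketch of the interaction loop the abstract describes (gaze-driven pointer control, NLP-interpreted commands, and record-and-replay test logging), the TypeScript below shows how such pieces might fit together in a browser. All names, command patterns, and the recording format are assumptions for illustration, not EyeNav's actual implementation.

```typescript
// Illustrative sketch only: a gaze sample resolves to the element under the
// user's gaze, a naive command interpreter stands in for the NLP component,
// and every executed action is appended to a log that could later be replayed
// as a test script. All names and behaviors are hypothetical, not EyeNav's code.

interface GazeSample { x: number; y: number }   // screen coordinates
interface RecordedStep { action: string; selector: string; value?: string; timestamp: number }

const testLog: RecordedStep[] = [];

// Resolve the DOM element currently under the user's gaze.
function elementUnderGaze(sample: GazeSample): Element | null {
  return document.elementFromPoint(sample.x, sample.y);
}

// Stand-in for the NLP module: map an utterance to an action on the gazed element.
function executeCommand(utterance: string, target: Element | null): void {
  if (!target) return;
  const typed = utterance.match(/^type (.+)$/i);
  if (typed && target instanceof HTMLInputElement) {
    target.value = typed[1];
    record("type", target, typed[1]);
  } else if (/\b(click|select|press)\b/i.test(utterance)) {
    (target as HTMLElement).click();
    record("click", target);
  }
}

// Record-and-replay: keep enough information to regenerate an automated test step.
function record(action: string, target: Element, value?: string): void {
  testLog.push({
    action,
    selector: target.id ? `#${target.id}` : target.tagName.toLowerCase(),
    value,
    timestamp: Date.now(),
  });
}
```

A real pipeline would presumably also smooth or filter gaze samples, require a dwell or confirmation gesture before acting, and serialize the recorded log into a test framework's script format, as the record-and-replay module in the abstract suggests.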
- Authors: Yepes Parra, Juan Diego
- Resource type: Undergraduate degree project (Trabajo de grado de pregrado)
- Publication date: 2025
- Institution: Universidad de los Andes
- Repository: Séneca: repositorio Uniandes
- Language: eng
- OAI Identifier: oai:repositorio.uniandes.edu.co:1992/75385
- Online access: https://hdl.handle.net/1992/75385
- Keywords: Eye-tracking; Automated test generation; Assistive technology; Natural language processing; Web applications; Accessibility; Ingeniería
- Rights: openAccess
- License: Attribution 4.0 International
id | UNIANDES2_726c75c306fb8e941efe712aaffed319
---|---
oai_identifier_str | oai:repositorio.uniandes.edu.co:1992/75385
network_acronym_str | UNIANDES2
network_name_str | Séneca: repositorio Uniandes
repository_id_str |
dc.title.eng.fl_str_mv | EyeNav: a novel accessibility-driven system for interaction and automated test generation using eyetracking and natural language processing
dc.creator.fl_str_mv | Yepes Parra, Juan Diego
dc.contributor.advisor.none.fl_str_mv | Escobar Velasquez, Camilo Andres
dc.contributor.author.none.fl_str_mv | Yepes Parra, Juan Diego
dc.subject.keyword.eng.fl_str_mv | Eye-tracking; Automated test generation; Assistive technology; Natural language processing; Web applications; Accessibility
dc.subject.themes.spa.fl_str_mv | Ingeniería
description | EyeNav is a novel system that combines eye tracking and natural language processing (NLP) to enhance accessibility and enable automated test generation. This research demonstrates the integration of these technologies for intuitive web interaction, enabling pointer control via gaze and natural language processing for interpreting user intentions. Additionally, it presents a record-and-replay module for generating automated test scripts. Preliminary user evaluations yielded positive results in terms of usability. The ultimate goal is to demonstrate that eye tracking combined with NLP can be effectively used not only as a possible assistive technology but also as an innovative approach to software testing.
dc.date.accessioned.none.fl_str_mv | 2025-01-14T12:46:20Z
dc.date.available.none.fl_str_mv | 2025-01-14T12:46:20Z
dc.date.issued.none.fl_str_mv | 2025-01-07
dc.type.none.fl_str_mv | Trabajo de grado - Pregrado
dc.type.driver.none.fl_str_mv | info:eu-repo/semantics/bachelorThesis
dc.type.version.none.fl_str_mv | info:eu-repo/semantics/acceptedVersion
dc.type.coar.none.fl_str_mv | http://purl.org/coar/resource_type/c_7a1f
dc.type.content.none.fl_str_mv | Text
dc.type.redcol.none.fl_str_mv | http://purl.org/redcol/resource_type/TP
dc.identifier.uri.none.fl_str_mv | https://hdl.handle.net/1992/75385
dc.identifier.instname.none.fl_str_mv | instname:Universidad de los Andes
dc.identifier.reponame.none.fl_str_mv | reponame:Repositorio Institucional Séneca
dc.identifier.repourl.none.fl_str_mv | repourl:https://repositorio.uniandes.edu.co/
dc.language.iso.none.fl_str_mv | eng
dc.rights.en.fl_str_mv | Attribution 4.0 International
dc.rights.uri.none.fl_str_mv | http://creativecommons.org/licenses/by/4.0/
dc.rights.accessrights.none.fl_str_mv | info:eu-repo/semantics/openAccess
dc.rights.coar.none.fl_str_mv | http://purl.org/coar/access_right/c_abf2
dc.format.extent.none.fl_str_mv | 25 pages
dc.format.mimetype.none.fl_str_mv | application/pdf
dc.publisher.none.fl_str_mv | Universidad de los Andes
dc.publisher.program.none.fl_str_mv | Ingeniería de Sistemas y Computación
dc.publisher.faculty.none.fl_str_mv | Facultad de Ingeniería
dc.publisher.department.none.fl_str_mv | Departamento de Ingeniería de Sistemas y Computación