Modelo de aprendizaje no supervisado aplicado a un conjunto de datos de casas kyoto basado en un enfoque de clustering (Unsupervised learning model applied to a CASAS Kyoto dataset based on a clustering approach)
Human Activity Recognition (HAR) is a topic of great relevance due to its wide range of applications, and various approaches have been proposed to recognize these activities, from comparing signals against thresholds to applying machine learning and deep learning techniques. The development of computational systems capable of performing this recognition and extracting truthful, useful, compact, and natural-language-like information is a very active area of knowledge. It belongs to the broader research field of activities of daily living (ADL), where efforts from researchers across disciplines converge. This frames the present research, whose fundamental objective is to advance the development of a model that addresses the human activity recognition problem through the automatic analysis of datasets using unsupervised learning techniques. The research was carried out in three phases: characterization, experimentation, and evaluation. During the characterization phase, a public human activity recognition dataset, CASAS Kyoto, was selected; it is stored in databases that pre-trained models use to generate useful knowledge by analyzing patterns, generating predictions, and identifying behavioral trends. During the experimentation phase, an ensemble-based model was applied that combined the advantages of supervised and unsupervised methods to consolidate results capable of supporting a closer identification of these activities.
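The abstract describes a pipeline that first clusters unlabeled activity data and then applies an ensemble classifier over the result. The sketch below is a minimal illustration of that idea, not the thesis's actual code: it assumes scikit-learn, uses synthetic features in place of the real CASAS Kyoto sensor events, and uses k-means as the clustering phase and bagged decision trees (scikit-learn's default base estimator, a rough analogue of the Bagging + J48 combination mentioned in the table of contents).

```python
# Hedged sketch: unsupervised clustering produces pseudo-labels, which a
# bagged decision-tree ensemble then learns to predict.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for activity feature vectors (NOT the real CASAS data):
# three well-separated groups of 4-dimensional points.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 4)) for c in (0.0, 3.0, 6.0)])

# Clustering phase: k-means assigns a pseudo-label to every sample.
pseudo_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Classification phase: a bagging ensemble (decision trees by default)
# is trained on the cluster-derived pseudo-labels.
X_tr, X_te, y_tr, y_te = train_test_split(X, pseudo_labels, random_state=0)
clf = BaggingClassifier(n_estimators=10, random_state=0).fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 2))
```

On data this cleanly separated the classifier reproduces the cluster structure almost perfectly; on real sensor-event features the clustering step would follow feature extraction and segmentation, which this sketch omits.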
- Authors: Pacheco Cuentas, Rosberg Yaser
- Resource type: Trabajo de grado - Maestría (master's thesis)
- Publication date: 2024
- Institution: Corporación Universidad de la Costa
- Repository: REDICUC - Repositorio CUC
- Language: spa
- OAI identifier: oai:repositorio.cuc.edu.co:11323/13501
- Online access: https://hdl.handle.net/11323/13501 / https://repositorio.cuc.edu.co/
- Keywords: Human activity recognition; HAR; Daily life activities; ADL; Classification methods; Smart home; Clustering; Ensemble methods
- Rights: openAccess
- License: Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
dc.title.spa.fl_str_mv |
Modelo de aprendizaje no supervisado aplicado a un conjunto de datos de casas kyoto basado en un enfoque de clustering |
dc.contributor.advisor.none.fl_str_mv |
Morales Ortega, Roberto; Ariza Colpas, Paola |
dc.contributor.author.none.fl_str_mv |
Pacheco Cuentas, Rosberg Yaser |
dc.contributor.jury.none.fl_str_mv |
Melendez Pertuz, Farid; Patiño Saucedo, Janns; Diaz Martinez, Jorge |
dc.subject.proposal.eng.fl_str_mv |
Human activity recognition; HAR; Daily life activities; ADL; Classification methods; Smart home; Clustering; Ensemble methods |
dc.subject.proposal.spa.fl_str_mv |
Reconocimiento de actividades humanas; Actividades de la vida diaria; Métodos de clasificación; Hogar inteligente; Clustering; Métodos ensamblados |
description |
Human Activity Recognition (HAR) is a topic of great relevance due to its wide range of applications, and various approaches have been proposed to recognize these activities, from comparing signals against thresholds to applying machine learning and deep learning techniques. The development of computational systems capable of performing this recognition and extracting truthful, useful, compact, and natural-language-like information is a very active area of knowledge. It belongs to the broader research field of activities of daily living (ADL), where efforts from researchers across disciplines converge. This frames the present research, whose fundamental objective is to advance the development of a model that addresses the human activity recognition problem through the automatic analysis of datasets using unsupervised learning techniques. The research was carried out in three phases: characterization, experimentation, and evaluation. During the characterization phase, a public human activity recognition dataset, CASAS Kyoto, was selected; it is stored in databases that pre-trained models use to generate useful knowledge by analyzing patterns, generating predictions, and identifying behavioral trends. During the experimentation phase, an ensemble-based model was applied that combined the advantages of supervised and unsupervised methods to consolidate results capable of supporting a closer identification of these activities. |
dc.date.accessioned.none.fl_str_mv |
2024-10-23T16:32:07Z |
dc.date.available.none.fl_str_mv |
2024-10-23T16:32:07Z |
dc.date.issued.none.fl_str_mv |
2024 |
dc.type.none.fl_str_mv |
Trabajo de grado - Maestría |
dc.type.content.none.fl_str_mv |
Text |
dc.type.driver.none.fl_str_mv |
info:eu-repo/semantics/masterThesis |
dc.type.redcol.none.fl_str_mv |
http://purl.org/redcol/resource_type/TM |
dc.type.version.none.fl_str_mv |
info:eu-repo/semantics/acceptedVersion |
dc.identifier.uri.none.fl_str_mv |
https://hdl.handle.net/11323/13501 |
dc.identifier.instname.none.fl_str_mv |
Corporación Universidad de la Costa |
dc.identifier.reponame.none.fl_str_mv |
REDICUC - Repositorio CUC |
dc.identifier.repourl.none.fl_str_mv |
https://repositorio.cuc.edu.co/ |
dc.language.iso.none.fl_str_mv |
spa |
dc.rights.license.none.fl_str_mv |
Atribución-NoComercial-CompartirIgual 4.0 Internacional (CC BY-NC-SA 4.0) |
dc.rights.uri.none.fl_str_mv |
https://creativecommons.org/licenses/by-nc-sa/4.0/ |
dc.rights.accessrights.none.fl_str_mv |
info:eu-repo/semantics/openAccess |
dc.rights.coar.none.fl_str_mv |
http://purl.org/coar/access_right/c_abf2 |
dc.format.extent.none.fl_str_mv |
84 páginas |
dc.format.mimetype.none.fl_str_mv |
application/pdf |
dc.publisher.none.fl_str_mv |
Corporación Universidad de la Costa |
dc.publisher.department.none.fl_str_mv |
Ciencias de la Computación y Electrónica |
dc.publisher.place.none.fl_str_mv |
Barranquilla, Colombia |
dc.publisher.program.none.fl_str_mv |
Maestría en Ingeniería |
spelling |
Atribución-NoComercial-CompartirIgual 4.0 Internacional (CC BY-NC-SA 4.0)https://creativecommons.org/licenses/by-nc-sa/4.0/info:eu-repo/semantics/openAccesshttp://purl.org/coar/access_right/c_abf2Morales Otega RobertoAriza Colpas PaolaPacheco Cuentas, Rosberg YaserMelendez Pertuz FaridPatiño Saucedo JannsDiaz Martinez Jorge2024-10-23T16:32:07Z2024-10-23T16:32:07Z2024https://hdl.handle.net/11323/13501Corporación Universidad de la CostaREDICUC - Repositorio CUChttps://repositorio.cuc.edu.co/Human Activity Recognition (HAR) is a topic of great relevance due to its wide range of applications, with various approaches being proposed to recognize these activities, from comparing signals with thresholds to applying machine learning and deep learning techniques. The development of computational systems capable of performing this recognition and extracting truthful, useful, compact, and natural language-like information is a very active area of knowledge and encompasses a research field that subscribes to an investigative framework, which is the study of daily life activities (ADL), where efforts from researchers in different areas of knowledge come together. This frames the present research and its fundamental objective is to advance in the development of a model that allows solving the problem of human activity recognition through the automatic analysis of datasets based on unsupervised learning techniques. The research required the execution of a series of phases: characterization, experimentation, and evaluation. During the characterization phase, a public human activity recognition dataset, CASAS Kyoto, was selected, which is stored in databases that pre-trained models use to generate useful knowledge, from analyzing patterns, generating predictions, and identifying behavior trends. 
During the experimentation phase, an ensemble-based model was applied, which utilized the advantages of both supervised and unsupervised methods to consolidate results capable of supporting a closer identification of these activitiesEl reconocimiento de actividades humanas (HAR), es un tema de mucha relevancia debido a su amplia gama de aplicaciones, siendo propuestos diferentes enfoques para reconocer estas actividades, desde la comparación de señales con umbrales hasta la aplicación de técnicas de aprendizaje automático y profundo. El desarrollo de sistemas computacionales capaces de realizar este reconocimiento y extraer una información veraz, útil, compacta y cercana al lenguaje natural, es un área de conocimiento muy activa y comprende un ámbito de investigación que se suscribe a un marco investigativo, que es el estudio de las actividades de la vida diaria (ADL), en la que se aúnan esfuerzos de investigadores de diferentes áreas de conocimiento. Lo que nos enmarca en la presente investigación y su objetivo fundamental es el avanzar en el planteamiento de un modelo que permita resolver el problema de reconocimiento de actividades humanas mediante el análisis automático de conjuntos de datos (Datasets) basados en técnicas aprendizaje no supervisado. La investigación requirió de la ejecución de una serie de fases: caracterización, experimentación y evaluación. Durante la fase de caracterización se seleccionó un conjunto de datos público de reconocimiento de actividades humanas CASAS Kyoto, el cual, se encuentra almacenado en bases de datos que los modelos pre-entrenados utilizan para generar conocimiento útil, desde el análisis de patrones, la generación de predicciones y la identificación de tendencias de comportamiento. 
Durante la fase de experimentación se aplicó un modelo basado en ensamble donde se utilizó las bondades del método supervidado y no supervisados para consolidar resultados capaces de apoyar a una identificación más cercana de estas actividades.Lista de tablas y figuras 10 -- Introducción 12 -- Contexto 12 -- Mapa del documento 15 -- Problemática abordada y motivación 16 – Justificación 18 – Objetivos 19 --Fundamentación conceptual 20 -- Reconocimiento de actividades humanas diarias (HAR - ADL) 20 -- Aprendizaje supervisado y no supervisado 24 -- Aprendizaje supervisado 25 -- Aprendizaje no supervisado 27 -- Dataset casas kyoto 30 -- Clustering 32 -- Trabajos relacionados 32 -- Aplicaciones de técnicas supervisadas en el reconocimiento de Actividades Humanas 34 -- Aplicaciones de técnicas no supervisadas en el reconocimiento de Actividades Humanas 50 -- Metodología 56 --Fase 1 - Conjunto de Datos Semi-Supervisado 58 -- Fase 2 – Enfoque de agrupación 60 -- Fuzzy clustering 61 -- Agglomerative clustering 61 -- K-means clustering 61 -- Fase 3 – Enfoque de técnicas de clasificación 62 –Bagging 62 --J48 62 --Experimentación 62 -- Fase 1: Experimentación no supervisada 62 -- Escenario No.1: 65 -- Escenario No.2 66 -- Fase 2 y 3: Enfoque de clustering y clasificación 69 --Agglomerative clustering con bagging 69 -- Agglomerative Clustering con J48 70 -- K-means con Bagging 71 -- K-means con J48 72 – Conclusiones 76 –Referencias 78 --Magíster en IngenieríaMaestría84 páginasapplication/pdfspaCorporacion Universidad de la CostaCiencias de la Computación y ElectrónicaBarranquilla, ColombiaMaestría en IngenieríaModelo de aprendizaje no supervisado aplicado a un conjunto de datos de casas kyoto basado en un enfoque de clusteringTrabajo de grado - MaestríaTextinfo:eu-repo/semantics/masterThesishttp://purl.org/redcol/resource_type/TMinfo:eu-repo/semantics/acceptedVersionOrganización Panamericana de la Salud (OPS). (2023, octubre). Demencia. 
Available online: https://www.paho.org/es/temas/demenciaSikder, N., & Nahid, A.-A. (2021). KU-HAR: An open dataset for heterogeneous human activity recognition. Pattern Recognition Letters, 146, 46–54.Popescu, A.-C., Mocanu, I., & Cramariuc, B. (2019). PRECIS HAR. Retrieved from https://ieee dataport.org/open-access/precishar (accessed on 30 OctoberMartínez-Villaseñor, L., Ponce, H., Brieva, J., Moya-Albor, E., Núñez-Martínez, J., & Peñafort Asturiano, C. (2019). UP-Fall Detection Dataset: A multimodal approach. Sensors, 19(1988).Van Kasteren, T.; Noulas, A.; Englebienne, G.; Kröse, B. Accurate activity recognition in a home setting. In Proceedings of the 10th International Conference on Ubiquitous Computing, Seoul, Korea, 21–24 September 2008; pp. 1–9.Singla, G.; Cook, D.; Schmitter-Edgecombe, M. Recognizing independent and joint activities among multiple residents in smart environments. Ambient. Intell. Humaniz. Comput. J. 2010, 1, 57–63.Weiss, G.M.; Yoneda, K.; Hayajneh, T. Smartphone and Smartwatch-Based Biometrics Using Activities of Daily Living. IEEE Access 2019, 7, 133190–133202.Gallissot, M.; Caelen, J.; Bonnefond, N.; Meillon, B.; Pons, S. Using the Multicom Domus Dataset; Research Report RR-LIG-020; LIG: Grenoble, France, 2011Roggen, D.; Calatroni, A.; Rossi, M.; Holleczek, T.; Tröster, G.; Lukowicz, P.; Pirkl, G.; Bannach, D.; Ferscha, A.; Doppler, J.; et al. Collecting complex activity data sets in highly rich networked sensor environments. In Proceedings of the Seventh International Conference on Networked Sensing Systems (INSS’10), Kassel, Germany, 15–18 June 2010.Cook, D. Learning setting-generalized activity mdoels for smart spaces. IEEE Intell. Syst. 2010, 1.Zhang, M.; Sawchuk, A.A. USC-HAD: A Daily Activity Dataset for Ubiquitous Activity Recognition Using Wearable Sensors. 
In Proceedings of the ACM International Conference on Ubiquitous Computing (UbiComp) Workshop on Situation, Activity and Goal Awareness (SAGAware), Pittsburgh, PA, USA, 5–8 September 2012.Logan, B.; Healey, B.J.; Philipose, J.M.; Tapia, E.M.; Intille, S. A long-term evaluation of sensing modalities for activity recognition. In Proceedings of the International Conference on Ubiquitous Computing, Taipei, Taiwan, 17–20 December 2007; Springer: Berlin/Heidelberg, Germany, 2007; pp. 483–500Nugent, C.D.; Mulvenna, M.D.; Hong, X.; Devlin, S. Experiences in the development of a Smart Lab. Int. J. Biomed. Eng. Technol. 2009, 2, 319–331.Schmitter-Edgecombe, M.; Cook, D.J. Assessing the Quality of Activities in a Smart Environment. Methods Inf. Med. 2009, 48, 480–485.Reiss, A.; Stricker, D. Introducing a New Benchmarked Dataset for Activity Monitoring. In Proceedings of the 16th IEEE International Symposium on Wearable Computers (ISWC), Newcastle, UK, 18–22 June 2012.Banos, O.; Garcia, R.; Holgado, J.A.; Damas, M.; Pomares, H.; Rojas, I.; Saez, A.; Villalonga, C. mHealthDroid: A novel framework for agile development of mobile health applications. In Proceedings of the 6th International Work-conference on Ambient Assisted Living an Active Ageing (IWAAL 2014), Belfast, UK, 2–5 December 2014.Barshan, B.; Yüksek, M.C. Recognizing daily and sports activities in two open source machine learning environments using body-worn sensor units. Comput. J. 2014, 57, 1649–1667Espinilla, M.; Martínez, L.; Medina, J.; Nugent, C. The experience of developing theUJAmI Smart lab. IEEE Access. 2018, 6, 34631–34642.Y. Chen and C. Shen, “Performance Analysis of Smartphone-Sensor Behavior for Human Activity Recognition,” IEEE Access, vol. 5, pp. 3095–3110, 2017, DOI: 10.1109/ACCESS.2017.2676168.C. A. Ronao and S. B. Cho, “Human activity recognition with smartphone sensors using deep learning neural networks,” Expert Systems with Applications, vol. 59, pp. 235–244, Oct. 
2016, DOI: 10.1016/j.eswa.2016.04.032.N. A. Capela, E. D. Lemaire, and N. Baddour, “Feature selection for wearable smartphone-based human activity recog-nition with able bodied, elderly, and stroke patients,” PLoS ONE, vol. 10, no. 4, p. e0124414, Apr. 2015, DOI: 10.1371/journal.pone.0124414.V. N. Gudivada, J. Ding, and A. Apon, “Data Quality Considerations for Big Data and Machine Learning: Going Be-yond Data Cleaning and Transformations Flow Cytometry of 3-D structure View project Data Quality Considerations for Big Data and Machine Learning: Going Beyond Data Cleaning and Transf,” no. October, pp. 1–20, 2017, Accessed: Sep. 11, 2022. [Online]. Available: https://www.researchgate.net/publication/318432363Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE transactions on information theory, 13(1), 21-27.Quinlan, J. R. (1986). Induction of decision trees. Machine learning, 1, 81-106.Kohavi, R., & Quinlan, J. R. (2002). Data mining tasks and methods: Classification: decision-tree discovery. In Handbook of data mining and knowledge discovery (pp. 267-276).Vapnik, V. (1998). The support vector method of function estimation. Nonlinear modeling: Advanced black-box techniques, 55-85.Castelli, V. 1994. The relative value of labeled and unlabeled samples in Pattern Recognition. Ph. D. Dissertation Stanford University.Gabrys, B., & Petrakieva, L. (2004). Combining labelled and unlabelled data in the design of pattern classification systems. International journal of approximate reasoning, 35(3), 251-273.Triguero, I., García, S., & Herrera, F. (2015). Self-labeled techniques for semi-supervised learning: taxonomy, software and empirical study. Knowledge and Information systems, 42, 245-284.Tasmin, M.; Ishtiak, T.; Ruman, S.U.; Suhan, A.U.R.C.; Islam, N.S.; Jahan, S.; Rahman, R.M. Comparative Study of Classifiers on Human Activity Recognition by Different Feature Engineering Techniques. 
In Proceedings of the 2020 IEEE 10th International Conference on Intelligent Systems (IS), Varna, Bulgaria, 28–30 August 2020; pp. 93–101.
Bozkurt, F. A Comparative Study on Classifying Human Activities Using Classical Machine and Deep Learning Methods. Arab. J. Sci. Eng. 2021, 47, 1507–1521.
Subasi, A.; Radhwan, M.; Kurdi, R.; Khateeb, K. IoT based mobile healthcare system for human activity recognition. In Proceedings of the 2018 15th Learning and Technology Conference (L&T), Jeddah, Saudi Arabia, 25–26 February 2018; pp. 29–34.
Maswadi, K.; Ghani, N.A.; Hamid, S.; Rasheed, M.B. Human activity classification using Decision Tree and Naïve Bayes classifiers. Multimed. Tools Appl. 2021, 80, 21709–21726.
Wang, A.; Zhao, S.; Zheng, C.; Chen, H.; Liu, L.; Chen, G. HierHAR: Sensor-Based Data-Driven Hierarchical Human Activity Recognition. IEEE Sens. J. 2020, 21, 3353–3365.
Demrozi, F.; Turetta, C.; Pravadelli, G. B-HAR: An open-source baseline framework for in-depth study of human activity recognition datasets and workflows. arXiv 2021, arXiv:2101.10870.
Xu, Z.; Wang, G.; Guo, X. Sensor-based activity recognition of solitary elderly via stigmergy and two-layer framework. Eng. Appl. Artif. Intell. 2020, 95, 10385.
Hussain, F.; Hussain, F.; Ehatisham-ul-Haq, M.; Azam, M.A. Activity-aware fall detection and recognition based on wearable sensors. IEEE Sens. J. 2019, 19, 4528–4536.
Demrozi, F.; Pravadelli, G.; Bihorac, A.; Rashidi, P. Human Activity Recognition Using Inertial, Physiological and Environmental Sensors: A Comprehensive Survey. IEEE Access 2020, 8, 210816–210836. https://doi.org/10.1109/ACCESS.2020.3037715
Liciotti, D.; Bernardini, M.; Romeo, L.; Frontoni, E. A sequential deep learning application for recognising human activities in smart homes. Neurocomputing 2019, 396, 501–513.
Igwe, O.M.; Wang, Y.; Giakos, G.C.; Fu, J. Human activity recognition in smart environments employing margin setting algorithm. J. Ambient Intell. Humaniz. Comput. 2020, 1–13. https://doi.org/10.1007/s12652-020-02229-y
Oukrich, N. Daily Human Activity Recognition in Smart Home Based on Feature Selection, Neural Network and Load Signature of Appliances. Ph.D. Thesis, Ecole Mohammadia d'Ingénieurs, Université Mohammed V de Rabat, Rabat, Morocco, 2019.
Damodaran, N.; Haruni, E.; Kokhkharova, M.; Schäfer, J. Device free human activity and fall recognition using WiFi channel state information (CSI). CCF Trans. Pervasive Comput. Interact. 2020, 2, 1–17.
Franco, P.; Martínez, J.M.; Kim, Y.C.; Ahmed, M.A. IoT based approach for load monitoring and activity recognition in smart homes. IEEE Access 2021, 9, 45325–45339.
Bota, P.; Silva, J.; Folgado, D.; Gamboa, H. A Semi-Automatic Annotation Approach for Human Activity Recognition. Sensors 2019, 19, 501.
Mohmed, G.; Lotfi, A.; Langensiepen, C.; Pourabdollah, A. Clustering-based fuzzy finite state machine for human activity recognition. In UK Workshop on Computational Intelligence; Springer: Cham, Switzerland, 2018; pp. 264–275.
Brena, R.F.; Garcia-Ceja, E. A crowdsourcing approach for personalization in human activities recognition. Intell. Data Anal. 2017, 21, 721–738.
Wang, X.; Lu, Y.; Wang, D.; Liu, L.; Zhou, H. Using Jaccard distance measure for unsupervised activity recognition with smartphone accelerometers. In Proceedings of the Asia-Pacific Web (APWeb) and Web-Age Information Management (WAIM) Joint Conference on Web and Big Data, Beijing, China, 7–9 July 2017; Springer: Cham, Switzerland, 2017; pp. 74–83.
Zhang, M.; Sawchuk, A.A. USC-HAD: A Daily Activity Dataset for Ubiquitous Activity Recognition Using Wearable Sensors. In Proceedings of the ACM International Conference on Ubiquitous Computing (UbiComp) Workshop on Situation, Activity and Goal Awareness (SAGAware), Pittsburgh, PA, USA, 5–8 September 2012.
Crandall, A.S.; Cook, D.J. Behaviometrics for Identifying Smart Home Residents; 2013; pp. 55–71. https://doi.org/10.2991/978-94-6239-018-8_4
Center for Advanced Studies in Adaptive Systems (CASAS). CASAS Datasets. https://casas.wsu.edu/datasets/
Jain, A.; Duin, R.; Mao, J. Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 4–37.
Zhang, M.-L.; Zhou, Z.-H. Solving multi-instance problems with classifier ensemble based on constructive clustering. https://doi.org/10.1007/s10115-006-0029-3
Nguyen, A.; Moore, D.; McCowan, I. Unsupervised Clustering of Free-Living Human Activities using Ambulatory Accelerometry. IEEE, 2007.