Linear methods of dimension reduction for classification

For classification problems, traditional dimension reduction methods often take into account only the feature information, while ignoring the class label. This poses an opportunity for improvement. In this thesis, we explore new methods that aim to find linear orthogonal projections that maximize op...

Full description

Authors:
Ramírez Garrido, Diego Alejandro
Resource type:
Undergraduate thesis
Publication date:
2024
Institution:
Universidad de los Andes
Repository:
Séneca: repositorio Uniandes
Language:
eng
OAI Identifier:
oai:repositorio.uniandes.edu.co:1992/75195
Online access:
https://hdl.handle.net/1992/75195
Keywords:
Dimension Reduction
Dimensionality Reduction
Wasserstein Distance
Sinkhorn Divergence
Subgradient Descent
Binary Classification
Optimal Transport
Matemáticas
Rights
openAccess
License
Attribution 4.0 International
id UNIANDES2_fefd20b46ce04db5120c32600eeae623
oai_identifier_str oai:repositorio.uniandes.edu.co:1992/75195
network_acronym_str UNIANDES2
network_name_str Séneca: repositorio Uniandes
repository_id_str
dc.title.eng.fl_str_mv Linear methods of dimension reduction for classification
title Linear methods of dimension reduction for classification
spellingShingle Linear methods of dimension reduction for classification
Dimension Reduction
Dimensionality Reduction
Wasserstein Distance
Sinkhorn Divergence
Subgradient Descent
Binary Classification
Optimal Transport
Matemáticas
title_short Linear methods of dimension reduction for classification
title_full Linear methods of dimension reduction for classification
title_fullStr Linear methods of dimension reduction for classification
title_full_unstemmed Linear methods of dimension reduction for classification
title_sort Linear methods of dimension reduction for classification
dc.creator.fl_str_mv Ramírez Garrido, Diego Alejandro
dc.contributor.advisor.none.fl_str_mv Quiroz Salazar, Adolfo José
dc.contributor.author.none.fl_str_mv Ramírez Garrido, Diego Alejandro
dc.contributor.jury.none.fl_str_mv Junca Peláez, Mauricio José
dc.subject.keyword.eng.fl_str_mv Dimension Reduction
Dimensionality Reduction
Wasserstein Distance
Sinkhorn Divergence
Subgradient Descent
Binary Classification
Optimal Transport
topic Dimension Reduction
Dimensionality Reduction
Wasserstein Distance
Sinkhorn Divergence
Subgradient Descent
Binary Classification
Optimal Transport
Matemáticas
dc.subject.themes.none.fl_str_mv Matemáticas
description For classification problems, traditional dimension reduction methods often take into account only the feature information, while ignoring the class label. This poses an opportunity for improvement. In this thesis, we explore new methods that aim to find linear orthogonal projections that maximize optimal transport distances (Wasserstein and Sinkhorn) between the feature subsamples corresponding to the categories. These methods employ subgradient ascent and stochastic subgradient ascent algorithms. We detail the calculation of the subgradient of these distances with respect to the projection and implement these methods in Python. To validate our approach, we test these methods on various datasets. Our results demonstrate that the proposed methods effectively enhance classification performance by incorporating class information into the dimension reduction process.
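To make the idea in the abstract concrete, below is a minimal, hypothetical Python sketch, reduced to a single projection direction and the squared 2-Wasserstein distance in one dimension (where the optimal coupling is the monotone, sorted matching). The function names, parameters, and synthetic data are illustrative only and are not taken from the thesis code; a full implementation would handle k-dimensional orthogonal projections, the Sinkhorn divergence, and stochastic subsampling.

```python
import numpy as np

def projected_w2_sq_and_subgrad(w, A, B):
    """Squared 2-Wasserstein distance between the 1-D projections of the two
    class subsamples A and B onto the direction w, plus a subgradient in w.
    Assumes A and B have the same number of rows, so the optimal 1-D coupling
    is the monotone (sorted-order) matching."""
    pa, pb = A @ w, B @ w
    ia, ib = np.argsort(pa), np.argsort(pb)      # sorted matching = optimal coupling in 1-D
    diff = pa[ia] - pb[ib]                       # matched projected differences
    dist = np.mean(diff ** 2)
    # Holding the optimal matching fixed while differentiating gives a subgradient in w.
    grad = 2.0 * np.mean(diff[:, None] * (A[ia] - B[ib]), axis=0)
    return dist, grad

def subgradient_ascent_direction(A, B, n_iter=500, step=0.1, seed=0):
    """Search for a unit direction w maximizing the projected squared
    2-Wasserstein distance between the two classes, using normalized
    subgradient ascent with a diminishing step size."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=A.shape[1])
    w /= np.linalg.norm(w)
    for t in range(1, n_iter + 1):
        _, g = projected_w2_sq_and_subgrad(w, A, B)
        w = w + (step / np.sqrt(t)) * g          # subgradient ascent step
        w /= np.linalg.norm(w)                   # project back onto the unit sphere
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(loc=[0.0, 0.0, 0.0, 0.0], size=(200, 4))  # synthetic class 0
    B = rng.normal(loc=[2.0, 0.0, 0.0, 0.0], size=(200, 4))  # synthetic class 1, shifted along axis 0
    w = subgradient_ascent_direction(A, B)
    d, _ = projected_w2_sq_and_subgrad(w, A, B)
    print("learned direction:", np.round(w, 3), "projected W2^2:", round(d, 3))
```

Fixing the optimal matching before differentiating is what makes the update a subgradient step rather than a gradient step, which is why the abstract (and the Clarke references listed below) speak of subgradients of the transport distance with respect to the projection.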
publishDate 2024
dc.date.accessioned.none.fl_str_mv 2024-10-30T21:55:29Z
dc.date.available.none.fl_str_mv 2024-10-30T21:55:29Z
dc.date.issued.none.fl_str_mv 2024-10-30
dc.type.none.fl_str_mv Trabajo de grado - Pregrado
dc.type.driver.none.fl_str_mv info:eu-repo/semantics/bachelorThesis
dc.type.version.none.fl_str_mv info:eu-repo/semantics/acceptedVersion
dc.type.coar.none.fl_str_mv http://purl.org/coar/resource_type/c_7a1f
dc.type.content.none.fl_str_mv Text
dc.type.redcol.none.fl_str_mv http://purl.org/redcol/resource_type/TP
format http://purl.org/coar/resource_type/c_7a1f
status_str acceptedVersion
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/1992/75195
dc.identifier.instname.none.fl_str_mv instname:Universidad de los Andes
dc.identifier.reponame.none.fl_str_mv reponame:Repositorio Institucional Séneca
dc.identifier.repourl.none.fl_str_mv repourl:https://repositorio.uniandes.edu.co/
url https://hdl.handle.net/1992/75195
identifier_str_mv instname:Universidad de los Andes
reponame:Repositorio Institucional Séneca
repourl:https://repositorio.uniandes.edu.co/
dc.language.iso.none.fl_str_mv eng
language eng
dc.relation.references.none.fl_str_mv Boyd, Stephen, and Lieven Vandenberghe. *Convex Optimization.* Cambridge University Press, 2004.
Clarke, Frank H. “Generalized Gradients and Applications.” *Transactions of the American Mathematical Society*, vol. 205, 1975, pp. 247–262. https://doi.org/10.1090/s0002-9947-1975-0367131-6. Accessed 14 Jan. 2021.
Clarke, Frank H. *Optimization and Nonsmooth Analysis.* Wiley-Interscience, 1983. https://doi.org/10.1137/1.9781611971309
Cuturi, Marco. “Sinkhorn Distances: Lightspeed Computation of Optimal Transport.” *Advances in Neural Information Processing Systems*, vol. 26, 2013, pp. 2292–2300. https://doi.org/10.48550/arXiv.1306.0895
Devroye, Luc, et al. *A Probabilistic Theory of Pattern Recognition.* Springer Science & Business Media, 2013.
Munkres, James. “Algorithms for the Assignment and Transportation Problems.” *Journal of the Society for Industrial and Applied Mathematics*, vol. 5, no. 1, Mar. 1957, pp. 32–38. https://doi.org/10.1137/0105003. Accessed 26 July 2020.
Peyré, Gabriel, and Marco Cuturi. *Computational Optimal Transport.* Foundations and Trends in Machine Learning, 2019.
Sinkhorn, Richard, and Paul Knopp. “Concerning Nonnegative Matrices and Doubly Stochastic Matrices.” *Pacific Journal of Mathematics*, vol. 21, no. 2, 1967, pp. 343–348. https://doi.org/10.2140/pjm.1967.21.343. Accessed 30 July 2022.
Vanderbei, Robert J. *Linear Programming: Foundations and Extensions.* Springer, 2021.
Villani, Cédric. *Optimal Transport: Old and New.* Springer, 2009.
Papailiopoulos, D. *ECE 901: Large-scale Machine Learning and Optimization.* Lecture 9. Scribed by Guangtong Bai & Yuan-Ting Hsieh, Spring 2018.
Janosi, Andras, Steinbrunn, William, Pfisterer, Matthias, and Detrano, Robert. *Heart Disease.* UCI Machine Learning Repository, 1988. https://doi.org/10.24432/C52P4X
Kahn, Michael. *Diabetes.* UCI Machine Learning Repository. https://doi.org/10.24432/C5T59G
Hotelling, H. “Analysis of a Complex of Statistical Variables into Principal Components.” *Journal of Educational Psychology*, vol. 24, no. 6, 1933, pp. 417–441. https://doi.org/10.1037/h0071325
Stein, Elias M. *Singular Integrals and Differentiability Properties of Functions.* Princeton University Press, 1970.
Bottou, Léon. "Large-Scale Machine Learning with Stochastic Gradient Descent." In *Proceedings of COMPSTAT'2010*, edited by Yves Lechevallier and Gilbert Saporta, Physica-Verlag HD, 2010, pp. 177–186. https://doi.org/10.1007/978-3-7908-2604-3_16
Polyak, Boris T. "Some Methods of Speeding up the Convergence of Iteration Methods." *USSR Computational Mathematics and Mathematical Physics*, vol. 4, no. 5, 1964, pp. 1–17. https://doi.org/10.1016/0041-5553(64)90137-5
Hinton, Geoffrey. "Lecture 6e rmsprop: Divide the Gradient by a Running Average of Its Recent Magnitude." *Coursera Lecture Notes*, 2012.
Tieleman, Tijmen, and Geoffrey Hinton. "Lecture 6.5 - RMSProp: Divide the Gradient by a Running Average of Its Recent Magnitude." *COURSERA: Neural Networks for Machine Learning*, University of Toronto, 2012.
Kingma, Diederik P., and Jimmy Ba. "Adam: A Method for Stochastic Optimization." In *Proceedings of the 3rd International Conference on Learning Representations (ICLR)*, 2015. https://doi.org/10.48550/arXiv.1412.6980
Volgenant, Ton, and R. Jonker. "A Branch and Bound Algorithm for the Symmetric Traveling Salesman Problem Based on the 1-tree Relaxation." *European Journal of Operational Research*, vol. 6, no. 4, 1981, pp. 447–458. https://doi.org/10.1016/0377-2217(81)90100-0
Edmonds, Jack, and Richard M. Karp. "Theoretical Improvements in Algorithmic Efficiency for Network Flow Problems." *Journal of the ACM*, vol. 19, no. 2, 1972, pp. 248–264. https://doi.org/10.1145/321694.321699
Shannon, Claude E. "A Mathematical Theory of Communication." *Bell System Technical Journal*, vol. 27, no. 3, 1948, pp. 379–423. https://doi.org/10.1002/j.1538-7305.1948.tb01338.x
dc.rights.en.fl_str_mv Attribution 4.0 International
dc.rights.uri.none.fl_str_mv http://creativecommons.org/licenses/by/4.0/
dc.rights.accessrights.none.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.coar.none.fl_str_mv http://purl.org/coar/access_right/c_abf2
rights_invalid_str_mv Attribution 4.0 International
http://creativecommons.org/licenses/by/4.0/
http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv openAccess
dc.format.extent.none.fl_str_mv 54 páginas
dc.format.mimetype.none.fl_str_mv application/pdf
dc.publisher.none.fl_str_mv Universidad de los Andes
dc.publisher.program.none.fl_str_mv Matemáticas
dc.publisher.faculty.none.fl_str_mv Facultad de Ciencias
dc.publisher.department.none.fl_str_mv Departamento de Matemáticas
publisher.none.fl_str_mv Universidad de los Andes
institution Universidad de los Andes
bitstream.url.fl_str_mv https://repositorio.uniandes.edu.co/bitstreams/6b84d5c6-56cd-4eb9-b3cf-8024e83e62fb/download
https://repositorio.uniandes.edu.co/bitstreams/6711caf2-5d86-4d68-a2e8-01136cc34e33/download
https://repositorio.uniandes.edu.co/bitstreams/136472ce-4188-428e-9e74-9711bcd2666c/download
https://repositorio.uniandes.edu.co/bitstreams/0a11f59d-83c5-499e-b1c8-f473adf27252/download
https://repositorio.uniandes.edu.co/bitstreams/4b635b18-fdb2-46fc-9f13-4ef6bde21fc7/download
https://repositorio.uniandes.edu.co/bitstreams/44ab51ee-b5b5-4a80-b105-f38f73c89f92/download
https://repositorio.uniandes.edu.co/bitstreams/99ffe3bb-a03e-448a-89ef-515c05eb8ffc/download
https://repositorio.uniandes.edu.co/bitstreams/ed6f09db-2ac8-4012-811d-7b7a0a7131eb/download
bitstream.checksum.fl_str_mv 0175ea4a2d4caec4bbcc37e300941108
ae9e573a68e7f92501b6913cc846c39f
e16bbe0f3f60ef09d2f08436b10e3c82
f31c9fcfa9ab4280de1d80371f63e986
30a21ea615d12756d53f82d9b82a658b
13c2e06b42c89281584508f3b9219df6
e7df4079db8e97692aea301bd02fce9b
f22df99a365ea68942bb39442996fc09
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
MD5
MD5
MD5
MD5
MD5
repository.name.fl_str_mv Repositorio institucional Séneca
repository.mail.fl_str_mv adminrepositorio@uniandes.edu.co
_version_ 1818112052178190336