Natural language explanation model for decision trees

This study describes a natural-language explanation model for classification decision trees. The explanations cover both global aspects of the classifier and local aspects of the classification of a particular instance. The proposal is implemented in the open-source ExpliClas Web service [1], which in its current version operates on trees built with Weka and on data sets with numerical attributes. The feasibility of the proposal is illustrated with two example cases, for which the detailed explanations of the respective classification trees are shown.
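The ExpliClas service described here generates such explanations for trees built with Weka. As a rough illustration of the local-explanation idea only (not the authors' implementation), the sketch below walks the decision path of a scikit-learn tree trained on the Iris data set and verbalizes each tested split; the function name explain_instance and the sentence templates are hypothetical choices made for this example.

# Illustrative sketch, not ExpliClas: verbalize one instance's decision path
# with scikit-learn (the paper's service operates on Weka trees instead).
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(iris.data, iris.target)

def explain_instance(clf, x, feature_names, class_names):
    # Local explanation: turn the root-to-leaf path followed by x into sentences.
    tree = clf.tree_
    path = clf.decision_path([x]).indices          # node ids visited, root to leaf
    clauses = []
    for node in path:
        if tree.children_left[node] == tree.children_right[node]:   # leaf reached
            label = class_names[tree.value[node].argmax()]
            clauses.append(f"so the instance is classified as '{label}'")
        else:
            feat, thr = tree.feature[node], tree.threshold[node]
            op = "<=" if x[feat] <= thr else ">"
            clauses.append(f"{feature_names[feat]} is {op} {thr:.2f}")
    return "Because " + ", and ".join(clauses[:-1]) + ", " + clauses[-1] + "."

print(explain_instance(clf, iris.data[0], iris.feature_names, iris.target_names))

A global explanation would, by analogy, verbalize every branch of the fitted tree rather than only the path taken by a single instance.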

Full description

Authors:
Silva, Jesús
H, H
Núñez, Vladimir
Ruiz Lázaro, Alex
Varela Izquierdo, Noel
Resource type:
Journal article
Publication date:
2020
Institution:
Corporación Universidad de la Costa
Repository:
REDICUC - Repositorio CUC
Language:
eng
OAI Identifier:
oai:repositorio.cuc.edu.co:11323/6212
Online access:
https://hdl.handle.net/11323/6212
https://repositorio.cuc.edu.co/
Keywords:
Modelo de explicaciones
Árboles de decisión
Código abierto ExpliClas
Explanation model
Decision trees
Open source ExpliClas
Rights
openAccess
License
CC0 1.0 Universal
dc.title.spa.fl_str_mv Natural language explanation model for decision trees
dc.contributor.author.spa.fl_str_mv Silva, Jesús
H, H
Núñez, Vladimir
Ruiz Lázaro, Alex
Varela Izquierdo, Noel
dc.subject.spa.fl_str_mv Modelo de explicaciones
Árboles de decisión
Código abierto ExpliClas
Explanation model
Decision trees
Open source ExpliClas
description This study describes a natural-language explanation model for classification decision trees. The explanations cover both global aspects of the classifier and local aspects of the classification of a particular instance. The proposal is implemented in the open-source ExpliClas Web service [1], which in its current version operates on trees built with Weka and on data sets with numerical attributes. The feasibility of the proposal is illustrated with two example cases, for which the detailed explanations of the respective classification trees are shown.
publishDate 2020
dc.date.accessioned.none.fl_str_mv 2020-04-17T00:16:16Z
dc.date.available.none.fl_str_mv 2020-04-17T00:16:16Z
dc.date.issued.none.fl_str_mv 2020-02-01
dc.type.spa.fl_str_mv Artículo de revista
dc.type.coar.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.coar.spa.fl_str_mv http://purl.org/coar/resource_type/c_6501
dc.type.content.spa.fl_str_mv Text
dc.type.driver.spa.fl_str_mv info:eu-repo/semantics/article
dc.type.redcol.spa.fl_str_mv http://purl.org/redcol/resource_type/ART
dc.type.version.spa.fl_str_mv info:eu-repo/semantics/acceptedVersion
format http://purl.org/coar/resource_type/c_6501
status_str acceptedVersion
dc.identifier.issn.spa.fl_str_mv 17426588
dc.identifier.uri.spa.fl_str_mv https://hdl.handle.net/11323/6212
dc.identifier.doi.spa.fl_str_mv 10.1088/1742-6596/1432/1/012074
dc.identifier.instname.spa.fl_str_mv Corporación Universidad de la Costa
dc.identifier.reponame.spa.fl_str_mv REDICUC - Repositorio CUC
dc.identifier.repourl.spa.fl_str_mv https://repositorio.cuc.edu.co/
dc.language.iso.none.fl_str_mv eng
language eng
dc.relation.references.spa.fl_str_mv [2] Jain, Mugdha, and Chakradhar Verma. "Adapting k-means for Clustering in Big Data." International Journal of Computer Applications 101.1 (2014): 19-24.
[3] S. Ramírez-Gallego, A. Fernandez, S. García, M. Chen, and F. Herrera, “Big data: Tutorial and guidelines on information and process fusion for analytics algorithms with MapReduce,” Information Fusion, vol. 42, pp. 51–61, 2018.
[4] M. Hamstra, H. Karau, M. Zaharia, A. Konwinski, and P. Wendell, Learning Spark: Lightning-Fast Big Data Analytics. O’Reilly Media, 2015.
[5] Lis-Gutiérrez JP., Gaitán-Angulo M., Henao L.C., Viloria A., Aguilera-Hernández D., Portillo-Medina R. (2018) Measures of Concentration and Stability: Two Pedagogical Tools for Industrial Organization Courses. In: Tan Y., Shi Y., Tang Q. (eds) Advances in Swarm Intelligence. ICSI 2018. Lecture Notes in Computer Science, vol 10942. Springer, Cham.
[6] J. Lin, “MapReduce is good enough? If all you have is a hammer, throw away everything that’s not a nail!” Big Data, vol. 1, no. 1, pp. 28–37, 2013.
[7] Viloria, A., & Gaitan-Angulo, M. (2016). Statistical Adjustment Module Advanced Optimizer Planner and SAP Generated the Case of a Food Production Company. Indian Journal Of Science And Technology, 9(47). doi:10.17485/ijst/2016/v9i47/107371.
[8] D. Garcia-Gil, S. Ramírez-Gallego, S. Garcia, and F. Herrera, “Principal Components Analysis Random Discretization Ensemble for Big Data,” Knowledge-Based Systems, vol. 150, pp. 166–174, 2018.
[9] N. Sapankevych and R. Sankar, “Time Series Prediction Using Support Vector Machines: A Survey,” IEEE Computational Intelligence Magazine, vol. 4, no. 2, pp. 24–38, May 2009.
[10] Viloria A., Lis-Gutiérrez JP., Gaitán-Angulo M., Godoy A.R.M., Moreno G.C., Kamatkar S.J. (2018) Methodology for the Design of a Student Pattern Recognition Tool to Facilitate the Teaching - Learning Process Through Knowledge Data Discovery (Big Data). In: Tan Y., Shi Y., Tang Q. (eds) Data Mining and Big Data. DMBD 2018. Lecture Notes in Computer Science, vol 10943. Springer, Cham.
[11] L. A. Hendricks, Z. Akata, M. Rohrbach, J. Donahue, B. Schiele, and T. Darrell, “Generating visual explanations,” in Proceedings of the European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 2016, pp. 3–19.
[12] A. Gatt and E. Krahmer, “Survey of the state of the art in natural language generation: Core tasks, applications and evaluation,” Journal of Artificial Intelligence Research, vol. 61, pp. 65–170, 2018.
[13] Ruß G. Data Mining of Agricultural Yield Data: A Comparison of Regression Models, In: Perner P. (eds) Advances in Data Mining. Applications and Theoretical Aspects, ICDM 2009. Lecture Notes in Computer Science, vol 5633.
[14] S. Barocas and D. Boyd, “Computing ethics. engaging the ethics of data science in practice,” Communications of the ACM, vol. 60, no. 11, pp. 23–25, 2017.
[15] Hernández, J. A., Burlak, G., Muñoz Arteaga, J., y Ochoa, A. (2006). Propuesta para la evaluación de objetos de aprendizaje desde una perspectiva integral usando minería de datos. En A. Hernández y J. Zechinelli (Eds.), Avances en la ciencia de la computación (pp. 382-387). México: Universidad Autónoma de México.
[16] S. Gang Wu, F. Sheng Bao, E. You Xu, Y.-X. Wang, Y.-F. Chang, and Q.-L. Xiang, “A leaf recognition algorithm for plant classification using probabilistic neural network,” in IEEE International Symposium on Signal Processing and Information Technology, 2007, pp. 1–6.
[17] I. H. Witten, E. Frank, M. A. Hall, and C. J. Pal, Data Mining: Practical Machine Learning Tools and Techniques, 4th ed. Morgan Kaufmann, 2016.
[18] D. Gunning, “Explainable Artificial Intelligence (XAI),” Defense Advanced Research Projects Agency (DARPA), Arlington, USA, Tech. Rep. DARPA-BAA-16-53, 2016.
[19] Scheffer, T. (2004). Finding Association Rules that Trade Support Optimally Against Confidence. Intelligent Data Analysis, 9(4), 381-395.
[20] J. M. Alonso, A. Ramos-Soto, E. Reiter, and K. van Deemter, “An exploratory study on the benefits of using natural language for explaining fuzzy rule-based systems,” in IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Naples, Italy, 2017, pp. 1–6, http://dx.doi.org/10.1109/FUZZ-IEEE.2017.8015489.
[21] M. Zaharia, M. Chowdhury, T. Das, A. Dave, J. Ma, M. McCauley, M. J. Franklin, S. Shenker, and I. Stoica, “Resilient distributed datasets: A fault-tolerant abstraction for in-memory cluster computing,” in Proceedings of the 9th USENIX Symposium on Networked Systems Design and Implementation (NSDI 12). San Jose, CA: USENIX, 2012, pp. 15–28.
[22] S. Verbaeten and A. Assche, “Ensemble methods for noise elimination in classification problems,” in 4th International Workshop on Multiple Classifier Systems, ser. Lecture Notes on Computer Science, vol. 2709. Springer, 2003, pp. 317–325.
dc.rights.spa.fl_str_mv CC0 1.0 Universal
dc.rights.uri.spa.fl_str_mv http://creativecommons.org/publicdomain/zero/1.0/
dc.rights.accessrights.spa.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.coar.spa.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.publisher.spa.fl_str_mv Journal of Physics: Conference Series
institution Corporación Universidad de la Costa
bitstream.url.fl_str_mv https://repositorio.cuc.edu.co/bitstream/11323/6212/1/Natural%20Language%20Explanation%20Model%20for%20Decision%20Trees.pdf
https://repositorio.cuc.edu.co/bitstream/11323/6212/5/Natural%20Language%20Explanation%20Model%20for%20Decision%20Trees.pdf
https://repositorio.cuc.edu.co/bitstream/11323/6212/2/license_rdf
https://repositorio.cuc.edu.co/bitstream/11323/6212/3/license.txt
https://repositorio.cuc.edu.co/bitstream/11323/6212/4/Natural%20Language%20Explanation%20Model%20for%20Decision%20Trees.pdf.jpg
https://repositorio.cuc.edu.co/bitstream/11323/6212/6/Natural%20Language%20Explanation%20Model%20for%20Decision%20Trees.pdf.txt
repository.name.fl_str_mv Repositorio Universidad de La Costa
repository.mail.fl_str_mv bdigital@metabiblioteca.com