Decoding as a linear ill-posed problem: The entropy minimization approach

The problem of decoding can be thought of as solving an ill-posed, linear inverse problem with noisy data and box constraints on the unknowns. Specifically, we aimed to solve $\mathbf{A}\mathbf{x}+\mathbf{e}=\mathbf{y},$ where $\mathbf{A}$ is a matrix with positive entries and $\mathbf{y}$ is a vector with positive entries....

Full description

Authors:
Gauthier-Umaña, Valérie
Gzyl, Henryk
ter Horst, Enrique
Resource type:
Journal article
Publication date:
2025
Institution:
Universidad de los Andes
Repository:
Séneca: repositorio Uniandes
Language:
eng
OAI Identifier:
oai:repositorio.uniandes.edu.co:1992/76128
Online access:
https://hdl.handle.net/1992/76128
https://doi.org/10.3934/math.2025192
Keywords:
ill-posed inverse problems
decoding as inverse problem
convex optimization
Gaussian random variables
Ingeniería (Engineering)
Rights
openAccess
License
http://purl.org/coar/access_right/c_abf2
dc.contributor.researchgroup.none.fl_str_mv Facultad de Ingeniería::TICSw: Tecnologías de Información y Construcción de Software
description The problem of decoding can be thought of as solving an ill-posed, linear inverse problem with noisy data and box constraints on the unknowns. Specifically, we aimed to solve $\mathbf{A}\mathbf{x}+\mathbf{e}=\mathbf{y},$ where $\mathbf{A}$ is a matrix with positive entries and $\mathbf{y}$ is a vector with positive entries. It is required that $\mathbf{x}\in\mathcal{K}$, a set specified below, and we considered two points of view about the noise term, which in both cases is an unknown to be determined. On the one hand, the error can be thought of as a confounding error, intentionally added to the coded message. On the other hand, we may think of it as a true additive transmission-measurement error. We solved the problem by minimizing an entropy of the Fermi-Dirac type defined on the constraint set of the problem. Our approach provided a consistent way to recover both the message and the noise from the measurements. In an example with a generator code matrix of the Reed-Solomon type, we examined the two points of view about the noise. As our approach enabled us to recursively decrease the $\ell_1$ norm of the noise as part of the solution procedure, we saw that, if the required norm of the noise was too small, the message was not well recovered. Our work falls within the general line of work on near-optimal signal recovery. We also studied the case of Gaussian random matrices.
publishDate 2025
dc.date.accessioned.none.fl_str_mv 2025-03-20T12:54:55Z
dc.date.available.none.fl_str_mv 2025-03-20T12:54:55Z
dc.date.issued.none.fl_str_mv 2025-02-27
dc.type.none.fl_str_mv Journal article
dc.type.coar.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.coarversion.fl_str_mv http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.driver.none.fl_str_mv info:eu-repo/semantics/article
dc.type.coar.none.fl_str_mv http://purl.org/coar/resource_type/c_6501
dc.type.content.none.fl_str_mv Text
dc.type.redcol.none.fl_str_mv http://purl.org/redcol/resource_type/ART
dc.identifier.issn.none.fl_str_mv 2473-6988
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/1992/76128
dc.identifier.doi.none.fl_str_mv https://doi.org/10.3934/math.2025192
dc.identifier.instname.none.fl_str_mv instname:Universidad de los Andes
dc.identifier.reponame.none.fl_str_mv reponame:Repositorio Institucional Séneca
dc.identifier.repourl.none.fl_str_mv repourl:https://repositorio.uniandes.edu.co/
dc.language.iso.none.fl_str_mv eng
dc.relation.citationendpage.none.fl_str_mv 4152
dc.relation.citationissue.none.fl_str_mv 4
dc.relation.citationstartpage.none.fl_str_mv 4139
dc.relation.citationvolume.none.fl_str_mv 10
dc.relation.ispartofjournal.none.fl_str_mv AIMS Mathematics
dc.rights.accessrights.none.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.coar.none.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.format.extent.none.fl_str_mv 14 pages
dc.format.mimetype.none.fl_str_mv application/pdf
dc.publisher.none.fl_str_mv Universidad de los Andes
dc.publisher.faculty.none.fl_str_mv Facultad de Ingeniería
dc.publisher.department.none.fl_str_mv Departamento de Ingeniería de Sistemas