Skin color correction via convolutional neural networks in 3D fringe projection profilometry

Fringe Projection Profilometry (FPP) with Digital Light Projector technology is one of the most reliable 3D sensing techniques for biomedical applications. However, besides the fringe pattern images, a color texture image is often needed for accurate medical documentation. This image may be acquired either by projecting a white image or by projecting a black image and relying on ambient light. Color constancy is essential for a faithful digital record, although the optical properties of biological tissue make color reproducibility challenging. Furthermore, color perception is highly dependent on the illuminant. Here, we describe a deep learning-based method for skin color correction in FPP. We trained a convolutional neural network using a skin tone color palette acquired under different illumination conditions to learn the mapping relationship between the input color image and its counterpart in the sRGB color space. Preliminary experimental results demonstrate the potential of this approach.

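As a rough illustration of the approach described in the abstract, the following sketch (Python with PyTorch; not the authors' implementation) trains a small fully convolutional network with a pixel-wise loss to map image patches captured under an arbitrary illuminant to their sRGB reference values. The layer widths, patch size, optimizer, and placeholder training batch are assumptions made purely for illustration; in practice the patches would come from the skin tone color palette photographed under several illuminants, paired with sRGB reference colors.

import torch
import torch.nn as nn

class ColorCorrectionCNN(nn.Module):
    """Small fully convolutional network: RGB input -> corrected sRGB output (illustrative)."""
    def __init__(self, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, width, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(width, 3, kernel_size=1),  # per-pixel color regression
            nn.Sigmoid(),                        # keep predictions in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

def train_step(model, optimizer, loss_fn, batch_in, batch_ref):
    """One optimization step on a batch of (captured patch, sRGB reference patch) pairs."""
    optimizer.zero_grad()
    loss = loss_fn(model(batch_in), batch_ref)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = ColorCorrectionCNN()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # optimizer choice is an assumption
    loss_fn = nn.MSELoss()                                      # pixel-wise loss, also an assumption
    # Placeholder data: 8 random 64x64 RGB patches in [0, 1]. Real training data would pair
    # palette patches under different illuminants (inputs) with their sRGB references (targets).
    captured = torch.rand(8, 3, 64, 64)
    reference = torch.rand(8, 3, 64, 64)
    print("training loss:", train_step(model, optimizer, loss_fn, captured, reference))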

Authors:
Barrios, Erik
Pineda, Jesus
Romero, Lenny A.
Millán, María S.
Marrugo, Andrés G.
Resource type:
Article
Publication date:
2021-09-02
Institution:
Universidad Tecnológica de Bolívar
Repository:
Repositorio Institucional UTB
Language:
eng
OAI Identifier:
oai:repositorio.utb.edu.co:20.500.12585/12114
Online access:
https://hdl.handle.net/20.500.12585/12114
Keywords:
Color constancy
Convolutional neural network
Image color processing
Machine learning
Skin color correction
Rights:
openAccess
License:
http://creativecommons.org/licenses/by-nc-nd/4.0/
Citation:
Barrios, E., Pineda, J., Romero, L.A., Millán, M.S., Marrugo, A.G. Skin color correction via convolutional neural networks in 3D fringe projection profilometry (2021) Proceedings of SPIE - The International Society for Optical Engineering, 11804, art. no. 118041P. DOI: 10.1117/12.2594331
DOI:
10.1117/12.2594331
Format:
application/pdf
Place of publication:
Cartagena de Indias
Source:
Proceedings of SPIE - The International Society for Optical Engineering - Vol. 11804 (2021)