Multi-target Attachment for Surgical Instrument Tracking

Pose estimation of a surgical instrument is a common problem arising from the new needs of medical science. Many instrument-tracking methods use markers of known geometry, which allows the instrument pose to be solved from camera detections. However, marker occlusion occurs and hinders correct pose estimation. In this work, we propose an adaptable multi-target attachment with ArUco markers to overcome occlusion when tracking a medical instrument such as an ultrasound probe or a scalpel. Our multi-target system allows for precise and redundant real-time pose estimation implemented in OpenCV. Encouraging results show that the multi-target device may prove useful in the clinical setting. © 2021, Springer Nature Switzerland AG.
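
The abstract mentions ArUco markers and real-time pose estimation implemented in OpenCV, but the record contains no code, so the following is only a minimal sketch of how a single ArUco marker can be detected and its pose recovered with OpenCV's `cv2.aruco` module (Python, OpenCV 4.7+ API). The camera intrinsics, distortion coefficients, marker side length, and input image are placeholder assumptions, not values from the paper.

```python
import cv2
import numpy as np

# Placeholder intrinsics -- in practice these come from camera calibration.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)      # assume negligible lens distortion
marker_length = 0.03           # assumed marker side length in meters

# 3D corners of a square marker centered at its own origin,
# in the top-left, top-right, bottom-right, bottom-left order ArUco uses.
half = marker_length / 2.0
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("frame.png")                    # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
corners, ids, _rejected = detector.detectMarkers(gray)

if ids is not None:
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        image_points = marker_corners.reshape(4, 2).astype(np.float32)
        # Recover the camera-from-marker pose from the four corners.
        ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                      camera_matrix, dist_coeffs,
                                      flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            print(f"marker {marker_id}: t = {tvec.ravel()} m")
```

Older OpenCV releases expose the same step through the now-deprecated `cv2.aruco.estimatePoseSingleMarkers`; either way the result is a rotation and translation of the marker in the camera frame.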

Authors:
Benjumea, Eberto
Sierra, Juan S
Meza, Jhacson
Marrugo, Andres G.
Resource type:
Publication date:
2022
Institution:
Universidad Tecnológica de Bolívar
Repository:
Repositorio Institucional UTB
Language:
eng
OAI Identifier:
oai:repositorio.utb.edu.co:20.500.12585/12397
Online access:
https://hdl.handle.net/20.500.12585/12397
Keywords:
Fiducial Markers;
Augmented Reality;
Mixed Reality
LEMB
Rights
openAccess
License
http://creativecommons.org/licenses/by-nc-nd/4.0/
id UTB2_e23fc69f80d482187ceee3297974321f
oai_identifier_str oai:repositorio.utb.edu.co:20.500.12585/12397
network_acronym_str UTB2
network_name_str Repositorio Institucional UTB
repository_id_str
dc.title.spa.fl_str_mv Multi-target Attachment for Surgical Instrument Tracking
dc.creator.fl_str_mv Benjumea, Eberto
Sierra, Juan S
Meza, Jhacson
Marrugo, Andres G.
dc.subject.keywords.spa.fl_str_mv Fiducial Markers;
Augmented Reality;
Mixed Reality
dc.subject.armarc.none.fl_str_mv LEMB
description Pose estimation of a surgical instrument is a common problem arising from the new needs of medical science. Many instrument-tracking methods use markers of known geometry, which allows the instrument pose to be solved from camera detections. However, marker occlusion occurs and hinders correct pose estimation. In this work, we propose an adaptable multi-target attachment with ArUco markers to overcome occlusion when tracking a medical instrument such as an ultrasound probe or a scalpel. Our multi-target system allows for precise and redundant real-time pose estimation implemented in OpenCV. Encouraging results show that the multi-target device may prove useful in the clinical setting. © 2021, Springer Nature Switzerland AG.
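
The description emphasizes that the multi-target attachment yields redundant pose estimates, so occluding some markers does not break tracking. The record does not include the authors' implementation; the sketch below only illustrates the general idea under assumed names: each marker on the attachment carries a known rigid transform to the instrument frame, and the instrument pose is recovered from whichever markers remain visible. The example transforms and the naive averaging at the end are placeholders, not the method from the paper.

```python
import cv2
import numpy as np

def camera_from_marker(rvec, tvec):
    """Build a 4x4 camera-from-marker transform from an OpenCV rvec/tvec pair."""
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvec)
    T[:3, 3] = np.asarray(tvec, dtype=float).ravel()
    return T

# Hypothetical attachment calibration: for each marker id, the rigid transform
# that maps instrument-frame coordinates into that marker's frame.
marker_from_tool = {
    7: np.eye(4),                               # marker 7 defines the tool frame
    11: np.array([[ 0.0, 0.0, 1.0, 0.02],       # marker 11 on another face,
                  [ 0.0, 1.0, 0.0, 0.00],       # rotated 90 deg and offset 2 cm
                  [-1.0, 0.0, 0.0, 0.01],
                  [ 0.0, 0.0, 0.0, 1.00]]),
}

def tool_pose(detections):
    """Estimate the camera-from-instrument transform from visible markers.

    `detections` maps marker id -> (rvec, tvec) from per-marker solvePnP.
    Occluded markers are simply absent; any single visible marker suffices.
    """
    candidates = [camera_from_marker(rvec, tvec) @ marker_from_tool[marker_id]
                  for marker_id, (rvec, tvec) in detections.items()
                  if marker_id in marker_from_tool]
    if not candidates:
        return None                             # every marker occluded this frame
    # Naive fusion placeholder: keep the first rotation, average translations.
    fused = candidates[0].copy()
    fused[:3, 3] = np.mean([T[:3, 3] for T in candidates], axis=0)
    return fused
```

A more careful fusion could weight candidates by reprojection error or number of visible corners; that choice is independent of the redundancy idea itself.
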
dc.date.issued.none.fl_str_mv 2022
dc.date.accessioned.none.fl_str_mv 2023-07-21T20:51:05Z
dc.date.available.none.fl_str_mv 2023-07-21T20:51:05Z
dc.date.submitted.none.fl_str_mv 2023
dc.type.coarversion.fl_str_mv http://purl.org/coar/version/c_b1a7d7d4d402bcce
dc.type.coar.fl_str_mv http://purl.org/coar/resource_type/c_2df8fbb1
dc.type.driver.spa.fl_str_mv info:eu-repo/semantics/article
dc.type.hasversion.spa.fl_str_mv info:eu-repo/semantics/draft
dc.type.spa.spa.fl_str_mv http://purl.org/coar/resource_type/c_6501
status_str draft
dc.identifier.citation.spa.fl_str_mv Benjumea, E., Sierra, J. S., Meza, J., & Marrugo, A. G. (2021, June). Multi-target Attachment for Surgical Instrument Tracking. In Pattern Recognition: 13th Mexican Conference, MCPR 2021, Mexico City, Mexico, June 23–26, 2021, Proceedings (Vol. 12725, p. 345). Springer Nature.
dc.identifier.uri.none.fl_str_mv https://hdl.handle.net/20.500.12585/12397
dc.identifier.doi.none.fl_str_mv 10.1007/978-3-030-77004-4_33
dc.identifier.instname.spa.fl_str_mv Universidad Tecnológica de Bolívar
dc.identifier.reponame.spa.fl_str_mv Repositorio Universidad Tecnológica de Bolívar
dc.language.iso.spa.fl_str_mv eng
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.uri.*.fl_str_mv http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.spa.fl_str_mv info:eu-repo/semantics/openAccess
dc.rights.cc.*.fl_str_mv Attribution-NonCommercial-NoDerivatives 4.0 Internacional
dc.format.mimetype.spa.fl_str_mv application/pdf
dc.publisher.place.spa.fl_str_mv Cartagena de Indias
dc.source.spa.fl_str_mv Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
bitstream.url.fl_str_mv https://repositorio.utb.edu.co/bitstream/20.500.12585/12397/1/509624_1_En_33_Andres%20Guillermo%20Mar.pdf
https://repositorio.utb.edu.co/bitstream/20.500.12585/12397/2/license_rdf
https://repositorio.utb.edu.co/bitstream/20.500.12585/12397/3/license.txt
https://repositorio.utb.edu.co/bitstream/20.500.12585/12397/4/509624_1_En_33_Andres%20Guillermo%20Mar.pdf.txt
https://repositorio.utb.edu.co/bitstream/20.500.12585/12397/5/509624_1_En_33_Andres%20Guillermo%20Mar.pdf.jpg
bitstream.checksum.fl_str_mv 5fc5685f7c6ea77949ecbdcc48cd1e97
4460e5956bc1d1639be9ae6146a50347
e20ad307a1c5f3f25af9304a7a7c86b6
6f482f749990bb04c93dfc82e6b0af94
130ed3113342e00b5a0df9c1da9365bb
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
MD5
MD5
repository.name.fl_str_mv Repositorio Institucional UTB
repository.mail.fl_str_mv repositorioutb@utb.edu.co
_version_ 1814021710703230976