Design of a Segmentation and Classification System for Seed Detection Based on Pixel Intensity Thresholds and Convolutional Neural Networks
Due to the computational power and memory of modern computers, computer vision techniques and neural networks can be used to develop a visual inspection system of agricultural products to satisfy product quality requirements. This chapter employs artificial vision techniques to classify seeds in RGB images. As a first step, an algorithm based on pixel intensity threshold is developed to detect and classify a set of different seed types, such as rice, beans, and lentils. Then, the information inferred by this algorithm is exploited to develop a neural network model, which successfully achieves learning classification and detection tasks through a semantic-segmentation scheme. The applicability and satisfactory performance of the proposed algorithms are illustrated by testing with real images, achieving an average accuracy of 92% in the selected set of classes. The experimental results verify that both algorithms can directly detect and classify the proposed set of seeds in input RGB images. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
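The first stage described in the abstract, detecting and classifying seeds from pixel-intensity thresholds, can be illustrated with a minimal sketch. The chapter's actual threshold values, morphology, and per-class decision rules are not given in this record, so the Otsu threshold, the 3×3 opening kernel, and the 50-pixel minimum blob area below are assumptions for illustration only.

```python
import cv2
import numpy as np

def segment_seeds(image_bgr: np.ndarray) -> list:
    """Return a list of {'bbox': (x, y, w, h), 'area': int} for each seed-like blob."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu's method picks a global intensity threshold automatically
    # (assumption: seeds contrast clearly with a roughly uniform background).
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # A small morphological opening removes isolated noise pixels before labelling.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    num, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    seeds = []
    for i in range(1, num):            # label 0 is the background
        x, y, w, h, area = stats[i]
        if area < 50:                  # hypothetical minimum blob size, in pixels
            continue
        seeds.append({"bbox": (int(x), int(y), int(w), int(h)), "area": int(area)})
    return seeds
```

Simple geometric features of each blob (area, aspect ratio) could then drive per-class rules for the rice, bean, and lentil categories mentioned in the abstract.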
- Authors: Suarez, Oscar J.; Macias-Garcia, Edgar; Vega, Carlos J.; Peñaloza, Yersica C.; Hernández Díaz, Nicolás; Garrido, Victor M.
- Resource type:
- Publication date: 2023
- Institution: Universidad Tecnológica de Bolívar
- Repository: Repositorio Institucional UTB
- Language: eng
- OAI identifier: oai:repositorio.utb.edu.co:20.500.12585/12322
- Online access: https://hdl.handle.net/20.500.12585/12322
- Keywords: Object Detection; Deep Learning; IOU; LEMB
- Rights: openAccess
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
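For the second stage described in the abstract, a neural network that learns classification and detection through a semantic-segmentation scheme, this record does not include the architecture or training setup. The small encoder-decoder below is therefore only an illustrative PyTorch sketch; the layer widths and the four assumed classes (background, rice, bean, lentil) are not taken from the chapter.

```python
import torch
import torch.nn as nn

class TinySeedSegNet(nn.Module):
    """Illustrative encoder-decoder that predicts a per-pixel class map."""

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                   # H/2 x W/2
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                                   # H/4 x W/4
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(16, num_classes, 2, stride=2),  # back to H x W
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Returns logits of shape (N, num_classes, H, W).
        return self.decoder(self.encoder(x))

# Training could pair these logits with label maps derived from the
# threshold-stage masks, e.g. nn.CrossEntropyLoss()(model(images), labels).
```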
| Field | Value |
|---|---|
| id | UTB2_792f27bbcefed137f6dac3e757f7cf6f |
| oai_identifier_str | oai:repositorio.utb.edu.co:20.500.12585/12322 |
| network_acronym_str | UTB2 |
| network_name_str | Repositorio Institucional UTB |
| repository_id_str | |
| dc.title.spa.fl_str_mv | Design of a Segmentation and Classification System for Seed Detection Based on Pixel Intensity Thresholds and Convolutional Neural Networks |
| title / title_short / title_full / title_fullStr / title_full_unstemmed / title_sort | Design of a Segmentation and Classification System for Seed Detection Based on Pixel Intensity Thresholds and Convolutional Neural Networks |
| spellingShingle | Design of a Segmentation and Classification System for Seed Detection Based on Pixel Intensity Thresholds and Convolutional Neural Networks; Object Detection; Deep Learning; IOU; LEMB |
| dc.creator.fl_str_mv / dc.contributor.author.none.fl_str_mv | Suarez, Oscar J.; Macias-Garcia, Edgar; Vega, Carlos J.; Peñaloza, Yersica C.; Hernández Díaz, Nicolás; Garrido, Victor M. |
| dc.subject.keywords.spa.fl_str_mv | Object Detection; Deep Learning; IOU |
| topic | Object Detection; Deep Learning; IOU; LEMB |
| dc.subject.armarc.none.fl_str_mv | LEMB |
| description | Due to the computational power and memory of modern computers, computer vision techniques and neural networks can be used to develop a visual inspection system of agricultural products to satisfy product quality requirements. This chapter employs artificial vision techniques to classify seeds in RGB images. As a first step, an algorithm based on pixel intensity threshold is developed to detect and classify a set of different seed types, such as rice, beans, and lentils. Then, the information inferred by this algorithm is exploited to develop a neural network model, which successfully achieves learning classification and detection tasks through a semantic-segmentation scheme. The applicability and satisfactory performance of the proposed algorithms are illustrated by testing with real images, achieving an average accuracy of 92% in the selected set of classes. The experimental results verify that both algorithms can directly detect and classify the proposed set of seeds in input RGB images. © 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG. |
| publishDate | 2023 |
| dc.date.accessioned.none.fl_str_mv | 2023-07-21T16:21:09Z |
| dc.date.available.none.fl_str_mv | 2023-07-21T16:21:09Z |
| dc.date.issued.none.fl_str_mv | 2023 |
| dc.date.submitted.none.fl_str_mv | 2023 |
| dc.type.coarversion.fl_str_mv | http://purl.org/coar/version/c_b1a7d7d4d402bcce |
| dc.type.coar.fl_str_mv | http://purl.org/coar/resource_type/c_2df8fbb1 |
| dc.type.driver.spa.fl_str_mv | info:eu-repo/semantics/article |
| dc.type.hasversion.spa.fl_str_mv | info:eu-repo/semantics/draft |
| dc.type.spa.spa.fl_str_mv | http://purl.org/coar/resource_type/c_6501 |
| status_str | draft |
| dc.identifier.citation.spa.fl_str_mv | Suarez, O. J., Macias-Garcia, E., Vega, C. J., Peñaloza, Y. C., Díaz, N. H., & Garrido, V. M. (2022, July). Design of a Segmentation and Classification System for Seed Detection Based on Pixel Intensity Thresholds and Convolutional Neural Networks. In IEEE Colombian Conference on Applications of Computational Intelligence (pp. 1-17). Cham: Springer Nature Switzerland. |
| dc.identifier.uri.none.fl_str_mv / url | https://hdl.handle.net/20.500.12585/12322 |
| dc.identifier.doi.none.fl_str_mv | 10.1007/978-3-031-29783-0_1 |
| dc.identifier.instname.spa.fl_str_mv | Universidad Tecnológica de Bolívar |
| dc.identifier.reponame.spa.fl_str_mv | Repositorio Universidad Tecnológica de Bolívar |
| identifier_str_mv | (citation as above); 10.1007/978-3-031-29783-0_1; Universidad Tecnológica de Bolívar; Repositorio Universidad Tecnológica de Bolívar |
| dc.language.iso.spa.fl_str_mv / language | eng |
| dc.rights.coar.fl_str_mv | http://purl.org/coar/access_right/c_abf2 |
| dc.rights.uri.*.fl_str_mv | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
| dc.rights.accessrights.spa.fl_str_mv | info:eu-repo/semantics/openAccess |
| dc.rights.cc.*.fl_str_mv | Attribution-NonCommercial-NoDerivatives 4.0 Internacional |
| rights_invalid_str_mv | http://creativecommons.org/licenses/by-nc-nd/4.0/; Attribution-NonCommercial-NoDerivatives 4.0 Internacional; http://purl.org/coar/access_right/c_abf2 |
| eu_rights_str_mv | openAccess |
| dc.format.extent.none.fl_str_mv | 17 páginas |
| dc.format.mimetype.spa.fl_str_mv | application/pdf |
| dc.publisher.place.spa.fl_str_mv | Cartagena de Indias |
| dc.source.spa.fl_str_mv | Communications in Computer and Information Science |
| institution | Universidad Tecnológica de Bolívar |
| bitstream.url.fl_str_mv | https://repositorio.utb.edu.co/bitstream/20.500.12585/12322/3/license.txt; https://repositorio.utb.edu.co/bitstream/20.500.12585/12322/1/Scopus%20-%20Document%20details%20-%20Design%20of%c2%a0a%c2%a0Segmentation%20and%c2%a0Classification%20System%20for%c2%a0Seed%20Detection%20Based%20on%c2%a0Pixel%20Intensity%20Thresholds%20and%c2%a0Convolutional%20Neural%20Networks.pdf; https://repositorio.utb.edu.co/bitstream/20.500.12585/12322/2/license_rdf; https://repositorio.utb.edu.co/bitstream/20.500.12585/12322/4/Scopus%20-%20Document%20details%20-%20Design%20of%c2%a0a%c2%a0Segmentation%20and%c2%a0Classification%20System%20for%c2%a0Seed%20Detection%20Based%20on%c2%a0Pixel%20Intensity%20Thresholds%20and%c2%a0Convolutional%20Neural%20Networks.pdf.txt; https://repositorio.utb.edu.co/bitstream/20.500.12585/12322/5/Scopus%20-%20Document%20details%20-%20Design%20of%c2%a0a%c2%a0Segmentation%20and%c2%a0Classification%20System%20for%c2%a0Seed%20Detection%20Based%20on%c2%a0Pixel%20Intensity%20Thresholds%20and%c2%a0Convolutional%20Neural%20Networks.pdf.jpg |
| bitstream.checksum.fl_str_mv | e20ad307a1c5f3f25af9304a7a7c86b6; 27f48aab029d19521adf8a10815149d3; 4460e5956bc1d1639be9ae6146a50347; fb3e7ea50d1a96d5159cd2ae108d84d1; f1a5f8293f7e0856ace59a07d6f4eac2 |
| bitstream.checksumAlgorithm.fl_str_mv | MD5; MD5; MD5; MD5; MD5 |
| repository.name.fl_str_mv | Repositorio Institucional UTB |
| repository.mail.fl_str_mv | repositorioutb@utb.edu.co |
| _version_ | 1814021758643077120 |
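The record's keywords include IOU (intersection over union), the standard overlap score used to judge predicted detections or segmentation regions against ground truth, which matches the evaluation on real images reported in the abstract. The helper below is a minimal box-IoU sketch, assuming (x, y, w, h) boxes like those returned by the threshold-stage sketch above; it is not taken from the chapter.

```python
def box_iou(a, b):
    """Intersection over union of two axis-aligned boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))   # overlap width
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))   # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

# Example: box_iou((0, 0, 10, 10), (5, 5, 10, 10)) == 25 / 175 ≈ 0.143
```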