Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones

illustrations, color photographs, graphs

Authors:
Arias Vanegas, Victor Alfonso
Resource type:
Publication date:
2022
Institution:
Universidad Nacional de Colombia
Repository:
Universidad Nacional de Colombia
Language:
spa
OAI Identifier:
oai:repositorio.unal.edu.co:unal/82351
Online access:
https://repositorio.unal.edu.co/handle/unal/82351
Keywords:
630 - Agricultura y tecnologías relacionadas::631 - Técnicas específicas, aparatos, equipos, materiales
632 - Lesiones, enfermedades, plagas vegetales
000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
Control de maleza
Control de maleza - investigaciones
Weed control - research
Weed control
Mapeo de Maleza
Segmentación Semántica
Imágenes Multiespectrales
Aprendizaje Profundo
Vehículo Aéreo No Tripulado
Clasificación Por Píxeles
Aprendizaje Automático En Producción
Redes Neuronales Convolucionales
Rights
openAccess
License
Reconocimiento 4.0 Internacional
id UNACIONAL2_ca92782d0b26e13636f04b0e05f6574e
oai_identifier_str oai:repositorio.unal.edu.co:unal/82351
network_acronym_str UNACIONAL2
network_name_str Universidad Nacional de Colombia
repository_id_str
dc.title.spa.fl_str_mv Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
dc.title.translated.eng.fl_str_mv Deep learning for weed mapping using multispectral drone-acquired imagery
title Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
spellingShingle Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
630 - Agricultura y tecnologías relacionadas::631 - Técnicas específicas, aparatos, equipos, materiales
632 - Lesiones, enfermedades, plagas vegetales
000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
Control de maleza
Control de maleza - investigaciones
Weed control - research
Weed control
Mapeo de Maleza
Segmentación Semántica
Imágenes Multiespectrales
Aprendizaje Profundo
Vehículo Aéreo No Tripulado
Clasificación Por Píxeles
Aprendizaje Automático En Producción
Redes Neuronales Convolucionales
title_short Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
title_full Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
title_fullStr Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
title_full_unstemmed Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
title_sort Aprendizaje profundo para el mapeo de maleza usando imágenes multiespectrales adquiridas por drones
dc.creator.fl_str_mv Arias Vanegas, Victor Alfonso
dc.contributor.advisor.none.fl_str_mv Gonzalez Osorio, Fabio Augusto
dc.contributor.author.none.fl_str_mv Arias Vanegas, Victor Alfonso
dc.contributor.researchgroup.spa.fl_str_mv Machine Learning Perception and Discovery Lab (MindLab)
dc.subject.ddc.spa.fl_str_mv 630 - Agricultura y tecnologías relacionadas::631 - Técnicas específicas, aparatos, equipos, materiales
632 - Lesiones, enfermedades, plagas vegetales
000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
topic 630 - Agricultura y tecnologías relacionadas::631 - Técnicas específicas, aparatos, equipos, materiales
632 - Lesiones, enfermedades, plagas vegetales
000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
Control de maleza
Control de maleza - investigaciones
Weed control - research
Weed control
Mapeo de Maleza
Segmentación Semántica
Imágenes Multiespectrales
Aprendizaje Profundo
Vehículo Aéreo No Tripulado
Clasificación Por Píxeles
Aprendizaje Automático En Producción
Redes Neuronales Convolucionales
dc.subject.lemb.spa.fl_str_mv Control de maleza
Control de maleza - investigaciones
dc.subject.lemb.eng.fl_str_mv Weed control - research
Weed control
dc.subject.proposal.spa.fl_str_mv Mapeo de Maleza
Segmentación Semántica
Imágenes Multiespectrales
Aprendizaje Profundo
Vehículo Aéreo No Tripulado
Clasificación Por Píxeles
Aprendizaje Automático En Producción
Redes Neuronales Convolucionales
description ilustraciones, fotografías a color, gráficas
publishDate 2022
dc.date.accessioned.none.fl_str_mv 2022-10-04T13:13:35Z
dc.date.available.none.fl_str_mv 2022-10-04T13:13:35Z
dc.date.issued.none.fl_str_mv 2022-10-02
dc.type.spa.fl_str_mv Trabajo de grado - Maestría
dc.type.driver.spa.fl_str_mv info:eu-repo/semantics/masterThesis
dc.type.version.spa.fl_str_mv info:eu-repo/semantics/acceptedVersion
dc.type.content.spa.fl_str_mv Text
dc.type.redcol.spa.fl_str_mv http://purl.org/redcol/resource_type/TM
status_str acceptedVersion
dc.identifier.uri.none.fl_str_mv https://repositorio.unal.edu.co/handle/unal/82351
url https://repositorio.unal.edu.co/handle/unal/82351
dc.language.iso.spa.fl_str_mv spa
language spa
dc.relation.indexed.spa.fl_str_mv Bireme
RedCol
dc.relation.references.spa.fl_str_mv Stephen O Duke. Perspectives on transgenic, herbicide-resistant crops in the united states almost 20 years after introduction. Pest management science, 71(5):652–657, 2015.
Alexa Varah, Kwadjo Ahodo, Shaun R Coutts, Helen L Hicks, David Comont, Laura Crook, Richard Hull, Paul Neve, Dylan Z Childs, Robert P Freckleton, et al. The costs of human-induced evolution in an agricultural system. Nature sustainability, 3(1):63–71, 2020.
Alessandro dos Santos Ferreira, Daniel Matte Freitas, Gercina Gonçalves da Silva, Hemerson Pistori, and Marcelo Theophilo Folhes. Weed detection in soybean crops using convnets. Computers and Electronics in Agriculture, 143:314–324, 2017.
Shirley A Briggs. Basic guide to pesticides: their characteristics and hazards. CRC Press, 2018.
Isabelle Schuster, Henning Nordmeyer, and Thomas Rath. Comparison of vision-based and manual weed mapping in sugar beet. Biosystems engineering, 98(1):17–25, 2007.
David Pimentel, Herbert Acquay, Michael Biltonen, P Rice, M Silva, J Nelson, V Lipner, S Giordano, A Horowitz, and M D’amore. Environmental and economic costs of pesticide use. BioScience, 42(10):750–760, 1992.
K Neil Harker and John T O’Donovan. Recent weed control, weed management, and integrated weed management. Weed Technology, 27(1):1–11, 2013.
Mulham Fawakherji, Ali Youssef, Domenico D Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weed classification using pixel-wise segmentation on ground and aerial images. Int. J. Robot. Comput, 2(1):39–57, 2020.
David R Shaw. Remote sensing and site-specific weed management. Frontiers in Ecology and the Environment, 3(10):526–532, 2005.
Colin Birch, Ian Cooper, Gurjeet Gill, Stephen Adkins, and Madan Gupta. Weed management in rainfed agricultural systems. In Rainfed Farming Systems, pages 215–232. Springer, 2011.
Philipp Lottes, Jens Behley, Nived Chebrolu, Andres Milioto, and Cyrill Stachniss. Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming. Journal of Field Robotics, 37(1):20–34, 2020.
Inkyu Sa, Marija Popović, Raghav Khanna, Zetao Chen, Philipp Lottes, Frank Liebisch, Juan Nieto, Cyrill Stachniss, Achim Walter, and Roland Siegwart. Weedmap: a large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sensing, 10(9):1423, 2018.
Jorge Torres-Sánchez, José Manuel Peña, Ana Isabel de Castro, and Francisca López-Granados. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from uav. Computers and Electronics in Agriculture, 103:104–113, 2014.
Chang-chun Li, Guang-sheng Zhang, Tian-jie Lei, and A-du Gong. Quick image-processing method of uav without control points data in earthquake disaster area. Transactions of Nonferrous Metals Society of China, 21:s523–s528, 2011.
Andreas Kamilaris and Francesc X Prenafeta-Boldú. Deep learning in agriculture: A survey. Computers and electronics in agriculture, 147:70–90, 2018.
Konstantinos G Liakos, Patrizia Busato, Dimitrios Moshou, Simon Pearson, and Dionysis Bochtis. Machine learning in agriculture: A review. Sensors, 18(8):2674, 2018.
Dimosthenis C Tsouros, Stamatia Bibi, and Panagiotis G Sarigiannidis. A review on uav-based applications for precision agriculture. Information, 10(11):349, 2019.
Huasheng Huang, Yubin Lan, Aqing Yang, Yali Zhang, Sheng Wen, and Jizhong Deng. Deep learning versus object-based image analysis (obia) in weed mapping of uav imagery. International Journal of Remote Sensing, 41(9):3446–3479, 2020.
Inkyu Sa, Zetao Chen, Marija Popović, Raghav Khanna, Frank Liebisch, Juan Nieto, and Roland Siegwart. weednet: Dense semantic weed classification using multispectral images and mav for smart farming. IEEE Robotics and Automation Letters, 3(1):588–595, 2017.
Alwaseela Abdalla, Haiyan Cen, Liang Wan, Reem Rashid, Haiyong Weng, Weijun Zhou, and Yong He. Fine-tuning convolutional neural network with transfer learning for semantic segmentation of ground-level oilseed rape images in a field with high weed pressure. Computers and Electronics in Agriculture, 167:105091, 2019.
Kavir Osorio, Andrés Puerto, Cesar Pedraza, David Jamaica, and Leonardo Rodríguez. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering, 2(3):471–488, 2020.
Sigurbjörn Jónsson. RGB and multispectral uav image classification of agricultural fields using a machine learning algorithm. Student thesis series INES, 2019.
ASM Mahmudul Hasan, Ferdous Sohel, Dean Diepeveen, Hamid Laga, and Michael GK Jones. A survey of deep learning techniques for weed detection from images. Computers and Electronics in Agriculture, 184:106067, 2021.
Aurélien Géron. Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O’Reilly Media, Inc., 2019.
Calvin Hung, Zhe Xu, and Salah Sukkarieh. Feature learning based approach for weed classification using high resolution aerial images from a digital camera mounted on a uav. Remote Sensing, 6(12):12037–12054, 2014.
Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, 2015.
Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola. Dive into deep learning. arXiv preprint arXiv:2106.11342, 2021.
Huasheng Huang, Jizhong Deng, Yubin Lan, Aqing Yang, Xiaoling Deng, and Lei Zhang. A fully convolutional network for weed mapping of unmanned aerial vehicle (uav) imagery. PloS one, 13(4):e0196302, 2018.
Jonathan Long, Evan Shelhamer, and Trevor Darrell. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 3431–3440, 2015.
Sebastian Haug and Jörn Ostermann. A crop/weed field image dataset for the evaluation of computer vision based precision agriculture tasks. In Lourdes Agapito, Michael M. Bronstein, and Carsten Rother, editors, Computer Vision - ECCV 2014 Workshops, pages 105–116, Cham, 2015. Springer International Publishing.
Sebastian Haug, Andreas Michaels, Peter Biber, and Jörn Ostermann. Plant classification system for crop/weed discrimination without segmentation. In IEEE winter conference on applications of computer vision, pages 1142–1149. IEEE, 2014.
José Manuel Peña, Jorge Torres-Sánchez, Ana Isabel de Castro, Maggi Kelly, and Francisca López-Granados. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (uav) images. PLOS ONE, 8(10), October 2013.
María Pérez-Ortiz, JM Peña, Pedro Antonio Gutiérrez, Jorge Torres-Sánchez, César Hervás-Martínez, and Francisca López-Granados. A semi-supervised system for weed mapping in sunflower crops using unmanned aerial vehicles and a crop row detection method. Applied Soft Computing, 37:533–544, 2015.
Thomas K Alexandridis, Afroditi Alexandra Tamouridou, Xanthoula Eirini Pantazi, Anastasia L Lagopodi, Javid Kashefi, Georgios Ovakoglou, Vassilios Polychronos, and Dimitrios Moshou. Novelty detection classifiers in weed mapping: Silybum marianum detection on uav multispectral images. Sensors, 17(9):2007, 2017.
Philipp Lottes, Raghav Khanna, Johannes Pfeifer, Roland Siegwart, and Cyrill Stachniss. Uav-based crop and weed classification for smart farming. In 2017 IEEE International Conference on Robotics and Automation (ICRA), pages 3024–3031. IEEE, 2017.
Anders Krogh Mortensen, Mads Dyrmann, Henrik Karstoft, R Nyholm Jørgensen, René Gislum, et al. Semantic segmentation of mixed crops using deep convolutional neural network. In CIGR-AgEng Conference, 26-29 June 2016, Aarhus, Denmark. Abstracts and Full papers, pages 1–6. Organising Committee, CIGR 2016, 2016.
M. Dyrmann, R. N. Jørgensen, and H. S. Midtiby. Roboweedsupport - detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. Advances in Animal Biosciences, 8(2):842–847, 2017.
Vijay Badrinarayanan, Alex Kendall, and Roberto Cipolla. SegNet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE transactions on pattern analysis and machine intelligence, 39(12):2481–2495, 2017.
Maurilio Di Cicco, Ciro Potena, Giorgio Grisetti, and Alberto Pretto. Automatic model based dataset generation for fast and accurate crop and weeds detection. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 5188–5195. IEEE, 2017.
Huasheng Huang, Yubin Lan, Jizhong Deng, Aqing Yang, Xiaoling Deng, Lei Zhang, and Sheng Wen. A semantic labeling approach for accurate weed mapping of high resolution uav imagery. Sensors, 18(7):2113, 2018.
Soren Skovsen, Mads Dyrmann, Anders K Mortensen, Morten S Laursen, René Gislum, Jorgen Eriksen, Sadaf Farkhani, Henrik Karstoft, and Rasmus N Jorgensen. The grassclover image dataset for semantic and hierarchical species understanding in agriculture. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, pages 0–0, 2019.
Yannik Rist, Iurii Shendryk, Foivos Diakogiannis, and Shaun Levick. Weed mapping using very high resolution satellite imagery and fully convolutional neural network. In IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, pages 9784–9787. IEEE, 2019.
Foivos I Diakogiannis, François Waldner, Peter Caccetta, and Chen Wu. Resunet-a: A deep learning framework for semantic segmentation of remotely sensed data. ISPRS Journal of Photogrammetry and Remote Sensing, 162:94–114, 2020.
Shyam Prasad Adhikari, Heechan Yang, and Hyongsuk Kim. Learning semantic graphics using convolutional encoder–decoder network for autonomous weeding in paddy. Frontiers in plant science, 10:1404, 2019.
Olaf Ronneberger, Philipp Fischer, and Thomas Brox. U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical image computing and computer-assisted intervention, pages 234–241. Springer, 2015.
Liang-Chieh Chen, George Papandreou, Florian Schroff, and Hartwig Adam. Rethinking atrous convolution for semantic image segmentation. arXiv preprint arXiv:1706.05587, 2017.
Anderson Brilhador, Matheus Gutoski, Leandro Takeshi Hattori, Andrei de Souza Inácio, André Eugênio Lazzaretti, and Heitor Silvério Lopes. Classification of weeds and crops at the pixel-level using convolutional neural networks and data augmentation. In 2019 IEEE Latin American Conference on Computational Intelligence (LA-CCI), pages 1–6. IEEE, 2019.
Mulham Fawakherji, Ali Youssef, Domenico Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weeds classification for precision agriculture using context-independent pixel-wise segmentation. In 2019 Third IEEE International Conference on Robotic Computing (IRC), pages 146–152. IEEE, 2019.
Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556, 2014.
Muhammad Hamza Asad and Abdul Bais. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Information Processing in Agriculture, 7(4):535–545, 2020.
Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 770–778, 2016.
Xu Ma, Xiangwu Deng, Long Qi, Yu Jiang, Hongwei Li, Yuwei Wang, and Xupo Xing. Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. PloS one, 14(4):e0215676, 2019.
Lukas Petrich, Georg Lohrmann, Matthias Neumann, Fabio Martin, Andreas Frey, Albert Stoll, and Volker Schmidt. Detection of colchicum autumnale in drone images, using a machine-learning approach. Precision Agriculture, 21(6):1291–1303, 2020.
W Ramirez, P Achanccaray, LF Mendoza, and MAC Pacheco. Deep convolutional neural networks for weed detection in agricultural crops using optical aerial images. In 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), pages 133–137. IEEE, 2020.
Navneet Dalal and Bill Triggs. Histograms of oriented gradients for human detection. In 2005 IEEE computer society conference on computer vision and pattern recognition (CVPR’05), volume 1, pages 886–893. IEEE, 2005.
Joseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 779–788, 2016.
Kaiming He, Georgia Gkioxari, Piotr Dollár, and Ross Girshick. Mask R-CNN. In Proceedings of the IEEE international conference on computer vision, pages 2961–2969, 2017.
Kunlin Zou, Xin Chen, Fan Zhang, Hang Zhou, and Chunlong Zhang. A field weed density evaluation method based on uav imaging and modified u-net. Remote Sensing, 13(2):310, 2021.
Petra Bosilj, Erchan Aptoula, Tom Duckett, and Grzegorz Cielniak. Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture. Journal of Field Robotics, 37(1):7–19, 2020.
S Umamaheswari and Ashvini V Jain. Encoder–decoder architecture for crop-weed classification using pixel-wise labelling. In 2020 International Conference on Artificial Intelligence and Signal Processing (AISP), pages 1–6. IEEE, 2020.
Aichen Wang, Yifei Xu, Xinhua Wei, and Bingbo Cui. Semantic segmentation of crop and weed using an encoder-decoder network and image enhancement method under uncontrolled outdoor illumination. IEEE Access, 8:81724–81734, 2020.
Yuzhen Lu and Sierra Young. A survey of public datasets for computer vision tasks in precision agriculture. Computers and Electronics in Agriculture, 178:105760, 2020.
Zhangnan Wu, Yajun Chen, Bo Zhao, Xiaobing Kang, and Yuanyuan Ding. Review of weed detection methods based on computer vision. Sensors, 21(11):3647, 2021.
Merima Kulin, Tarik Kazaz, Eli De Poorter, and Ingrid Moerman. A survey on machine learning-based performance improvement of wireless networks: Phy, mac and network layer. Electronics, 10(3):318, 2021.
Panqu Wang, Pengfei Chen, Ye Yuan, Ding Liu, Zehua Huang, Xiaodi Hou, and Garrison Cottrell. Understanding convolution for semantic segmentation. In 2018 IEEE winter conference on applications of computer vision (WACV), pages 1451–1460. IEEE, 2018.
Peng Liu, Hui Zhang, and Kie B Eom. Active deep learning for classification of hyperspectral images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10(2):712–724, 2016.
Nguyen Thanh Toan and Nguyen Thanh Tam. Early bushfire detection with 3d cnn from streams of satellite images.
Nikhil Jangamreddy. A survey on specialised hardware for machine learning. 2019.
E-C Oerke. Crop losses to pests. The Journal of Agricultural Science, 144(1):31–43, 2006.
Craig D Osteen and Jorge Fernandez-Cornejo. Herbicide use trends: a backgrounder. Choices, 31(4):1–7, 2016.
John Peterson Myers, Michael N Antoniou, Bruce Blumberg, Lynn Carroll, Theo Colborn, Lorne G Everett, Michael Hansen, Philip J Landrigan, Bruce P Lanphear, Robin Mesnage, et al. Concerns over use of glyphosate-based herbicides and risks associated with exposures: a consensus statement. Environmental Health, 15(1):1–13, 2016.
Kevis-Kokitsi Maninis, Jordi Pont-Tuset, Pablo Arbeláez, and Luc Van Gool. Deep retinal image understanding. In International conference on medical image computing and computer-assisted intervention, pages 140–148. Springer, 2016.
Liam Li and Ameet Talwalkar. Random search and reproducibility for neural architec- ture search. In Uncertainty in artificial intelligence, pages 367–377. PMLR, 2020.
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.license.spa.fl_str_mv Reconocimiento 4.0 Internacional
dc.rights.uri.spa.fl_str_mv http://creativecommons.org/licenses/by/4.0/
dc.rights.accessrights.spa.fl_str_mv info:eu-repo/semantics/openAccess
rights_invalid_str_mv Reconocimiento 4.0 Internacional
http://creativecommons.org/licenses/by/4.0/
http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv openAccess
dc.format.extent.spa.fl_str_mv xii, 45 páginas
dc.format.mimetype.spa.fl_str_mv application/pdf
dc.publisher.spa.fl_str_mv Universidad Nacional de Colombia
dc.publisher.program.spa.fl_str_mv Bogotá - Ingeniería - Maestría en Ingeniería - Ingeniería de Sistemas y Computación
dc.publisher.faculty.spa.fl_str_mv Facultad de Ingeniería
dc.publisher.place.spa.fl_str_mv Bogotá, Colombia
dc.publisher.branch.spa.fl_str_mv Universidad Nacional de Colombia - Sede Bogotá
institution Universidad Nacional de Colombia
bitstream.url.fl_str_mv https://repositorio.unal.edu.co/bitstream/unal/82351/1/license.txt
https://repositorio.unal.edu.co/bitstream/unal/82351/2/1090457208.2022.pdf
bitstream.checksum.fl_str_mv eb34b1cf90b7e1103fc9dfd26be24b4a
8e4d6575f15fd8f801784678171a8c6f
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
repository.name.fl_str_mv Repositorio Institucional Universidad Nacional de Colombia
repository.mail.fl_str_mv repositorio_nal@unal.edu.co
_version_ 1806886625413693440
Abstract (English): A weed is defined as a plant that grows wild in a place where it is undesirable for agricultural activity, because it competes for the limited resources available in an area previously set aside and conditioned for food production or other specific activities, reducing its yield. Traditionally, farmers remove weeds with hand tools, a slow and costly process due to the large amount of labor required. To reduce the labor involved, selective chemical agents are applied directly to the crop to kill the invasive plants; however, over large areas it is difficult to know the spatial distribution of the weeds beforehand, so the agent is applied uniformly across the whole plantation, wasting product and increasing costs. This thesis proposes a strategy for automatically detecting the spatial distribution of weeds in a cultivated field by applying deep learning (DL) algorithms to multispectral images. An image dataset collected by an unmanned aerial vehicle (UAV) was used to test the performance of the strategy. The datasets provide the multispectral images and their corresponding masks; each mask encodes the semantic class of every pixel in the image using three colors, one per class of interest: red represents weeds, green represents the crop, and black represents the background (everything in the map that is not vegetation). The problem was approached as a semantic segmentation task, and the solution strategy was a DL algorithm. Applied to these images, the solution improves on the metrics reported in the literature for these datasets, such as AUC and F1-score, and produces excellent predicted masks for the test data. Finally, the contribution of the individual multispectral channels and of classical image preprocessing techniques to the model's metrics is analyzed, as well as the model's ability to generate good semantic representations of the terrain captured by the sensor.
Funding: Colciencias
Degree: Magíster en Ingeniería - Ingeniería de Sistemas y Computación (Maestría)
Research area: Digital image processing (Procesamiento digital de imágenes)
Automatic model based dataset generation for fast and accurate crop and weeds detection. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 5188–5195. IEEE, 2017.Huasheng Huang, Yubin Lan, Jizhong Deng, Aqing Yang, Xiaoling Deng, Lei Zhang, and Sheng Wen. A semantic labeling approach for accurate weed mapping of high resolution uav imagery. Sensors, 18(7):2113, 2018.Soren Skovsen, Mads Dyrmann, Anders K Mortensen, Morten S Laursen, Ren ́e Gis- lum, Jorgen Eriksen, Sadaf Farkhani, Henrik Karstoft, and Rasmus N Jorgensen. The grassclover image dataset for semantic and hierarchical species understanding in agri- culture. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, pages 0–0, 2019.Yannik Rist, Iurii Shendryk, Foivos Diakogiannis, and Shaun Levick. Weed mapping using very high resolution satellite imagery and fully convolutional neural network. In IGARSS 2019-2019 IEEE International Geoscience and Remote Sensing Symposium, pages 9784–9787. IEEE, 2019.Foivos I Diakogiannis, Fran ̧cois Waldner, Peter Caccetta, and Chen Wu. Resunet-a: A deep learning framework for semantic segmentation of remotely sensed data. ISPRS Journal of Photogrammetry and Remote Sensing, 162:94–114, 2020.Shyam Prasad Adhikari, Heechan Yang, and Hyongsuk Kim. Learning semantic graph- ics using convolutional encoder–decoder network for autonomous weeding in paddy. Frontiers in plant science, 10:1404, 2019.Olaf Ronneberger, Philipp Fischer, and Thomas Brox. U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical image com- puting and computer-assisted intervention, pages 234–241. Springer, 2015.Liang-Chieh Chen, George Papandreou, Florian Schroff, and Hartwig Adam. Rethinking atrous convolution for semantic image segmentation. 
arXiv preprint arXiv:1706.05587, 2017.Anderson Brilhador, Matheus Gutoski, Leandro Takeshi Hattori, Andrei de Souza In ́acio, Andr ́e Eugˆenio Lazzaretti, and Heitor Silv ́erio Lopes. Classifi- cation of weeds and crops at the pixel-level using convolutional neural networks and data augmentation. In 2019 IEEE Latin American Conference on Computational Intelligence (LA-CCI), pages 1–6. IEEE, 2019.Mulham Fawakherji, Ali Youssef, Domenico Bloisi, Alberto Pretto, and Daniele Nardi. Crop and weeds classification for precision agriculture using context-independent pixel- wise segmentation. In 2019 Third IEEE International Conference on Robotic Computing (IRC), pages 146–152. IEEE, 2019.Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large- scale image recognition. arXiv preprint arXiv:1409.1556, 2014.Muhammad Hamza Asad and Abdul Bais. Weed detection in canola fields using maxi- mum likelihood classification and deep convolutional neural network. Information Pro- cessing in Agriculture, 7(4):535–545, 2020.Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 770–778, 2016.Xu Ma, Xiangwu Deng, Long Qi, Yu Jiang, Hongwei Li, Yuwei Wang, and Xupo Xing. Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. PloS one, 14(4):e0215676, 2019.Lukas Petrich, Georg Lohrmann, Matthias Neumann, Fabio Martin, Andreas Frey, Al- bert Stoll, and Volker Schmidt. Detection of colchicum autumnale in drone images, using a machine-learning approach. Precision Agriculture, 21(6):1291–1303, 2020.W Ramirez, P Achanccaray, LF Mendoza, and MAC Pacheco. Deep convolutional neural networks for weed detection in agricultural crops using optical aerial images. In 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), pages 133–137. 
IEEE, 2020.Navneet Dalal and Bill Triggs. Histograms of oriented gradients for human detection. In 2005 IEEE computer society conference on computer vision and pattern recognition (CVPR’05), volume 1, pages 886–893. Ieee, 2005.Joseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. You only look once: Unified, real-time object detection. In Proceedings of the IEEE conference on computer vision and pattern recognition, pages 779–788, 2016.Kaiming He, Georgia Gkioxari, Piotr Doll ́ar, and Ross Girshick. Mask r-cnn. In Pro- ceedings of the IEEE international conference on computer vision, pages 2961–2969, 2017.Kunlin Zou, Xin Chen, Fan Zhang, Hang Zhou, and Chunlong Zhang. A field weed density evaluation method based on uav imaging and modified u-net. Remote Sensing, 13(2):310, 2021.Petra Bosilj, Erchan Aptoula, Tom Duckett, and Grzegorz Cielniak. Transfer learn- ing between crop types for semantic segmentation of crops versus weeds in precision agriculture. Journal of Field Robotics, 37(1):7–19, 2020.S Umamaheswari and Ashvini V Jain. Encoder–decoder architecture for crop-weed classification using pixel-wise labelling. In 2020 International Conference on Artificial Intelligence and Signal Processing (AISP), pages 1–6. IEEE, 2020.Aichen Wang, Yifei Xu, Xinhua Wei, and Bingbo Cui. Semantic segmentation of crop and weed using an encoder-decoder network and image enhancement method under uncontrolled outdoor illumination. IEEE Access, 8:81724–81734, 2020.Yuzhen Lu and Sierra Young. A survey of public datasets for computer vision tasks in precision agriculture. Computers and Electronics in Agriculture, 178:105760, 2020.Zhangnan Wu, Yajun Chen, Bo Zhao, Xiaobing Kang, and Yuanyuan Ding. Review of weed detection methods based on computer vision. Sensors, 21(11):3647, 2021.Merima Kulin, Tarik Kazaz, Eli De Poorter, and Ingrid Moerman. A survey on machine learning-based performance improvement of wireless networks: Phy, mac and network layer. 
Electronics, 10(3):318, 2021.Panqu Wang, Pengfei Chen, Ye Yuan, Ding Liu, Zehua Huang, Xiaodi Hou, and Gar- rison Cottrell. Understanding convolution for semantic segmentation. In 2018 IEEE winter conference on applications of computer vision (WACV), pages 1451–1460. IEEE, 2018.Peng Liu, Hui Zhang, and Kie B Eom. Active deep learning for classification of hyper- spectral images. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10(2):712–724, 2016.Nguyen Thanh Toan and Nguyen Thanh Tam. Early bushfire detection with 3d cnn from streams of satellite images.Nikhil Jangamreddy. A survey on specialised hardware for machine learning. 2019.E-C Oerke. Crop losses to pests. The Journal of Agricultural Science, 144(1):31–43, 2006.Craig D Osteen and Jorge Fernandez-Cornejo. Herbicide use trends: a backgrounder. Choices, 31(4):1–7, 2016.John Peterson Myers, Michael N Antoniou, Bruce Blumberg, Lynn Carroll, Theo Col- born, Lorne G Everett, Michael Hansen, Philip J Landrigan, Bruce P Lanphear, Robin Mesnage, et al. Concerns over use of glyphosate-based herbicides and risks associated with exposures: a consensus statement. Environmental Health, 15(1):1–13, 2016.Kevis-Kokitsi Maninis, Jordi Pont-Tuset, Pablo Arbel ́aez, and Luc Van Gool. Deep retinal image understanding. In International conference on medical image computing and computer-assisted intervention, pages 140–148. Springer, 2016.Karen Simonyan and Andrew Zisserman. Very deep convolutional networks for large- scale image recognition. arXiv preprint arXiv:1409.1556, 2014.Liam Li and Ameet Talwalkar. Random search and reproducibility for neural architec- ture search. In Uncertainty in artificial intelligence, pages 367–377. 