Método para la clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo
Approximately 75% of the global agricultural land belongs to small-scale farmers, who are essential for local food supply. However, common challenges include the lack of accurate crop characterization and limited detailed information in productive areas. Smart Farming, employing advanced technologies such as Unmanned Aerial Vehicles (UAVs) and computer vision, offers solutions; however, its lack of accessibility excludes 94% of small-scale farmers in Colombia. This work addresses the need to propose a method for small-scale agricultural crop classification using deep learning techniques. A DJI Mini 2 SE UAV, readily available in the market, is used to capture images in San Cristóbal, a rural area of Medellín, Colombia, with the aim of identifying green onion (cebolla de rama) crops, foliage, and uncultivated areas. With 259 images and 4,315 labeled instances, Convolutional Neural Network (CNN) models are employed for object detection, instance segmentation, and semantic segmentation. Deep Learning methods using transfer learning were evaluated, with Mask R-CNN selected, achieving 93% accuracy, a false positive rate of 9%, and a false negative rate of 4%. Metrics include a mean average precision (mAP) of 55.49% for foliage, 49.09% for uncultivated areas, and 58.21% for onions. The labeled dataset is available to encourage collaboration and comparative research. In general terms, it is concluded that by capturing digital images with UAVs and using deep learning methods, precise and timely information about small agricultural operations can be obtained.
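The reported evaluation mixes overall error rates (93% accuracy, 9% false positives, 4% false negatives) with per-class mean average precision. As a minimal sketch of how such overall rates can be derived from aggregate detection counts — the record does not specify the exact rate definitions, so the formulas below follow one common convention, and the counts are invented placeholders, not the thesis's confusion matrix:

```python
# Minimal sketch: overall detection rates from aggregate counts.
# One common convention (rates over all outcomes); the thesis may
# define its 9% / 4% rates differently. Counts are placeholders.
def detection_rates(tp: int, fp: int, fn: int, tn: int = 0) -> dict:
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,      # correctly handled outcomes
        "false_positive_rate": fp / total,  # spurious detections
        "false_negative_rate": fn / total,  # missed instances
        "precision": tp / (tp + fp),
        "recall": tp / (tp + fn),
    }

if __name__ == "__main__":
    print(detection_rates(tp=80, fp=12, fn=8))
    # {'accuracy': 0.8, 'false_positive_rate': 0.12,
    #  'false_negative_rate': 0.08, 'precision': 0.869..., 'recall': 0.909...}
```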
- Authors:
  - Arregocés Guerra, Paulina
- Resource type:
  - Trabajo de grado - Maestría (master's thesis)
- Publication date:
  - 2024
- Institution:
  - Universidad Nacional de Colombia
- Repository:
  - Universidad Nacional de Colombia
- Language:
  - spa (Spanish)
- OAI Identifier:
  - oai:repositorio.unal.edu.co:unal/86302
- Keywords:
  - 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
  - 000 - Ciencias de la computación, información y obras generales::005 - Programación, programas, datos de computación
  - 630 - Agricultura y tecnologías relacionadas
  - Procesamiento de imágenes
  - Agricultura Inteligente
  - imágenes aéreas
  - VANTs
  - Aprendizaje profundo
  - Redes Neuronales Convolucionales
  - Smart Farming
  - aerial imagery
  - UAVs
  - Deep Learning
  - Convolutional neural networks
  - Redes neuronales convolucionales
- Rights:
  - openAccess
- License:
  - Atribución-NoComercial-SinDerivadas 4.0 Internacional (CC BY-NC-ND 4.0)
id: UNACIONAL2_41987796fb73c7698f549c64b15b822b
oai_identifier_str: oai:repositorio.unal.edu.co:unal/86302
network_acronym_str: UNACIONAL2
network_name_str: Universidad Nacional de Colombia
repository_id_str:
dc.title.spa.fl_str_mv: Método para la clasificación de cultivos agrícolas a pequeña escala empleando técnicas de aprendizaje profundo
dc.title.translated.eng.fl_str_mv: Method for the classification of small-scale agricultural crops using deep learning techniques
dc.creator.fl_str_mv: Arregocés Guerra, Paulina
dc.contributor.advisor.none.fl_str_mv: Branch Bedoya, John Willian; Restrepo Arias, Juan Felipe
dc.contributor.author.none.fl_str_mv: Arregocés Guerra, Paulina
dc.contributor.researchgroup.spa.fl_str_mv: GIDIA: Grupo de Investigación y Desarrollo en Inteligencia Artificial
dc.contributor.orcid.spa.fl_str_mv: Arregocés Guerra, Paulina [0000-0001-9567-0231]
dc.subject.ddc.spa.fl_str_mv: 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores; 000 - Ciencias de la computación, información y obras generales::005 - Programación, programas, datos de computación; 630 - Agricultura y tecnologías relacionadas
dc.subject.lemb.none.fl_str_mv: Procesamiento de imágenes
dc.subject.proposal.spa.fl_str_mv: Agricultura Inteligente; imágenes aéreas; VANTs; Aprendizaje profundo; Redes Neuronales Convolucionales
dc.subject.proposal.eng.fl_str_mv: Smart Farming; aerial imagery; UAVs; Deep Learning; Convolutional neural networks
dc.subject.wikidata.none.fl_str_mv: Redes neuronales convolucionales
description:
Approximately 75% of the global agricultural land belongs to small-scale farmers, who are essential for local food supply. However, common challenges include the lack of accurate crop characterization and limited detailed information in productive areas. Smart Farming, employing advanced technologies such as Unmanned Aerial Vehicles (UAVs) and computer vision, offers solutions; however, its lack of accessibility excludes 94% of small-scale farmers in Colombia. This work addresses the need to propose a method for small-scale agricultural crop classification using deep learning techniques. A DJI Mini 2 SE UAV, readily available in the market, is used to capture images in San Cristóbal, a rural area of Medellín, Colombia, with the aim of identifying green onion (cebolla de rama) crops, foliage, and uncultivated areas. With 259 images and 4,315 labeled instances, Convolutional Neural Network (CNN) models are employed for object detection, instance segmentation, and semantic segmentation. Deep Learning methods using transfer learning were evaluated, with Mask R-CNN selected, achieving 93% accuracy, a false positive rate of 9%, and a false negative rate of 4%. Metrics include a mean average precision (mAP) of 55.49% for foliage, 49.09% for uncultivated areas, and 58.21% for onions. The labeled dataset is available to encourage collaboration and comparative research. In general terms, it is concluded that by capturing digital images with UAVs and using deep learning methods, precise and timely information about small agricultural operations can be obtained. (Text taken from the source.)
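The method described above — transfer learning with Mask R-CNN on a small, three-class UAV dataset — is typically set up with an off-the-shelf instance-segmentation library. The sketch below assumes Detectron2 and COCO-format annotations as the tooling; the dataset paths, split names, and hyperparameters are hypothetical placeholders, not the thesis's actual configuration:

```python
# Minimal Mask R-CNN transfer-learning sketch with Detectron2 (assumed tooling).
# Paths, split names, and hyperparameters are hypothetical placeholders.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.data.datasets import register_coco_instances
from detectron2.engine import DefaultTrainer

# Register the UAV crop dataset (COCO-style annotations assumed).
register_coco_instances("crops_train", {}, "annotations/train.json", "images/train")
register_coco_instances("crops_val", {}, "annotations/val.json", "images/val")

cfg = get_cfg()
# Start from a Mask R-CNN ResNet-50 FPN model pre-trained on COCO.
cfg.merge_from_file(model_zoo.get_config_file(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.DATASETS.TRAIN = ("crops_train",)
cfg.DATASETS.TEST = ("crops_val",)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 3  # green onion, foliage, uncultivated area
cfg.SOLVER.IMS_PER_BATCH = 2
cfg.SOLVER.BASE_LR = 0.00025  # small LR: only adapting the pre-trained weights
cfg.SOLVER.MAX_ITER = 3000

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```

Starting from COCO-pre-trained weights and replacing only the ROI head is the usual way to make a 259-image dataset viable; training from scratch at this scale would overfit badly.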
publishDate: 2024
dc.date.accessioned.none.fl_str_mv: 2024-06-25T20:44:08Z
dc.date.available.none.fl_str_mv: 2024-06-25T20:44:08Z
dc.date.issued.none.fl_str_mv: 2024
dc.type.spa.fl_str_mv: Trabajo de grado - Maestría
dc.type.driver.spa.fl_str_mv: info:eu-repo/semantics/masterThesis
dc.type.version.spa.fl_str_mv: info:eu-repo/semantics/acceptedVersion
dc.type.content.spa.fl_str_mv: Text
dc.type.redcol.spa.fl_str_mv: http://purl.org/redcol/resource_type/TM
status_str: acceptedVersion
dc.identifier.uri.none.fl_str_mv: https://repositorio.unal.edu.co/handle/unal/86302
dc.identifier.instname.spa.fl_str_mv: Universidad Nacional de Colombia
dc.identifier.reponame.spa.fl_str_mv: Repositorio Institucional Universidad Nacional de Colombia
dc.identifier.repourl.spa.fl_str_mv: https://repositorio.unal.edu.co/
dc.language.iso.spa.fl_str_mv: spa
language: spa
dc.rights.coar.fl_str_mv: http://purl.org/coar/access_right/c_abf2
dc.rights.license.spa.fl_str_mv: Atribución-NoComercial-SinDerivadas 4.0 Internacional
dc.rights.uri.spa.fl_str_mv: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.rights.accessrights.spa.fl_str_mv: info:eu-repo/semantics/openAccess
eu_rights_str_mv: openAccess
dc.format.extent.spa.fl_str_mv: 106 páginas
dc.format.mimetype.spa.fl_str_mv: application/pdf
dc.publisher.spa.fl_str_mv: Universidad Nacional de Colombia
dc.publisher.program.spa.fl_str_mv: Medellín - Minas - Maestría en Ingeniería - Analítica
dc.publisher.faculty.spa.fl_str_mv: Facultad de Minas
dc.publisher.place.spa.fl_str_mv: Medellín, Colombia
dc.publisher.branch.spa.fl_str_mv: Universidad Nacional de Colombia - Sede Medellín
institution: Universidad Nacional de Colombia
bitstream.url.fl_str_mv: https://repositorio.unal.edu.co/bitstream/unal/86302/1/license.txt; https://repositorio.unal.edu.co/bitstream/unal/86302/2/1017232348.2024.pdf; https://repositorio.unal.edu.co/bitstream/unal/86302/3/1017232348.2024.pdf.jpg
bitstream.checksum.fl_str_mv: eb34b1cf90b7e1103fc9dfd26be24b4a; 7b88a3a60c69e9d956331da63d77d2e0; b689a272a38aaa83589a4971f19d63b5
bitstream.checksumAlgorithm.fl_str_mv: MD5; MD5; MD5
repository.name.fl_str_mv: Repositorio Institucional Universidad Nacional de Colombia
repository.mail.fl_str_mv: repositorio_nal@unal.edu.co
_version_: 1814089398967336960
https://doi.org/10.1016/j.compag.2020.105504Wu, Yuxin; Kirillov, Alexander; Massa, Francisco; Lo, W.-Y., & Girshick, R. (2019). Detectron2. https://github.com/facebookresearch/detectron2Xu, K., Li, H., Cao, W., Zhu, Y., Chen, R., & Ni, J. (2020). Recognition of weeds in wheat fields based on the fusion of RGB images and depth images. IEEE Access, 8, 110362-110370.Xu, B., Fan, J., Chao, J., Arsenijevic, N., Werle, R., y Zhang, Z. (2023). Instance segmentation method for weed detection using UAV imagery in soybean fields [Article]. Computers and Electronics in Agriculture, 211. (Cited by: 0) doi: 10.1016/j.compag.2023.107994Yang, M. D., Boubin, J. G., Tsai, H. P., Tseng, H. H., Hsu, Y. C., & Stewart, C. C. (2020). Adaptive autonomous UAV scouting for rice lodging assessment using edge computing with deep learning EDANet. Computers and Electronics in Agriculture, 179, 105817. https://doi.org/10.1016/j.compag.2020.105817Yang, L., Bi, P., Tang, H., Zhang, F., & Wang, Z. (2022). Improving vegetation segmentation with shadow effects based on double input networks using polarization images. Computers and Electronics in Agriculture, 199, 107123. https://doi.org/10.1016/j.compag.2022.107123Yang, M. D., Tseng, H. H., Hsu, Y. C., et al. (2020). Semantic segmentation using deep learning with vegetation indices for rice lodging identification in multi-date UAV visible images. Remote Sensing, 12(4). https://doi.org/10.3390/rs12040633You, K., Liu, Y., Wang, J., & Long, M. (2021). LogME: Practical Assessment of Pre-trained Models for Transfer Learning. http://arxiv.org/abs/2102.11005Yuan, J., Xue, B., Zhang, W., Xu, L., Sun, H., & Zhou, J. (2019). RPN-FCN based Rust detection on power equipment. Procedia Computer Science, 147, 349–353. https://doi.org/10.1016/j.procs.2019.01.236Zhang, J., Zhao, B., Yang, C., Shi, Y., Liao, Q., Zhou, G., Xie, J. (2020). Rapeseed stand count estimation at leaf development stages with UAV imagery and convolutional neural networks [Article]. Frontiers in Plant Science, 11. (Cited by: 22; All Open Access, Gold Open Access, Green Open Access) doi: 10.3389/fpls.2020.00617Zhang, X., Wang, Z., Liu, D., Lin, Q., y Ling, Q. (2021, 1). Deep adversarial data augmentation for extremely low data regimes. IEEE Transactions on Circuits and Systems for Video Technology, 31, 15-28. doi: 10.1109/TCSVT.2020.2967419Zhang, Y., Wang, C., Wang, Y., y Cheng, P. (2022). Determining the stir-frying degree of Gardeniae Fructus Praeparatus based on deep learning and transfer learning [Article]. Sensors, 22(21). doi: 10.3390/s22218091Zheng, H., Zhou, X., He, J., Yao, X., Cheng, T., Zhu, Y., ... & Tian, Y. (2020). Early season detection of rice plants using RGB, NIR-GB and multispectral images from unmanned aerial vehicle (UAV). Computers and Electronics in Agriculture, 169, 105223. https://doi.org/10.1016/j.compag.2020.105223Zhao, H., Shi, J., Qi, X., Wang, X., & Jia, J. (2017). Pyramid Scene Parsing Network. 2881–2890. https://openaccess.thecvf.com/content_cvpr_2017/html/Zhao_Pyramid_Scene_Parsing_CVPR_2017_paper.htmlZhuang, F., Qi, Z., Duan, K., Xi, D., Zhu, Y., Zhu, H., Xiong, H., & He, Q. (2021). A Comprehensive Survey on Transfer Learning. Proceedings of the IEEE, 109(1), 43–76. https://doi.org/10.1109/JPROC.2020.3004555Zhuang, S., Wang, P., y Jiang, B. (2020). Vegetation extraction in the field using multi-level features [Article]. Biosystems Engineering, 197, 352 – 366. 