Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
Ilustraciones
- Autores:
- Lopez Castaño, Alay Camilo
- Tipo de recurso:
- Trabajo de grado - Maestría
- Fecha de publicación:
- 2024
- Institución:
- Universidad Nacional de Colombia
- Repositorio:
- Universidad Nacional de Colombia
- Idioma:
- spa
- OAI Identifier:
- oai:repositorio.unal.edu.co:unal/86335
- Palabra clave:
- 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores
- 600 - Tecnología (Ciencias aplicadas)::607 - Educación, investigación, temas relacionados
- 620 - Ingeniería y operaciones afines::629 - Otras ramas de la ingeniería
- Líneas eléctricas
- Robótica
- Líneas eléctricas - Automatización
- Líneas eléctricas - Mantenimiento
- Detección de Líneas Eléctricas Compactas
- Robótica Autónoma
- Segmentación de Instancias
- Visión por Computadora
- SGBM
- YOLOv8
- Compact Power Line Detection
- Autonomous Robotics
- Instance Segmentation
- Computer Vision
- Rights
- openAccess
- License
- Atribución-NoComercial 4.0 Internacional
id | UNACIONAL2_88eb54f91aca9dd4e109d8c808e3f374
---|---
oai_identifier_str | oai:repositorio.unal.edu.co:unal/86335
network_acronym_str | UNACIONAL2
network_name_str | Universidad Nacional de Colombia
repository_id_str |
dc.title.spa.fl_str_mv | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
dc.title.translated.eng.fl_str_mv | Detection, localization and clamping of a medium voltage line by means of artificial vision techniques
title | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
spellingShingle | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial; 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores; 600 - Tecnología (Ciencias aplicadas)::607 - Educación, investigación, temas relacionados; 620 - Ingeniería y operaciones afines::629 - Otras ramas de la ingeniería; Líneas eléctricas; Robótica; Líneas eléctricas - Automatización; Líneas eléctricas - Mantenimiento; Detección de Líneas Eléctricas Compactas; Robótica Autónoma; Segmentación de Instancias; Visión por Computadora; SGBM; YOLOv8; Compact Power Line Detection; Autonomous Robotics; Instance Segmentation; Computer Vision
title_short | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
title_full | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
title_fullStr | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
title_full_unstemmed | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
title_sort | Detección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificial
dc.creator.fl_str_mv | Lopez Castaño, Alay Camilo
dc.contributor.advisor.none.fl_str_mv | Bolaños Martínez, Freddy; Zapata Madrigal, German Darío
dc.contributor.author.none.fl_str_mv | Lopez Castaño, Alay Camilo
dc.contributor.researchgroup.spa.fl_str_mv | Grupo Teleinformatica y Teleautomatica
dc.contributor.orcid.spa.fl_str_mv | Lopez Castaño, Alay Camilo [0009-0007-4283-5531]
dc.contributor.cvlac.spa.fl_str_mv | https://scienti.minciencias.gov.co/cvlac/visualizador/generarCurriculoCv.do?cod_rh=0001997474
dc.subject.ddc.spa.fl_str_mv | 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores; 600 - Tecnología (Ciencias aplicadas)::607 - Educación, investigación, temas relacionados; 620 - Ingeniería y operaciones afines::629 - Otras ramas de la ingeniería
topic | 000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores; 600 - Tecnología (Ciencias aplicadas)::607 - Educación, investigación, temas relacionados; 620 - Ingeniería y operaciones afines::629 - Otras ramas de la ingeniería; Líneas eléctricas; Robótica; Líneas eléctricas - Automatización; Líneas eléctricas - Mantenimiento; Detección de Líneas Eléctricas Compactas; Robótica Autónoma; Segmentación de Instancias; Visión por Computadora; SGBM; YOLOv8; Compact Power Line Detection; Autonomous Robotics; Instance Segmentation; Computer Vision
dc.subject.lemb.none.fl_str_mv | Líneas eléctricas; Robótica; Líneas eléctricas - Automatización; Líneas eléctricas - Mantenimiento
dc.subject.proposal.spa.fl_str_mv | Detección de Líneas Eléctricas Compactas; Robótica Autónoma; Segmentación de Instancias; Visión por Computadora
dc.subject.proposal.eng.fl_str_mv | SGBM; YOLOv8; Compact Power Line Detection; Autonomous Robotics; Instance Segmentation; Computer Vision
description | Ilustraciones
Resumen: Esta tesis presenta una metodología para la detección, segmentación y sujeción de líneas eléctricas de media tensión compactas, orientada a mejorar la autonomía de los robots en tareas de mantenimiento e inspección. Utilizando un enfoque que combina umbrales de profundidad y color para la segmentación y localización de las líneas, el algoritmo propuesto ofrece una solución específica para la identificación de líneas eléctricas compactas en diversos entornos. Este enfoque se combina con la detección de contornos para realizar segmentación de instancia, lo que permite discriminar cada una de las líneas eléctricas. Para la sujeción se aborda el problema de planificación de trayectoria: primero, estimando la posición y orientación de la línea eléctrica en el espacio; luego, moviendo el robot a través de tres puntos 3D que dependen de las condiciones iniciales del robot y de la orientación y posición de la línea. El algoritmo propuesto para la detección y segmentación de la línea resulta comparable en términos de precisión y desempeño con YOLOv8 nano. Además, se evalúa la eficacia del algoritmo para la estimación de la posición de la línea en un entorno práctico mediante el análisis de la precisión de la metodología para estimar la posición 3D del efector final de un robot, obteniendo un error promedio de 3 cm basado en la medición de cien puntos distintos. Los resultados indican una precisión aceptable en el contexto de este trabajo. Esta investigación no solo destaca la viabilidad del algoritmo propuesto para la automatización en el sector eléctrico, sino que además establece un marco para futuras mejoras y aplicaciones en el campo de la robótica y el mantenimiento de líneas eléctricas. (Tomado de la fuente)
Abstract: This thesis presents a methodology for the detection, segmentation and gripping of compact medium voltage power lines, oriented to improve the autonomy of robots in maintenance and inspection tasks. Using an approach that combines depth and color thresholds for the segmentation and localization of the lines, the proposed algorithm offers a specific solution for the identification of compact power lines in various environments. This approach is combined with contour detection to perform instance segmentation, which allows each of the power lines to be discriminated. For gripping, the path planning problem is addressed by first estimating the position and orientation of the power line in space, and then moving the robot through three 3D points that depend on the initial conditions of the robot and on the orientation and position of the line. The proposed algorithm for line detection and segmentation turns out to be comparable in terms of accuracy and performance with YOLOv8 nano. In addition, the effectiveness of the algorithm for line position estimation in a practical environment is evaluated by analyzing the accuracy of the methodology for estimating the 3D position of the end-effector of a robot, obtaining an average error of 3 cm, based on the measurement of one hundred different points. The results indicate an acceptable accuracy in the context of this work. This research not only highlights the feasibility of the proposed algorithm for automation in the electrical sector, but also establishes a framework for future improvements and applications in the field of robotics and power line maintenance.
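The abstract above outlines a pipeline: combine depth and colour thresholds to segment the compact line, use contour detection to separate individual line instances, estimate the line's 3D position and orientation, and move the robot through three 3D approach points. The snippet below is only a minimal illustrative sketch of that idea in Python with OpenCV and NumPy, not the thesis implementation: the thresholds, camera intrinsics (fx, fy, cx, cy), approach offsets and the choice of approaching along the optical axis are assumptions made for the example, and the depth map is handed in as an array instead of being computed with an SGBM stereo pipeline as in the thesis.

```python
# Illustrative sketch (not the thesis code): depth + colour thresholding,
# contour-based instance separation, and three approach waypoints per line.
import cv2
import numpy as np


def segment_lines(bgr, depth, depth_range=(0.3, 2.0),
                  hsv_lo=(0, 0, 0), hsv_hi=(180, 80, 120), min_area=100):
    """Return (contour, mask) pairs, one per detected line instance."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    color_mask = cv2.inRange(hsv, np.array(hsv_lo, np.uint8), np.array(hsv_hi, np.uint8))
    # Keep only pixels whose depth falls inside the expected working range.
    depth_mask = ((depth >= depth_range[0]) & (depth <= depth_range[1])).astype(np.uint8) * 255
    mask = cv2.bitwise_and(color_mask, depth_mask)
    # Each external contour is treated as one line instance (instance segmentation).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    instances = []
    for c in contours:
        if cv2.contourArea(c) < min_area:      # drop small blobs / noise
            continue
        inst = np.zeros_like(mask)
        cv2.drawContours(inst, [c], -1, 255, thickness=-1)
        instances.append((c, inst))
    return instances


def line_pose_and_waypoints(contour, depth, fx, fy, cx, cy, offsets=(0.30, 0.10, 0.0)):
    """Estimate a 3D point and image direction for the line, plus three approach waypoints."""
    vx, vy, x0, y0 = cv2.fitLine(contour, cv2.DIST_L2, 0, 0.01, 0.01).flatten()
    u, v = int(round(x0)), int(round(y0))
    z = float(depth[v, u])                     # metric depth at the fitted centre pixel
    # Pinhole back-projection of the centre pixel to camera coordinates (metres).
    p = np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])
    direction = np.array([vx, vy]) / np.hypot(vx, vy)
    # Three waypoints along the optical axis: far approach, near approach, grasp point.
    approach = np.array([0.0, 0.0, 1.0])
    waypoints = [p - o * approach for o in offsets]
    return p, direction, waypoints


if __name__ == "__main__":
    # Synthetic example: a dark horizontal "cable" about 1 m away on a bright background.
    bgr = np.full((120, 160, 3), 255, np.uint8)
    cv2.line(bgr, (10, 60), (150, 60), (40, 40, 40), 3)
    depth = np.full((120, 160), 5.0, np.float32)
    depth[58:63, :] = 1.0
    for contour, _mask in segment_lines(bgr, depth):
        p, d, wps = line_pose_and_waypoints(contour, depth, fx=300.0, fy=300.0, cx=80.0, cy=60.0)
        print("line point (m):", np.round(p, 3), "image direction:", np.round(d, 2))
        print("approach waypoints (m):", [list(np.round(w, 3)) for w in wps])
```

Treating each external contour as one instance is the simplest way to discriminate between parallel conductors; a real system would likely add filtering on contour shape and orientation, and fuse several stereo frames before committing to a grasp trajectory.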
publishDate | 2024
dc.date.accessioned.none.fl_str_mv | 2024-07-02T13:39:16Z
dc.date.available.none.fl_str_mv | 2024-07-02T13:39:16Z
dc.date.issued.none.fl_str_mv | 2024
dc.type.spa.fl_str_mv | Trabajo de grado - Maestría
dc.type.driver.spa.fl_str_mv | info:eu-repo/semantics/masterThesis
dc.type.version.spa.fl_str_mv | info:eu-repo/semantics/acceptedVersion
dc.type.content.spa.fl_str_mv | Text
dc.type.redcol.spa.fl_str_mv | http://purl.org/redcol/resource_type/TM
status_str | acceptedVersion
dc.identifier.uri.none.fl_str_mv | https://repositorio.unal.edu.co/handle/unal/86335
dc.identifier.instname.spa.fl_str_mv | Universidad Nacional de Colombia
dc.identifier.reponame.spa.fl_str_mv | Repositorio Institucional Universidad Nacional de Colombia
dc.identifier.repourl.spa.fl_str_mv | https://repositorio.unal.edu.co/
url | https://repositorio.unal.edu.co/handle/unal/86335; https://repositorio.unal.edu.co/
identifier_str_mv | Universidad Nacional de Colombia; Repositorio Institucional Universidad Nacional de Colombia
dc.language.iso.spa.fl_str_mv | spa
language | spa
dc.relation.indexed.spa.fl_str_mv | LaReferencia
dc.rights.coar.fl_str_mv | http://purl.org/coar/access_right/c_abf2
dc.rights.license.spa.fl_str_mv | Atribución-NoComercial 4.0 Internacional
dc.rights.uri.spa.fl_str_mv | http://creativecommons.org/licenses/by-nc/4.0/
dc.rights.accessrights.spa.fl_str_mv | info:eu-repo/semantics/openAccess
rights_invalid_str_mv | Atribución-NoComercial 4.0 Internacional; http://creativecommons.org/licenses/by-nc/4.0/; http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv | openAccess
dc.format.extent.spa.fl_str_mv | 78 páginas
dc.format.mimetype.spa.fl_str_mv | application/pdf
dc.publisher.spa.fl_str_mv | Universidad Nacional de Colombia
dc.publisher.program.spa.fl_str_mv | Medellín - Minas - Maestría en Ingeniería - Automatización Industrial
dc.publisher.faculty.spa.fl_str_mv | Facultad de Minas
dc.publisher.place.spa.fl_str_mv | Medellín, Colombia
dc.publisher.branch.spa.fl_str_mv | Universidad Nacional de Colombia - Sede Medellín
institution | Universidad Nacional de Colombia
bitstream.url.fl_str_mv | https://repositorio.unal.edu.co/bitstream/unal/86335/1/license.txt; https://repositorio.unal.edu.co/bitstream/unal/86335/2/1193584210.2024.pdf; https://repositorio.unal.edu.co/bitstream/unal/86335/3/1193584210.2024.pdf.jpg
bitstream.checksum.fl_str_mv | eb34b1cf90b7e1103fc9dfd26be24b4a; 8b4ae14d5b1a8b5391f5935a7218637a; 2503439b102dd148e1a22c83ff853dd5
bitstream.checksumAlgorithm.fl_str_mv | MD5; MD5; MD5
repository.name.fl_str_mv | Repositorio Institucional Universidad Nacional de Colombia
repository.mail.fl_str_mv | repositorio_nal@unal.edu.co
_version_ | 1814090232497176576
spelling |
Atribución-NoComercial 4.0 Internacionalhttp://creativecommons.org/licenses/by-nc/4.0/info:eu-repo/semantics/openAccesshttp://purl.org/coar/access_right/c_abf2Bolaños Martínez, Freddybbb8602f0b5a36926e8031d8001ea92eZapata Madrigal, German Darío5c11c330ccb2d626c0d9aaa566364953Lopez Castaño, Alay Camilo3f5b8dcc9192fc6312a626b457c16dddGrupo Teleinformatica y TeleautomaticaLopez Castaño, Alay Camilo [0009-0007-4283-5531]https://scienti.minciencias.gov.co/cvlac/visualizador/generarCurriculoCv.do?cod_rh=00019974742024-07-02T13:39:16Z2024-07-02T13:39:16Z2024https://repositorio.unal.edu.co/handle/unal/86335Universidad Nacional de ColombiaRepositorio Institucional Universidad Nacional de Colombiahttps://repositorio.unal.edu.co/IlustracionesEsta tesis presenta una metodología para la detección, segmentación y sujeción de líneas eléctricas de media tensión compactas, orientada a mejorar la autonomía de los robots en tareas de mantenimiento e inspección. Utilizando un enfoque que combina umbrales de profundidad y color para la segmentación y localización de las líneas, el algoritmo propuesto ofrece una solución específica para la identificación de líneas eléctricas compactas en diversos entornos, este enfoque se combina con la detección de contornos para realizar segmentación de instancia, lo que permite discriminar cada una de las líneas eléctricas. Para la sujeción se aborda el problema de planificación de trayectoria, primero, estimando la posición y orientación de la línea eléctrica en el espacio, luego, moviendo el robot a través de tres puntos 3D que dependen de las condiciones iniciales del robot, y la orientación y posición de la línea. El algoritmo propuesto para la detección y segmentación de la línea resulta ser comparable en términos de precisión y desempeño con la de YOLOv8 nano. Además, se evalúa la eficacia del algoritmo para la estimación de la posición de la línea en un entorno práctico mediante el análisis de la precisión de la metodología para estimar la posición 3D del efector final de un robot, obteniendo un error promedio de 3 cm, basado en la medición de cien puntos distintos. Los resultados indican una precisión aceptable en el contexto de este trabajo. Esta investigación no solo destaca la viabilidad del algoritmo propuesto para la automatización en el sector eléctrico, además, establece un marco para futuras mejoras y aplicaciones en el campo de la robótica y el mantenimiento de líneas eléctricas. (Tomado de la fuente)This thesis presents a methodology for the detection, segmentation and gripping of compact medium voltage power lines, oriented to improve the autonomy of robots in maintenance and inspection tasks. Using an approach that combines depth and color thresholds for the segmentation and localization of the lines, the proposed algorithm offers a specific solution for the identification of compact power lines in various environments, this approach is combined with contour detection to perform instance segmentation, which allows to discriminate each of the power lines. For gripping, the path planning problem is addressed by first, estimating the position and orientation of the power line in space, then, moving the robot through three 3D points that depend on the initial conditions of the robot, and the orientation and position of the line. The proposed algorithm for line detection and segmentation turns out to be comparable in terms of accuracy and performance with that of YOLOv8 nano. 
In addition, the effectiveness of the algorithm for line position estimation in a practical environment is evaluated by analyzing the accuracy of the methodology for estimating the 3D position of the end-effector of a robot, obtaining an average error of 3 cm, based on the measurement of one hundred different points. The results indicate an acceptable accuracy in the context of this work. This research not only highlights the feasibility of the proposed algorithm for automation in the electrical sector, but also establishes a framework for future improvements and applications in the field of robotics and power line maintenance.MaestríaMagíster en Ingeniería - Automatización IndustrialAutomatización robóticaIngeniería Eléctrica E Ingeniería De Control.Sede Medellín78 páginasapplication/pdfspaUniversidad Nacional de ColombiaMedellín - Minas - Maestría en Ingeniería - Automatización IndustrialFacultad de MinasMedellín, ColombiaUniversidad Nacional de Colombia - Sede Medellín000 - Ciencias de la computación, información y obras generales::004 - Procesamiento de datos Ciencia de los computadores600 - Tecnología (Ciencias aplicadas)::607 - Educación, investigación, temas relacionados620 - Ingeniería y operaciones afines::629 - Otras ramas de la ingenieríaLíneas eléctricasRobóticaLíneas eléctricas - AutomatizaciónLíneas eléctricas - MantenimientoDetección de Líneas Eléctricas CompactasRobótica AutónomaSegmentación de InstanciasVisión por ComputadoraSGBMYOLOv8Compact Power Line DetectionAutonomous RoboticsInstance SegmentationComputer VisionDetección, localización y sujeción de una línea de media tensión por medio de técnicas de visión artificialDetection, localization and clamping of a medium voltage line by means of artificial vision techniquesTrabajo de grado - Maestríainfo:eu-repo/semantics/masterThesisinfo:eu-repo/semantics/acceptedVersionTexthttp://purl.org/redcol/resource_type/TMLaReferencia[1] J. Bedi and D. Toshniwal, “Deep learning framework to forecast electricity demand,” Appl Energy, vol. 238, pp. 1312–1326, Mar. 2019, doi: 10.1016/j.apenergy.2019.01.113.[2] V. Padmanathan, L. Joseph, B. Omar, and R. Nawawi, “Prevalence of musculoskeletal disorders and related occupational causative factors among electricity linemen: A narrative review,” Int J Occup Med Environ Health, vol. 29, no. 5, pp. 725–734, Jul. 2016, doi: 10.13075/ijomeh.1896.00659.[3] A. Marroquin, A. Rehman, and A. Madani, “High-Voltage Arc Flash Assessment and Applications,” IEEE Trans Ind Appl, vol. 56, no. 3, pp. 2205–2215, May 2020, doi: 10.1109/TIA.2020.2980467.[4] B. Brenner, J. C. Cawley, and D. Majano, “Electrically Hazardous Jobs in the U.S.,” IEEE Trans Ind Appl, vol. 56, no. 3, pp. 2190–2195, May 2020, doi: 10.1109/TIA.2020.2980221.[5] B. Brenner and J. C. Cawley, “Occupations most at-risk in fatal overhead power line incidents: Using osha data to get a better understanding,” in 2015 IEEE IAS Electrical Safety Workshop, IEEE, Jan. 2015, pp. 1–6. doi: 10.1109/ESW.2015.7094939.[6] S. Pooladvand and S. Hasanzadeh, “Neurophysiological evaluation of workers’ decision dynamics under time pressure and increased mental demand,” Autom Constr, vol. 141, p. 104437, Sep. 2022, doi: 10.1016/j.autcon.2022.104437.[7] G. Gocsei, B. Nemeth, D. Szabo, and V. Faradzhev, “Induced Voltage: A Major Risk Not Only During Live Working,” in 2022 13th International Conference on Live Maintenance (ICOLIM), IEEE, Jun. 2022, pp. 1–5. doi: 10.1109/ICOLIM56184.2022.9840538.[8] K. Suresh and S. 
Paranthaman, “Transferred Potential—A Hidden Killer of Many Linemen,” IEEE Trans Ind Appl, vol. 51, no. 3, pp. 2691–2699, May 2015, doi: 10.1109/TIA.2014.2375386.[9] O. Menendez, F. A. Auat Cheein, M. Perez, and S. Kouro, “Robotics in Power Systems: Enabling a More Reliable and Safe Grid,” IEEE Industrial Electronics Magazine, vol. 11, no. 2, pp. 22–34, Jun. 2017, doi: 10.1109/MIE.2017.2686458.[10] D. Zhang, J. Cao, G. Dobie, and C. MacLeod, “A Framework of Using Customized LIDAR to Localize Robot for Nuclear Reactor Inspections,” IEEE Sens J, vol. 22, no. 6, pp. 5352–5359, Mar. 2022, doi: 10.1109/JSEN.2021.3083478.[11] T. Wang, Y. Zhao, L. Zhu, G. Liu, Z. Ma, and J. Zheng, “Research on control system of working robot in nuclear environment based on Neural Network PID,” in 2020 Chinese Automation Congress (CAC), IEEE, Nov. 2020, pp. 4828–4831. doi: 10.1109/CAC51589.2020.9327197.[12] S. Jimenez, D. Bookless, R. Nath, W. J. Leong, J. Kotaniemi, and P. Tikka, “Automated maintenance feasibility testing on the EU DEMO Automated Inspection and Maintenance Test Unit (AIM-TU),” Fusion Engineering and Design, vol. 170, p. 112517, Sep. 2021, doi: 10.1016/j.fusengdes.2021.112517.[13] J. Franko, S. Du, S. Kallweit, E. Duelberg, and H. Engemann, “Design of a Multi-Robot System for Wind Turbine Maintenance,” Energies (Basel), vol. 13, no. 10, p. 2552, May 2020, doi: 10.3390/en13102552.[14] F. Han, J. Yao, H. Zhu, and C. Wang, “Underwater Image Processing and Object Detection Based on Deep CNN Method,” J Sens, vol. 2020, pp. 1–20, May 2020, doi: 10.1155/2020/6707328.[15] E. K. Chiou et al., “Towards Human–Robot Teaming: Tradeoffs of Explanation-Based Communication Strategies in a Virtual Search and Rescue Task,” Int J Soc Robot, vol. 14, no. 5, pp. 1117–1136, Jul. 2022, doi: 10.1007/s12369-021-00834-1.[16] S. Montambault and N. Pouliot, “Hydro-Québec’s Power Line Robotics Program: 15 years of development, implementation and partnerships,” in Proceedings of the 2014 3rd International Conference on Applied Robotics for the Power Industry, IEEE, Oct. 2014, pp. 1–6. doi: 10.1109/CARPI.2014.7030065.[17] R. Aracil, M. Ferre, M. Hernando, E. Pinto, and J. M. Sebastian, “Telerobotic system for live-power line maintenance: ROBTET,” Control Eng Pract, vol. 10, no. 11, pp. 1271–1281, Nov. 2002, doi: 10.1016/S0967-0661(02)00182-X.[18] C. Friedrich, A. Csiszar, A. Lechler, and A. Verl, “Efficient Task and Path Planning for Maintenance Automation Using a Robot System,” IEEE Transactions on Automation Science and Engineering, vol. 15, no. 3, pp. 1205–1215, Jul. 2018, doi: 10.1109/TASE.2017.2759814.[19] M. S. Alvarez-Alvarado et al., “Power System Reliability and Maintenance Evolution: A Critical Review and Future Perspectives,” IEEE Access, vol. 10, pp. 51922–51950, 2022, doi: 10.1109/ACCESS.2022.3172697.[20] M. H. Sayour, S. E. Kozhaya, and S. S. Saab, “Autonomous Robotic Manipulation: Real-Time, Deep-Learning Approach for Grasping of Unknown Objects,” Journal of Robotics, vol. 2022, pp. 1–14, Jun. 2022, doi: 10.1155/2022/2585656.[21] F. García-Luna and A. Morales-Díaz, “Towards an artificial vision-robotic system for tomato identification,” IFAC-PapersOnLine, vol. 49, no. 16, pp. 365–370, 2016, doi: 10.1016/j.ifacol.2016.10.067.[22] J. A. Bagnell et al., “An integrated system for autonomous robotics manipulation,” in 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, Oct. 2012, pp. 2955–2962. doi: 10.1109/IROS.2012.6385888.[23] A. Billard and D. 
Kragic, “Trends and challenges in robot manipulation,” Science (1979), vol. 364, no. 6446, p. eaat8414, 2019, doi: 10.1126/science.aat8414.[24] F. Sun, C. Liu, W. Huang, and J. Zhang, “Object Classification and Grasp Planning Using Visual and Tactile Sensing,” IEEE Trans Syst Man Cybern Syst, vol. 46, no. 7, pp. 969–979, Jul. 2016, doi: 10.1109/TSMC.2016.2524059.[25] Z. Zhou, L. Li, A. Fürsterling, H. J. Durocher, J. Mouridsen, and X. Zhang, “Learning-based object detection and localization for a mobile robot manipulator in SME production,” Robot Comput Integr Manuf, vol. 73, p. 102229, Feb. 2022, doi: 10.1016/j.rcim.2021.102229.[26] Q. Bai, S. Li, J. Yang, Q. Song, Z. Li, and X. Zhang, “Object Detection Recognition and Robot Grasping Based on Machine Learning: A Survey,” IEEE Access, vol. 8, pp. 181855–181879, 2020, doi: 10.1109/ACCESS.2020.3028740.[27] M. Mainampati and B. Chandrasekaran, “Evolution of Machine Learning Algorithms on Autonomous Robots,” in 2020 10th Annual Computing and Communication Workshop and Conference (CCWC), IEEE, Jan. 2020, pp. 0737–0741. doi: 10.1109/CCWC47524.2020.9031137.[28] Q. M. Marwan, S. C. Chua, and L. C. Kwek, “Comprehensive Review on Reaching and Grasping of Objects in Robotics,” Robotica, vol. 39, no. 10, pp. 1849–1882, Oct. 2021, doi: 10.1017/S0263574721000023.[29] P. Ramon Soria, B. Arrue, and A. Ollero, “Detection, Location and Grasping Objects Using a Stereo Sensor on UAV in Outdoor Environments,” Sensors, vol. 17, no. 12, p. 103, Jan. 2017, doi: 10.3390/s17010103.[30] H. Karaoguz and P. Jensfelt, “Object Detection Approach for Robot Grasp Detection,” in 2019 International Conference on Robotics and Automation (ICRA), IEEE, May 2019, pp. 4953–4959. doi: 10.1109/ICRA.2019.8793751.[31] R. P. Khurshid, N. T. Fitter, E. A. Fedalei, and K. J. Kuchenbecker, “Effects of Grip-Force, Contact, and Acceleration Feedback on a Teleoperated Pick-and-Place Task,” IEEE Trans Haptics, vol. 10, no. 1, pp. 40–53, Jan. 2017, doi: 10.1109/TOH.2016.2573301.[32] J. A. Haustein, K. Hang, J. Stork, and D. Kragic, “Object Placement Planning and optimization for Robot Manipulators,” in 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, Nov. 2019, pp. 7417–7424. doi: 10.1109/IROS40897.2019.8967732.[33] S. Noh, C. Park, and J. Park, “Position-Based Visual Servoing of Multiple Robotic Manipulators: Verification in Gazebo Simulator,” in 2020 International Conference on Information and Communication Technology Convergence (ICTC), IEEE, Oct. 2020, pp. 843–846. doi: 10.1109/ICTC49870.2020.9289554.[34] A. Taherian, A. H. Mazinan, and M. Aliyari-Shoorehdeli, “Image-based visual servoing improvement through utilization of adaptive control gain and pseudo-inverse of the weighted mean of the Jacobians,” Computers & Electrical Engineering, vol. 83, p. 106580, May 2020, doi: 10.1016/j.compeleceng.2020.106580.[35] D. Xu, J. Lu, P. Wang, Z. Zhang, D. Zhang, and Z. Liang, “A new image-based visual servoing method with rotational compensation,” in 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), IEEE, Dec. 2016, pp. 1099–1104. doi: 10.1109/ROBIO.2016.7866472.[36] F. Chaumette, S. Hutchinson, and P. Corke, “Visual Servoing,” 2016, pp. 841–866. doi: 10.1007/978-3-319-32552-1_34.[37] F. Janabi-Sharifi, L. Deng, and W. J. Wilson, “Comparison of Basic Visual Servoing Methods,” IEEE/ASME Transactions on Mechatronics, vol. 16, no. 5, pp. 967–983, Oct. 2011, doi: 10.1109/TMECH.2010.2063710.[38] G. Palmieri, M. Palpacelli, M. 
Battistelli, and M. Callegari, “A Comparison between Position-Based and Image-Based Dynamic Visual Servoings in the Control of a Translating Parallel Manipulator,” Journal of Robotics, vol. 2012, pp. 1–11, 2012, doi: 10.1155/2012/103954.[39] S. Hutchinson, G. D. Hager, and P. I. Corke, “A tutorial on visual servo control,” IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651–670, 1996, doi: 10.1109/70.538972.[40] T. Mao et al., “Development of Power Transmission Line Defects Diagnosis System for UAV Inspection based on Binocular Depth Imaging Technology,” in 2019 2nd International Conference on Electrical Materials and Power Equipment (ICEMPE), IEEE, Apr. 2019, pp. 478–481. doi: 10.1109/ICEMPE.2019.8727361.[41] V. Lippiello, B. Siciliano, and L. Villani, “Eye-in-Hand/Eye-to-Hand Multi-Camera Visual Servoing,” in Proceedings of the 44th IEEE Conference on Decision and Control, IEEE, pp. 5354–5359. doi: 10.1109/CDC.2005.1583013.[42] S. Beeran Kutty, S. Saaidin, P. N. A. Megat Yunus, and S. Abu Hassan, “Evaluation of canny and sobel operator for logo edge detection,” in 2014 International Symposium on Technology Management and Emerging Technologies, IEEE, May 2014, pp. 153–156. doi: 10.1109/ISTMET.2014.6936497.[43] J. Weng, P. Cohen, and M. Herniou, “Camera calibration with distortion models and accuracy evaluation,” IEEE Trans Pattern Anal Mach Intell, vol. 14, no. 10, pp. 965–980, 1992, doi: 10.1109/34.159901.[44] Z. Zou, K. Chen, Z. Shi, Y. Guo, and J. Ye, “Object Detection in 20 Years: A Survey,” Proceedings of the IEEE, vol. 111, no. 3, pp. 257–276, Mar. 2023, doi: 10.1109/JPROC.2023.3238524.[45] R. Padilla, S. L. Netto, and E. A. B. da Silva, “A Survey on Performance Metrics for Object-Detection Algorithms,” in 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), IEEE, Jul. 2020, pp. 237–242. doi: 10.1109/IWSSIP48289.2020.9145130.[46] P. Machado, A. Oikonomou, J. F. Ferreira, and T. M. Mcginnity, “HSMD: An Object Motion Detection Algorithm Using a Hybrid Spiking Neural Network Architecture,” IEEE Access, vol. 9, pp. 125258–125268, 2021, doi: 10.1109/ACCESS.2021.3111005.[47] P. Sharma and D. Valles, “Deep Convolutional Neural Network Design Approach for 3D Object Detection for Robotic Grasping,” in 2020 10th Annual Computing and Communication Workshop and Conference (CCWC), IEEE, Jan. 2020, pp. 0311–0316. doi: 10.1109/CCWC47524.2020.9031186.[48] P. Li, C. Zhang, J. Peng, Y. Ding, and J. Zhan, “Effect of Baseline Distance and Corner Consistency on Binocular Visual Locating,” in 2020 IEEE 6th International Conference on Computer and Communications (ICCC), IEEE, Dec. 2020, pp. 1471–1475. doi: 10.1109/ICCC51575.2020.9345063.[49] Y. Jiao and P.-H. Ho, “Design of Binocular Stereo Vision System Via CNN-based Stereo Matching Algorithm,” in 2021 International Conference on Networking and Network Applications (NaNA), IEEE, Oct. 2021, pp. 426–431. doi: 10.1109/NaNA53684.2021.00080.[50] R. A. Hamzah and H. Ibrahim, “Literature Survey on Stereo Vision Disparity Map Algorithms,” J Sens, vol. 2016, pp. 1–23, 2016, doi: 10.1155/2016/8742920.[51] D. Scharstein and R. Szeliski, “High-accuracy stereo depth maps using structured light,” in 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2003. Proceedings., IEEE Comput. Soc, pp. I-195-I–202. doi: 10.1109/CVPR.2003.1211354.[52] B. Hazel, J. Côté, Y. Laroche, and P. Mongenot, “A portable, multiprocess, track-based robot for in situ work on hydropower equipment,” J Field Robot, vol. 
29, no. 1, pp. 69–101, Jan. 2012, doi: 10.1002/rob.20425.[53] A. B. Alhassan, X. Zhang, H. Shen, G. Jian, H. Xu, and K. Hamza, “Investigation of Aerodynamic Stability of a Lightweight Dual-Arm Power Transmission Line Inspection Robot under the Influence of Wind,” Math Probl Eng, vol. 2019, pp. 1–16, Nov. 2019, doi: 10.1155/2019/2139462.[54] N. Pouliot, P.-L. Richard, and S. Montambault, “LineScout Technology Opens the Way to Robotic Inspection and Maintenance of High-Voltage Power Lines,” IEEE Power and Energy Technology Systems Journal, vol. 2, no. 1, pp. 1–11, Mar. 2015, doi: 10.1109/JPETS.2015.2395388.[55] N. Pouliot and S. Montambaut, “Sensors for the non-destructive evaluation of ACSR, deployed with live-line robotics,” in 2017 12th International Conference on Live Maintenance (ICOLIM), IEEE, Apr. 2017, pp. 1–1. doi: 10.1109/ICOLIM.2017.7964162.[56] S. Montambault and N. Pouliot, “The HQ LineROVer: contributing to innovation in transmission line maintenance,” in 2003 IEEE 10th International Conference on Transmission and Distribution Construction, Operation and Live-Line Maintenance, 2003. 2003 IEEE ESMO., IEEE, pp. 33–40. doi: 10.1109/TDCLLM.2003.1196466.[57] J. Zhao, R. Guo, L. Cao, and F. Zhang, “Improvement of LineROVer: A mobile robot for de-icing of transmission lines,” in 2010 1st International Conference on Applied Robotics for the Power Industry (CARPI 2010), IEEE, Oct. 2010, pp. 1–4. doi: 10.1109/CARPI.2010.5624458.[58] F. Zhang et al., “Extended applications of LineROVer Technology,” in 2013 10th IEEE International Conference on Control and Automation (ICCA), IEEE, Jun. 2013, pp. 1415–1418. doi: 10.1109/ICCA.2013.6564877.[59] S. Montambault and N. Pouliot, “LineScout Technology: Development of an Inspection Robot Capable of Clearing Obstacles While Operating on a Live Line,” in ESMO 2006 - 2006 IEEE 11th International Conference on Transmission & Distribution Construction, Operation and Live-Line Maintenance, IEEE, 2006. doi: 10.1109/TDCLLM.2006.340744.[60] P.-L. Richard et al., “LineRanger: Analysis and Field Testing of an Innovative Robot for Efficient Assessment of Bundled High-Voltage Powerlines,” in 2019 International Conference on Robotics and Automation (ICRA), IEEE, May 2019, pp. 9130–9136. doi: 10.1109/ICRA.2019.8794397.[61] W. Chang, G. Yang, J. Yu, Z. Liang, L. Cheng, and C. Zhou, “Development of a power line inspection robot with hybrid operation modes,” in 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, Sep. 2017, pp. 973–978. doi: 10.1109/IROS.2017.8202263.[62] F. Miralles et al., “LineDrone Technology: Landing an Unmanned Aerial Vehicle on a Power Line,” in 2018 IEEE International Conference on Robotics and Automation (ICRA), IEEE, May 2018, pp. 6545–6552. doi: 10.1109/ICRA.2018.8461250.[63] P. Debenest et al., “Expliner - Robot for inspection of transmission lines,” in 2008 IEEE International Conference on Robotics and Automation, IEEE, May 2008, pp. 3978–3984. doi: 10.1109/ROBOT.2008.4543822.[64] P. Debenest and M. Guarnieri, “Expliner - From prototype towards a practical robot for inspection of high-voltage lines,” in 2010 1st International Conference on Applied Robotics for the Power Industry (CARPI 2010), IEEE, Oct. 2010, pp. 1–6. doi: 10.1109/CARPI.2010.5624434.[65] C. M. Shruthi, A. P. Sudheer, and M. L. Joy, “Dual arm electrical transmission line robot: motion through straight and jumper cable,” Automatika, vol. 60, no. 2, pp. 207–226, Apr. 2019, doi: 10.1080/00051144.2019.1609256.[66] R. Miller, F. 
Abbasi, and J. Mohammadpour, “Power line robotic device for overhead line inspection and maintenance,” Industrial Robot: An International Journal, vol. 44, no. 1, pp. 75–84, Jan. 2017, doi: 10.1108/IR-06-2016-0165.
[67] G. Zhou, J. Yuan, I.-L. Yen, and F. Bastani, “Robust real-time UAV based power line detection and tracking,” in 2016 IEEE International Conference on Image Processing (ICIP), IEEE, Sep. 2016, pp. 744–748, doi: 10.1109/ICIP.2016.7532456.
[68] J. Zhang, L. Liu, B. Wang, X. Chen, Q. Wang, and T. Zheng, “High Speed Automatic Power Line Detection and Tracking for a UAV-Based Inspection,” in 2012 International Conference on Industrial Control and Electronics Engineering, IEEE, Aug. 2012, pp. 266–269, doi: 10.1109/ICICEE.2012.77.
[69] F. Shuang, X. Chen, Y. Li, Y. Wang, N. Miao, and Z. Zhou, “PLE: Power Line Extraction Algorithm for UAV-Based Power Inspection,” IEEE Sens. J., vol. 22, no. 20, pp. 19941–19952, Oct. 2022, doi: 10.1109/JSEN.2022.3202033.
[70] W. Wenfeng, Z. Shuhua, F. Yihao, and D. Weili, “Parallel edges detection from remote sensing image using local orientation coding,” Acta Optica Sinica, vol. 32, no. 3, p. 315001, 2012.
[71] C. Yu, B. Qu, Y. Zhu, Y. Ji, H. Zhao, and Z. Xing, “Design of the Transmission Line Inspection System Based on UAV,” in 2020 10th International Conference on Power and Energy Systems (ICPES), IEEE, Dec. 2020, pp. 543–548, doi: 10.1109/ICPES51309.2020.9349675.
[72] X. Hui, J. Bian, Y. Yu, X. Zhao, and M. Tan, “A novel autonomous navigation approach for UAV power line inspection,” in 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), IEEE, Dec. 2017, pp. 634–639, doi: 10.1109/ROBIO.2017.8324488.
[73] S. Fang, C. Haiyang, L. Sheng, and W. Xiaoyu, “A Framework of Power Pylon Detection for UAV-based Power Line Inspection,” in 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC), IEEE, Jun. 2020, pp. 350–357, doi: 10.1109/ITOEC49072.2020.9141693.
[74] “IEEE Guide for Unmanned Aerial Vehicle-Based Patrol Inspection System for Transmission Lines,” IEEE Std 2821-2020, pp. 1–49, 2020, doi: 10.1109/IEEESTD.2020.9271964.
[75] W. Jiang, G. Zuo, D. H. Zou, H. Li, J. J. Yan, and G. C. Ye, “Autonomous Behavior Intelligence Control of Self-Evolution Mobile Robot for High-Voltage Transmission Line in Complex Smart Grid,” Complexity, vol. 2020, pp. 1–17, Nov. 2020, doi: 10.1155/2020/8843178.
[76] W. Zou, X. Shu, Q. Tang, and S. Lu, “A Survey of the Application of Robots in Power System Operation and Maintenance Management,” in 2019 Chinese Automation Congress (CAC), IEEE, Nov. 2019, pp. 4614–4619, doi: 10.1109/CAC48633.2019.8996362.
[77] M. Chen, Y. Cao, Y. Tian, E. Li, Z. Liang, and M. Tan, “A Passive Compliance Obstacle-Crossing Robot for Power Line Inspection and Maintenance,” IEEE Robot. Autom. Lett., vol. 8, no. 5, pp. 2772–2779, May 2023, doi: 10.1109/LRA.2023.3261704.
[78] Y. Xia, X. Jiang, Z. Zhang, J. Hu, and C. Sun, “Detecting broken strands in transmission line - Part 1: Design of a smart eddy current transducer carried by inspection robot,” International Transactions on Electrical Energy Systems, vol. 23, no. 8, pp. 1409–1422, Nov. 2013, doi: 10.1002/etep.1669.
[79] Y. Song, H. Wang, Y. Jiang, and L. Ling, “AApe-D: A novel power transmission line maintenance robot for broken strand repair,” in 2012 2nd International Conference on Applied Robotics for the Power Industry (CARPI), IEEE, Sep. 2012, pp. 108–113, doi: 10.1109/CARPI.2012.6473359.
[80] C. Yu, A. Chen, C. Tang, G. Yu, Y. Fang, and T. Liu, “Design of a Robot for Live-line Repairing of Transmission Line,” in 2022 7th Asia Conference on Power and Electrical Engineering (ACPEE), IEEE, Apr. 2022, pp. 1446–1450, doi: 10.1109/ACPEE53904.2022.9783773.
[81] R. Hu et al., “Power Transmission Line Broken Strand Repair Robot and Visual Control Method,” J. Phys. Conf. Ser., vol. 2333, no. 1, p. 012019, Aug. 2022, doi: 10.1088/1742-6596/2333/1/012019.
[82] R. Miller, F. Abbasi, and J. Mohammadpour, “Power line robotic device for overhead line inspection and maintenance,” Industrial Robot: An International Journal, vol. 44, no. 1, pp. 75–84, Jan. 2017, doi: 10.1108/IR-06-2016-0165.
[83] Y. Cao, H. Wang, Y. Chang, and L. Zhang, “An entanglement-clearing robot for power transmission line with composite clearing tool,” in 2015 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), IEEE, Jun. 2015, pp. 591–596, doi: 10.1109/CYBER.2015.7288007.
[84] L. Li et al., “Autonomous Removing Foreign Objects for Power Transmission Line by Using a Vision-Guided Unmanned Aerial Manipulator,” J. Intell. Robot. Syst., vol. 103, no. 2, p. 23, Oct. 2021, doi: 10.1007/s10846-021-01482-3.
[85] R. S. Goncalves, F. C. Souza, R. Z. Homma, D. E. T. Sudbrack, P. V. Trautmann, and B. C. Clasen, “Mobile Robot for Debris Removal from High Voltage Power Lines,” in 2022 Latin American Robotics Symposium (LARS), 2022 Brazilian Symposium on Robotics (SBR), and 2022 Workshop on Robotics in Education (WRE), IEEE, Oct. 2022, pp. 1–5, doi: 10.1109/LARS/SBR/WRE56824.2022.9995816.
[86] Z. Cheng, J. Jia, L. Zhong, R. Guo, C. Han, and R. Zhu, “Development of insulator cleaning robot,” in 2016 4th International Conference on Applied Robotics for the Power Industry (CARPI), IEEE, Oct. 2016, pp. 1–3, doi: 10.1109/CARPI.2016.7745641.
[87] J. Guo, Y. Zhang, and X. Chen, “The utility model relates to a cleaning mechanical arm applicable to a high voltage insulator cleaning robot,” in 2020 5th International Conference on Mechanical, Control and Computer Engineering (ICMCCE), IEEE, Dec. 2020, pp. 248–251, doi: 10.1109/ICMCCE51767.2020.00062.
[88] S. Tang, P. Zhou, X. Wang, Y. Yu, and H. Li, “Design and Experiment of Dry-Ice Cleaning Mechanical Arm for Insulators in Substation,” Applied Sciences, vol. 10, no. 7, p. 2461, Apr. 2020, doi: 10.3390/app10072461.
[89] R. Lopez Lopez, M. J. Batista Sanchez, M. Perez Jimenez, B. C. Arrue, and A. Ollero, “Autonomous UAV System for Cleaning Insulators in Power Line Inspection and Maintenance,” Sensors, vol. 21, no. 24, p. 8488, Dec. 2021, doi: 10.3390/s21248488.
[90] Z. Sun, D. Tang, K. Kang, Z. Huang, and D. Chen, “Design and Application of Remote Control System in Finder, a Vibration Damper Recover Robot on Power Line,” in 2017 IEEE 7th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), IEEE, Jul. 2017, pp. 977–982, doi: 10.1109/CYBER.2017.8446344.
[91] Y. Zhong, Z. Fu, M. Su, Y. Guan, H. Zhu, and L. Zhong, “Development of a Robot System Performing Maintenance Tasks on High-Voltage Power Transmission Lines,” in 2019 IEEE International Conference on Robotics and Biomimetics (ROBIO), IEEE, Dec. 2019, pp. 1344–1349, doi: 10.1109/ROBIO49542.2019.8961863.
[92] J. Feng and W. Zhang, “Autonomous Live-Line Maintenance Robot for a 10 kV Overhead Line,” IEEE Access, vol. 9, pp. 61819–61831, 2021, doi: 10.1109/ACCESS.2021.3074677.
[93] Y. Chi, Q. Weinan, Z. Kai, L. Xinglie, Y. Guangkai, and Z. Qiang, “The Design and Performance Test Method of Live Working Anti-vibration Hammer Robot,” in 2022 2nd International Conference on Computer, Control and Robotics (ICCCR), IEEE, Mar. 2022, pp. 70–74, doi: 10.1109/ICCCR54399.2022.9790182.
[94] Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 11, pp. 1330–1334, 2000, doi: 10.1109/34.888718.
[95] J. Heikkila and O. Silven, “A four-step camera calibration procedure with implicit image correction,” in Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, IEEE Comput. Soc., 1997, pp. 1106–1112, doi: 10.1109/CVPR.1997.609468.
[96] J. Tremblay, T. To, B. Sundaralingam, Y. Xiang, D. Fox, and S. Birchfield, “Deep Object Pose Estimation for Semantic Robotic Grasping of Household Objects,” arXiv, vol. abs/1809.10790, 2018. [Online]. Available: https://api.semanticscholar.org/CorpusID:52893770
[97] G. Jocher and A. Vina, “Ultralytics YOLOv8 Docs - Performance Metrics Deep Dive,” Ultralytics YOLOv8 Docs. Accessed: Feb. 19, 2024. [Online]. Available: https://docs.ultralytics.com/guides/yolo-performance-metrics/#class-wise-metrics
[98] D. A. Patterson and J. L. Hennessy, Computer Organization and Design. Elsevier, 2016, doi: 10.1016/C2013-0-08305-3.