Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico

illustrations, graphs

Authors:
Godoy Rojas, Diego Fernando
Resource type:
Trabajo de grado - Maestría (master's thesis)
Publication date:
2022
Institution:
Universidad Nacional de Colombia
Repository:
Universidad Nacional de Colombia
Language:
spa
OAI Identifier:
oai:repositorio.unal.edu.co:unal/83956
Online access:
https://repositorio.unal.edu.co/handle/unal/83956
https://repositorio.unal.edu.co/
Keywords:
600 - Tecnología (Ciencias aplicadas)
Machine learning
APRENDIZAJE AUTOMATICO (INTELIGENCIA ARTIFICIAL)
Aprendizaje profundo
Inteligencia artificial
Mecanismos de atención
Redes Neuronales
Series de tiempo
Salud estructural
Deep Learning
Attention Mechanisms
Neural Networks
Time Series forecasting
GRU
LSTM
Rights
openAccess
License
Atribución-NoComercial 4.0 Internacional
id UNACIONAL2_1ee39190f03cd5f044a7836fc4be04df
oai_identifier_str oai:repositorio.unal.edu.co:unal/83956
network_acronym_str UNACIONAL2
network_name_str Universidad Nacional de Colombia
repository_id_str
dc.title.spa.fl_str_mv Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
dc.title.translated.eng.fl_str_mv Deep learning for temperature prediction in the refractory walls of an electric arc furnace
title Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
spellingShingle Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
600 - Tecnología (Ciencias aplicadas)
Machine learning
APRENDIZAJE AUTOMATICO (INTELIGENCIA ARTIFICIAL)
Aprendizaje profundo
Inteligencia artificial
Mecanismos de atención
Redes Neuronales
Series de tiempo
Salud estructural
Deep Learning
Attention Mechanisms
Neural Networks
Time Series forecasting
GRU
LSTM
title_short Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
title_full Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
title_fullStr Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
title_full_unstemmed Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
title_sort Aprendizaje profundo para la predicción de temperatura en las paredes refractarias de un horno de arco eléctrico
dc.creator.fl_str_mv Godoy Rojas, Diego Fernando
dc.contributor.advisor.none.fl_str_mv Tibaduiza Burgos, Diego Alexander
Leon-Medina, Jersson Xavier
dc.contributor.author.none.fl_str_mv Godoy Rojas, Diego Fernando
dc.contributor.researchgroup.spa.fl_str_mv Grupo de Investigación en Electrónica de Alta Frecuencia y Telecomunicaciones (Cmun)
dc.contributor.orcid.spa.fl_str_mv Diego F. Godoy-Rojas [0000-0002-1639-7992]
dc.subject.ddc.spa.fl_str_mv 600 - Tecnología (Ciencias aplicadas)
topic 600 - Tecnología (Ciencias aplicadas)
Machine learning
APRENDIZAJE AUTOMATICO (INTELIGENCIA ARTIFICIAL)
Aprendizaje profundo
Inteligencia artificial
Mecanismos de atención
Redes Neuronales
Series de tiempo
Salud estructural
Deep Learning
Attention Mechanisms
Neural Networks
Time Series forecasting
GRU
LSTM
dc.subject.lemb.spa.fl_str_mv Machine learning
dc.subject.lemb.eng.fl_str_mv APRENDIZAJE AUTOMATICO (INTELIGENCIA ARTIFICIAL)
dc.subject.proposal.spa.fl_str_mv Aprendizaje profundo
Inteligencia artificial
Mecanismos de atención
Redes Neuronales
Series de tiempo
Salud estructural
dc.subject.proposal.eng.fl_str_mv Deep Learning
Attention Mechanisms
Neural Networks
Time Series forecasting
dc.subject.proposal.none.fl_str_mv GRU
LSTM
description illustrations, graphs
publishDate 2022
dc.date.issued.none.fl_str_mv 2022
dc.date.accessioned.none.fl_str_mv 2023-06-02T14:26:53Z
dc.date.available.none.fl_str_mv 2023-06-02T14:26:53Z
dc.type.spa.fl_str_mv Trabajo de grado - Maestría
dc.type.coar.fl_str_mv http://purl.org/coar/resource_type/c_7a1f
dc.type.driver.spa.fl_str_mv info:eu-repo/semantics/masterThesis
dc.type.version.spa.fl_str_mv info:eu-repo/semantics/acceptedVersion
dc.type.content.spa.fl_str_mv Text
dc.type.redcol.spa.fl_str_mv http://purl.org/redcol/resource_type/TP
status_str acceptedVersion
dc.identifier.uri.none.fl_str_mv https://repositorio.unal.edu.co/handle/unal/83956
dc.identifier.instname.spa.fl_str_mv Universidad Nacional de Colombia
dc.identifier.reponame.spa.fl_str_mv Repositorio Institucional Universidad Nacional de Colombia
dc.identifier.repourl.spa.fl_str_mv https://repositorio.unal.edu.co/
url https://repositorio.unal.edu.co/handle/unal/83956
https://repositorio.unal.edu.co/
identifier_str_mv Universidad Nacional de Colombia
Repositorio Institucional Universidad Nacional de Colombia
dc.language.iso.spa.fl_str_mv spa
language spa
dc.relation.references.spa.fl_str_mv D. Tibaduiza et al., “Structural Health Monitoring System for Furnace Refractory Wall Thickness Measurements at Cerro Matoso SA”, Lecture Notes in Civil Engineering, pp. 414-423, 2021. DOI: 10.1007/978-3-030-64594-6_41
F. Pozo et al., “Structural health monitoring and condition monitoring applications: sensing, distributed communication and processing”, International Journal of Distributed Sensor Networks, vol. 16, no. 9, pp. 1-3, 2020. DOI: 10.1177/1550147720963270
J. Birat, “A futures study analysis of the technological evolution of the EAF by 2010”, Revue de Métallurgie, vol. 97, no. 11, pp. 1347-1363, 2000. DOI: 10.1051/metal:2000114
“Redes neuronales profundas - Tipos y Características - Código Fuente”, Código Fuente, 2021. [Online]. Available: https://www.codigofuente.org/redes-neuronales-profundas-tipos-caracteristicas/. [Accessed: 17-Jul-2021].
“Illustrated Guide to LSTM’s and GRU’s: A step by step explanation”, Medium, 2021. [Online]. Available: https://towardsdatascience.com/illustrated-guide-to-lstms-and-grus-a-step-by-step-explanation-44e9eb85bf21. [Accessed: 17-Jul-2021].
“Major Mines & Projects | Cerro Matoso Mine”, Miningdataonline.com, 2021. [Online]. Available: https://miningdataonline.com/property/336/Cerro-Matoso-Mine.aspx. [Accessed: 25-Nov-2021].
Janzen, J.; Gerritsen, T.; Voermann, N.; Veloza, E.R.; Delgado, R.C. Integrated Furnace Controls: Implementation on a Covered-Arc (Shielded Arc) Furnace at Cerro Matoso. In Proceedings of the 10th International Ferroalloys Congress, Cape Town, South Africa, 1–4 Feb. 2004; pp. 659–669.
R. Garcia-Segura, J. Vázquez Castillo, F. Martell-Chavez, O. Longoria-Gandara, and J. Ortegón Aguilar, “Electric Arc Furnace Modeling with Artificial Neural Networks and Arc Length with Variable Voltage Gradient,” Energies, vol. 10, no. 9, p. 1424, Sep. 2017
C. Chen, Y. Liu, M. Kumar, and J. Qin, “Energy Consumption Modelling Using Deep Learning Technique — A Case Study of EAF”, Procedia CIRP, vol. 72, pp. 1063-1068, 2018. DOI: 10.1016/j.procir.2018.03.095.
S. Ismaeel, A. Miri, A. Sadeghian, and D. Chourishi, “An Extreme Learning Machine (ELM) Predictor for Electric Arc Furnaces’ v-i Characteristics,” 2015 IEEE 2nd International Conference on Cyber Security and Cloud Computing, 2015, pp. 329-334. DOI: 10.1109/CSCloud.2015.94.
J. Mesa Fernández, V. Cabal, V. Montequin and J. Balsera, “Online estimation of electric arc furnace tap temperature by using fuzzy neural networks”, Engineering Applications of Artificial Intelligence, vol. 21, no. 7, pp. 1001-1012, 2008. DOI: 10.1016/j.engappai.2007.11.008.
M. Kordos, M. Blachnik and T. Wieczorek, “Temperature Prediction in Electric Arc Furnace with Neural Network Tree”, Lecture Notes in Computer Science, pp. 71-78, 2011. DOI: 10.1007/978-3-642-21738-8_10.
J. Camacho et al., “A Data Cleaning Approach for a Structural Health Monitoring System in a 75 MW Electric Arc Ferronickel Furnace”, Proceedings of 7th International Electronic Conference on Sensors and Applications, 2020. DOI: 10.3390/ecsa-7-08245.
J. Leon-Medina et al., “Deep Learning for the Prediction of Temperature Time Series in the Lining of an Electric Arc Furnace for Structural Health Monitoring at Cerro Matoso S.A. (CMSA)”, Proceedings of 7th International Electronic Conference on Sensors and Applications, 2020. DOI: 10.3390/ecsa-7-08246.
J. Leon-Medina et al., “Temperature Prediction Using Multivariate Time Series Deep Learning in the Lining of an Electric Arc Furnace for Ferronickel Production”, Sensors, vol. 21, no. 20, p. 6894, 2021. DOI: 10.3390/s21206894.
R. Wan, S. Mei, J. Wang, M. Liu, and F. Yang, “Multivariate Temporal Convolutional Network: A Deep Neural Networks Approach for Multivariate Time Series Forecasting”, Electronics, vol. 8, no. 8, p. 876, 2019. DOI: 10.3390/electronics8080876
S. Shih, F. Sun, and H. Lee, “Temporal pattern attention for multivariate time series forecasting”, Machine Learning, vol. 108, no. 8-9, pp. 1421-1441, 2019. DOI: 10.1007/s10994-019-05815-0
S. Du, T. Li, Y. Yang and S. Horng, “Multivariate time series forecasting via attention-based encoder–decoder framework”, Neurocomputing, vol. 388, pp. 269-279, 2020. DOI: 10.1016/j.neucom.2019.12.118.
S. Huang, D. Wang, X. Wu, and A. Tang, “DSANet: Dual Self-Attention Network for Multivariate Time Series Forecasting”, Proceedings of the 28th ACM International Conference on Information and Knowledge Management, 2019. DOI: 10.1145/3357384.3358132
CMSA, PR032018OP - Manual del Sistema de Control Estructural del Horno Eléctrico 412-FC-01, 02 ed., 2017.
D. F. Godoy-Rojas et al., “Attention-Based Deep Recurrent Neural Network to Forecast the Temperature Behavior of an Electric Arc Furnace Side-Wall,” Sensors, vol. 22, no. 4, p. 1418, Feb. 2022, doi: 10.3390/s22041418.
American Petroleum Institute (API), “API RP 551 - Process Measurement”, 2nd ed., pp. 30-36, February 2016. Available: https://standards.globalspec.com/std/9988220/API%20RP
ASTM E230/E230M-17, “Specification for temperature-electromotive force (EMF) tables for standardized thermocouples”, ASTM International. DOI: 10.1520/e0230_e0230m-17.
W. W. S. Wei, “Time Series analysis”, Oxford Handbooks Online, pp. 458–485, 2013.
J. D. Hamilton, “Time Series analysis”, Princeton, NJ: Princeton University Press, 2020.
P. P. Shinde and S. Shah, “A Review of Machine Learning and Deep Learning Applications”, 2018 Fourth International Conference on Computing Communication Control and Automation (ICCUBEA), Pune, India, 2018, pp. 1-6. DOI: 10.1109/ICCUBEA.2018.8697857.
LeCun, Y., Bengio, Y. and Hinton, G. Deep learning. Nature 521, 436–444 (2015). https://doi.org/10.1038/nature14539.
L. Zhang, J. Tan, D. Han, and H. Zhu, “From machine learning to Deep Learning: Progress in Machine Intelligence for Rational Drug Discovery”, Drug Discovery Today, vol. 22, no. 11, pp. 1680–1685, 2017.
C. M. Bishop, “Neural networks and their applications”, Review of Scientific Instruments, vol. 65, no. 6, pp. 1803–1832, 1994.
K. Suzuki, Ed., “Artificial Neural Networks - Architectures and Applications”, Jan. 2013, doi: 10.5772/3409.
C. Zanchettin and T. B. Ludermir, “A methodology to train and improve artificial neural networks weights and connections”, The 2006 IEEE International Joint Conference on Neural Network Proceedings.
Sibi, P., S. Allwyn Jones, and P. Siddarth., “Analysis of different activation functions using back propagation neural networks”, Journal of theoretical and applied information technology 47.3 (2013): 1264-1268.
A. D. Rasamoelina, F. Adjailia and P. Sincák, “A Review of Activation Function for Artificial Neural Network”, 2020 IEEE 18th World Symposium on Applied Machine Intelligence and Informatics (SAMI), Herlany, Slovakia, 2020, pp. 281-286. DOI: 10.1109/SAMI48414.2020.9108717
Sharma, Sagar, Simone Sharma, and Anidhya Athaiya, “Activation functions in neural networks”, towards data science 6.12 (2017): 310-316.
Elliott, David L., “A better activation function for artificial neural networks”, 1993.
XU, Jingyi, et al., “A semantic loss function for deep learning with symbolic knowledge”, International conference on machine learning. PMLR, 2018. p. 5502-5511.
LEE, Tae-Hwy., “Loss functions in time series forecasting”, International encyclopedia of the social sciences, 2008, p. 495-502.
HODSON, Timothy O., “Root-mean-square error (RMSE) or mean absolute error (MAE): when to use them or not”, Geoscientific Model Development, 2022, vol. 15, no 14, p. 5481-5487.
C. Alippi, “Weight update in back-propagation neural networks: The role of activation functions”, 1991 IEEE International Joint Conference on Neural Networks, 1991.
D. Svozil, V. Kvasnicka, and Pospichal Jirí, “Introduction to multi-layer feed-forward neural networks”, Chemometrics and Intelligent Laboratory Systems, vol. 39, no. 1, pp. 43–62, 1997.
Ruder, S., “An overview of gradient descent optimization algorithms”, arXiv:1609.04747
S.-ichi Amari, “Backpropagation and stochastic gradient descent method,” Neurocomputing, vol. 5, no. 4-5, pp. 185–196, 1993.
Smith, Leslie N., “A disciplined approach to neural network hyper-parameters: Part 1–learning rate, batch size, momentum, and weight decay” arXiv:1803.09820, 2018.
N. Bacanin, T. Bezdan, E. Tuba, I. Strumberger, and M. Tuba, “Optimizing convolutional neural network hyperparameters by enhanced swarm intelligence metaheuristics”, Algorithms, vol. 13, no. 3, p. 67, 2020.
M. Kuan and K. Hornik, “Convergence of learning algorithms with constant learning rates”, IEEE Transactions on Neural Networks, vol. 2, no. 5, pp. 484-489, Sept. 1991, doi: 10.1109/72.134285.
D. R. Wilson and T. R. Martinez, “The need for small learning rates on large problems”, IJCNN’01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222), Washington, DC, USA, 2001, pp. 115-119 vol. 1. DOI: 10.1109/IJCNN.2001.939002
Y. Bengio, “Gradient-based optimization of hyperparameters”, Neural Computation, vol. 12, no. 8, pp. 1889–1900, 2000.
“Recurrent neural networks architectures”, Wiley Series in Adaptive and Learning Systems for Signal Processing, Communications and Control, pp. 69–89.
A. L. Caterini and D. E. Chang, “Recurrent neural networks”, Deep Neural Networks in a Mathematical Framework, pp. 59–79, 2018.
Sutskever, Ilya, “Training recurrent neural networks” Toronto, ON, Canada: University of Toronto, 2013.
A. Sherstinsky, “Fundamentals of Recurrent Neural Network (RNN) and long short-term memory (LSTM) network”, Physica D: Nonlinear Phenomena, vol. 404, p. 132306, 2020
S. Hochreiter and J. Schmidhuber, “Long Short-Term Memory”, In Neural Computation, vol. 9, no. 8, pp. 1735-1780, 15 Nov. 1997, doi: 10.1162/neco.1997.9.8.1735.
Y. Hua, Z. Zhao, R. Li, X. Chen, Z. Liu and H. Zhang, “Deep Learning with Long Short-Term Memory for Time Series Prediction”, In IEEE Communications Magazine, vol. 57, no. 6, pp. 114-119, June 2019, doi: 10.1109/MCOM.2019.1800155.
X. Song, Y. Liu, L. Xue, J. Wang, J. Zhang, J. Wang, L. Jiang, and Z. Cheng, “Time-series well performance prediction based on Long Short-Term Memory (LSTM) neural network model”, Journal of Petroleum Science and Engineering, vol. 186, p. 106682, 2020.
R. Dey and F. M. Salem, “Gate-variants of Gated Recurrent Unit (GRU) neural networks”, 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS), Boston, MA, USA, 2017, pp. 1597-1600. DOI: 10.1109/MWSCAS.2017.8053243.
Y. Wang, W. Liao, and Y. Chang, “Gated recurrent unit network-based short-term photovoltaic forecasting”, Energies, vol. 11, no. 8, p. 2163, 2018.
H. Lin, A. Gharehbaghi, Q. Zhang, S. S. Band, H. T. Pai, K.-W. Chau, and A. Mosavi, “Time Series-based groundwater level forecasting using gated recurrent unit deep neural networks”, Engineering Applications of Computational Fluid Mechanics, vol. 16, no. 1, pp. 1655–1672, 2022
S. H. Park, B. Kim, C. M. Kang, C. C. Chung and J. W. Choi, “Sequence-to-Sequence Prediction of Vehicle Trajectory via LSTM Encoder-Decoder Architecture”, 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 2018, pp. 1672-1678, doi: 10.1109/IVS.2018.8500658.
R. Laubscher, “Time-series forecasting of coal-fired power plant reheater metal temperatures using encoder-decoder recurrent neural networks”, Energy, vol. 189, p. 116187, 2019.
C. Olah and S. Carter, “Attention and augmented recurrent neural networks”, Distill, 08-Sep-2016. [Online]. Available: https://distill.pub/2016/augmented-rnns/. [Accessed: 15-Sep-2022].
Z. Niu, G. Zhong, and H. Yu, “A review on the attention mechanism of Deep Learning”, Neurocomputing, vol. 452, pp. 48–62, 2021.
Y. Qin, D. Song, H. Chen, W. Cheng, G. Jiang, and G. W. Cottrell, “A dual-stage attention-based recurrent neural network for time series prediction”, Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, 2017.
IT-0003-A28-C3-V1-18.11.2019 - Informe preliminar con análisis estadístico de datos y correlaciones posibles.
IT-O3O4-C15C34.2.3-V1-17.06.2020 - Informe técnico de caracterización e identificación de variables del horno línea 1 FC01.
IT-O3O4.C38.2.1-V1-04.10.2021 - Informe técnico de caracterización e identificación de variables del horno línea 2 FC150.
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.license.spa.fl_str_mv Atribución-NoComercial 4.0 Internacional
dc.rights.uri.spa.fl_str_mv http://creativecommons.org/licenses/by-nc/4.0/
dc.rights.accessrights.spa.fl_str_mv info:eu-repo/semantics/openAccess
rights_invalid_str_mv Atribución-NoComercial 4.0 Internacional
http://creativecommons.org/licenses/by-nc/4.0/
http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv openAccess
dc.format.extent.spa.fl_str_mv xvi, 90 páginas
dc.format.mimetype.spa.fl_str_mv application/pdf
dc.publisher.spa.fl_str_mv Universidad Nacional de Colombia
dc.publisher.program.spa.fl_str_mv Bogotá - Ingeniería - Maestría en Ingeniería - Automatización Industrial
dc.publisher.faculty.spa.fl_str_mv Facultad de Ingeniería
dc.publisher.place.spa.fl_str_mv Bogotá, Colombia
dc.publisher.branch.spa.fl_str_mv Universidad Nacional de Colombia - Sede Bogotá
institution Universidad Nacional de Colombia
bitstream.url.fl_str_mv https://repositorio.unal.edu.co/bitstream/unal/83956/6/1031173820.2023.pdf
https://repositorio.unal.edu.co/bitstream/unal/83956/5/license.txt
https://repositorio.unal.edu.co/bitstream/unal/83956/7/1031173820.2023.pdf.jpg
bitstream.checksum.fl_str_mv 9043d9f5323483aaadad6a50d0b94040
eb34b1cf90b7e1103fc9dfd26be24b4a
f5b7297968ef5ae0576a026fa3f11189
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
repository.name.fl_str_mv Repositorio Institucional Universidad Nacional de Colombia
repository.mail.fl_str_mv repositorio_nal@unal.edu.co
_version_ 1814090051399712768
spelling (consolidated field; the metadata values and references already listed above are not repeated here)
Abstract: This document details the workflow followed to develop deep learning models for estimating the mean wall temperature in two electric arc furnaces belonging to the company Cerro Matoso S.A. The document begins by establishing the context in which this final master's degree project was developed, and then gives a theoretical description of the relevant aspects and generalities of the plant's operation, of time series, and of the deep learning techniques required during the project. The workflow follows a three-step methodology: the study and preparation of the data set provided by CMSA; the development, training, and selection of several deep learning models, whose predictions on a test set obtained RMSE errors between 1 and 2 °C; and a validation stage that studies the performance of the models obtained under variations in the training conditions. (Text taken from the source.)
Notes: Contains diagrams, formulas, illustrations, and tables. This work was carried out within the framework of the collaboration between Universidad Nacional de Colombia and Cerro Matoso S.A., funded by the Colombian Ministry of Science through call 786: “Convocatoria para el registro de proyectos que aspiran a obtener beneficios tributarios por inversión en CTeI”. All of the records used in this project are private and belong to Cerro Matoso S.A.; they may not be published, shared, or reproduced, in whole or in part, without the knowledge and express authorization of Cerro Matoso S.A.
Degree: Magíster en Ingeniería - Automatización Industrial (Automatización de Procesos y Máquinas)
Intended audience: Students, researchers, teachers, general public
Bitstreams:
ORIGINAL: 1031173820.2023.pdf (application/pdf, 5789571 bytes) - Tesis de Maestría en Automatización Industrial - MD5 9043d9f5323483aaadad6a50d0b94040
LICENSE: license.txt (text/plain; charset=utf-8, 5879 bytes) - MD5 eb34b1cf90b7e1103fc9dfd26be24b4a
THUMBNAIL: 1031173820.2023.pdf.jpg (image/jpeg, 4792 bytes) - Generated Thumbnail - MD5 f5b7297968ef5ae0576a026fa3f11189
Record unal/83956 (oai:repositorio.unal.edu.co:unal/83956), last updated 2024-08-09 23:19:53