A framework for online prediction using kernel adaptive filtering

Nowadays, online prediction is an essential field of study in machine learning. Adaptive filters based on kernel methods have taken on an essential role in this type of task, primarily due to their universal approximation property, their ability to solve nonlinear problems, and their modest computational cost...

Full description

Authors:
León Gómez, Eder Arley
Resource type:
Working paper
Publication date:
2019
Institution:
Universidad Nacional de Colombia
Repository:
Universidad Nacional de Colombia
Language:
eng
OAI Identifier:
oai:repositorio.unal.edu.co:unal/75985
Online access:
https://repositorio.unal.edu.co/handle/unal/75985
Keywords:
620 - Ingeniería y operaciones afines
Machine learning
Forecasts
Kernel adaptive filtering
Dictionary
Learning rate
Kernel bandwidth
Adaptive clustering
Aprendizaje de máquina
Predicción
Filtros adaptativos Kernel
Diccionario
Tasa de aprendizaje
Ancho de banda del Kernel
Agrupamiento adaptativo
Rights
openAccess
License
Attribution-NonCommercial 4.0 International
id UNACIONAL2_2e0ba94f19f38e0adf8c81595ebb7cbc
oai_identifier_str oai:repositorio.unal.edu.co:unal/75985
network_acronym_str UNACIONAL2
network_name_str Universidad Nacional de Colombia
repository_id_str
dc.title.spa.fl_str_mv A framework for online prediction using kernel adaptive filtering
dc.title.alternative.spa.fl_str_mv Marco de predicción en línea usando filtros adaptativos kernel
title A framework for online prediction using kernel adaptive filtering
spellingShingle A framework for online prediction using kernel adaptive filtering
620 - Ingeniería y operaciones afines
Machine learning
Forecasts
Kernel adaptive filtering
Dictionary
Learning rate
Kernel bandwidth
Adaptive clustering
Aprendizaje de máquina
Predicción
Filtros adaptativos Kernel
Diccionario
Tasa de aprendizaje
Ancho de banda del Kernel
Agrupamiento adaptativo
title_short A framework for online prediction using kernel adaptive filtering
title_full A framework for online prediction using kernel adaptive filtering
title_fullStr A framework for online prediction using kernel adaptive filtering
title_full_unstemmed A framework for online prediction using kernel adaptive filtering
title_sort A framework for online prediction using kernel adaptive filtering
dc.creator.fl_str_mv León Gómez, Eder Arley
dc.contributor.advisor.spa.fl_str_mv Castellanos Domínguez, César Germán
Garcia Vega, Sergio
dc.contributor.author.spa.fl_str_mv León Gómez, Eder Arley
dc.contributor.corporatename.spa.fl_str_mv Universidad Nacional de Colombia
dc.contributor.researchgroup.spa.fl_str_mv Grupo de Control y Procesamiento Digital de Señales
dc.subject.ddc.spa.fl_str_mv 620 - Ingeniería y operaciones afines
topic 620 - Ingeniería y operaciones afines
Machine learning
Forecasts
Kernel adaptive filtering
Dictionary
Learning rate
Kernel bandwidth
Adaptive clustering
Aprendizaje de máquina
Predicción
Filtros adaptativos Kernel
Diccionario
Tasa de aprendizaje
Ancho de banda del Kernel
Agrupamiento adaptativo
dc.subject.proposal.eng.fl_str_mv Machine learning
Forecasts
Kernel adaptive filtering
Dictionary
Learning rate
Kernel bandwidth
Adaptive clustering
dc.subject.proposal.spa.fl_str_mv Aprendizaje de máquina
Predicción
Filtros adaptativos Kernel
Diccionario
Tasa de aprendizaje
Ancho de banda del Kernel
Agrupamiento adaptativo
description Nowadays, online prediction is an essential field of study in machine learning. Adaptive filters based on kernel methods have taken on an essential role in this type of task, primarily due to their universal approximation property, their ability to solve nonlinear problems, and their modest computational cost. However, although they offer significant advantages over similar methods, they still present several challenges: (1) tuning the kernel bandwidth and learning-rate parameters; (2) the limit on model size imposed by the number of elements the filter dictionary may contain; and (3) the efficient construction and modeling of multiple filters. Improving these conditions yields a better representation of time-series dynamics, which translates into a lower prediction error. This thesis addresses the issues above through three proposals. The first is an iterative search for an adequate kernel bandwidth and learning rate, achieved by minimizing the correntropy within a proposed cost function. The second contribution is a scheme for the sequential construction of filters which, unlike other state-of-the-art methods, does not restrict the samples to a single dictionary and additionally updates the weights of samples shared across several filters. The third and last is the integration of a kernel-bandwidth update method with another that sequentially builds a filter bank. The proposed frameworks were validated on both synthetic and real-world data sets. Overall, the results show an improvement in the convergence rate and a reduction of the mean square error and dictionary size compared with different state-of-the-art filters and, in one specific case, a neural network.
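For context: the kernel adaptive filters referred to above extend the least-mean-square (LMS) update into a reproducing kernel Hilbert space. What follows is a minimal, illustrative Python sketch of the kernel least-mean-square (KLMS) baseline of Liu, Pokharel, and Principe (2008) cited in the references, which the thesis's contributions build on; the class and variable names are our own, and the kernel bandwidth and step size are held fixed here, these being precisely the two parameters the thesis adapts online.

import numpy as np

def gaussian_kernel(x, y, bandwidth):
    # Gaussian (RBF) kernel between two input vectors.
    diff = x - y
    return np.exp(-np.dot(diff, diff) / (2.0 * bandwidth ** 2))

class KLMS:
    # Minimal kernel least-mean-square filter with a fixed step size and
    # fixed kernel bandwidth (illustrative sketch, not the thesis code).
    def __init__(self, step_size=0.5, bandwidth=0.5):
        self.step_size = step_size  # learning rate (eta)
        self.bandwidth = bandwidth  # kernel bandwidth (sigma)
        self.centres = []           # dictionary of stored input vectors
        self.weights = []           # expansion coefficients

    def predict(self, x):
        # Evaluate the kernel expansion at input x (0 before any update).
        return sum(w * gaussian_kernel(x, c, self.bandwidth)
                   for w, c in zip(self.weights, self.centres))

    def update(self, x, d):
        # One online step: predict, compute the error, grow the dictionary.
        e = d - self.predict(x)
        self.centres.append(x)
        self.weights.append(self.step_size * e)
        return e

# Usage: one-step-ahead prediction of a noisy sine series from lag vectors.
rng = np.random.default_rng(0)
series = np.sin(0.1 * np.arange(500)) + 0.05 * rng.standard_normal(500)
lag = 5
f = KLMS()
errors = [f.update(series[t - lag:t], series[t]) for t in range(lag, len(series))]
print("MSE over the last 100 steps:", np.mean(np.square(errors[-100:])))

In this baseline the dictionary grows by one centre per sample, which is exactly the model-size limitation noted in challenge (2) above; the thesis's dictionary-sharing and filter-bank schemes would replace the unconditional append with a dictionary-management rule.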
publishDate 2019
dc.date.issued.spa.fl_str_mv 2019
dc.date.accessioned.spa.fl_str_mv 2020-03-09T15:52:43Z
dc.date.available.spa.fl_str_mv 2020-03-09T15:52:43Z
dc.type.spa.fl_str_mv Documento de trabajo
dc.type.driver.spa.fl_str_mv info:eu-repo/semantics/workingPaper
dc.type.version.spa.fl_str_mv info:eu-repo/semantics/acceptedVersion
dc.type.coar.spa.fl_str_mv http://purl.org/coar/resource_type/c_8042
dc.type.content.spa.fl_str_mv Text
dc.type.redcol.spa.fl_str_mv http://purl.org/redcol/resource_type/WP
format http://purl.org/coar/resource_type/c_8042
status_str acceptedVersion
dc.identifier.uri.none.fl_str_mv https://repositorio.unal.edu.co/handle/unal/75985
url https://repositorio.unal.edu.co/handle/unal/75985
dc.language.iso.spa.fl_str_mv eng
language eng
dc.relation.references.spa.fl_str_mv S. Shanmuganathan and S. Samarasinghe, Artificial neural network modelling, vol. 628. Springer, 2016.
Y. Cui, S. Ahmad, and J. Hawkins, “Continuous online sequence learning with an unsupervised neural network model,” Neural computation, vol. 28, no. 11, pp. 2474–2504, 2016.
C. Deb, F. Zhang, J. Yang, S. E. Lee, and K. W. Shah, “A review on time series forecasting techniques for building energy consumption,” Renewable and Sustainable Energy Reviews, vol. 74, pp. 902–924, 2017.
Y. Feng, P. Zhang, M. Yang, Q. Li, and A. Zhang, “Short term load forecasting of offshore oil field microgrids based on da-svm,” Energy Procedia, vol. 158, pp. 2448–2455, 2019.
B. Schölkopf, A. J. Smola, et al., Learning with kernels: support vector machines, regularization, optimization, and beyond. MIT press, 2002.
F. Girosi, M. Jones, and T. Poggio, “Regularization theory and neural networks architectures,” Neural computation, vol. 7, no. 2, pp. 219–269, 1995.
B. Schölkopf, A. Smola, and K.-R. Müller, “Nonlinear component analysis as a kernel eigenvalue problem,” Neural computation, vol. 10, no. 5, pp. 1299–1319, 1998.
W. Liu, P. P. Pokharel, and J. C. Principe, “The kernel least-mean-square algorithm,” IEEE Transactions on Signal Processing, vol. 56, no. 2, pp. 543–554, 2008.
Y. Engel, S. Mannor, and R. Meir, “The kernel recursive least-squares algorithm,” IEEE Transactions on signal processing, vol. 52, no. 8, pp. 2275–2285, 2004.
W. Liu, I. Park, Y. Wang, and J. C. Príncipe, “Extended kernel recursive least squares algorithm,” IEEE Transactions on Signal Processing, vol. 57, no. 10, pp. 3801–3814, 2009.
B. Chen, S. Zhao, P. Zhu, and J. C. Príncipe, “Quantized kernel least mean square algorithm,” IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 1, pp. 22–32, 2012.
H. Fan and Q. Song, “A linear recurrent kernel online learning algorithm with sparse updates,” Neural Networks, vol. 50, pp. 142–153, 2014.
W. Liu, I. Park, and J. C. Principe, “An information theoretic approach of designing sparse kernel adaptive filters,” IEEE Transactions on Neural Networks, vol. 20, no. 12, pp. 1950–1961, 2009.
W. Ao, W.-Q. Xiang, Y.-P. Zhang, L. Wang, C.-Y. Lv, and Z.-H. Wang, “A new variable step size lms adaptive filtering algorithm,” in 2012 International Conference on Computer Science and Electronics Engineering, vol. 2, pp. 265–268, IEEE, 2012.
Q. Niu and T. Chen, “A new variable step size lms adaptive algorithm,” in 2018 Chinese Control And Decision Conference (CCDC), pp. 1–4, IEEE, 2018.
S. Garcia-Vega, X.-J. Zeng, and J. Keane, “Learning from data streams using kernel least-mean-square with multiple kernel-sizes and adaptive step-size,” Neurocomputing, vol. 339, pp. 105–115, 2019.
B. Chen, J. Liang, N. Zheng, and J. C. Príncipe, “Kernel least mean square with adaptive kernel size,” Neurocomputing, vol. 191, pp. 95–106, 2016.
Q. Sun, L. Dang, W. Wang, and S. Wang, “Kernel least mean square algorithm with mixed kernel,” in 2018 Tenth International Conference on Advanced Computational Intelligence (ICACI), pp. 140–144, IEEE, 2018.
J. Platt, A resource-allocating network for function interpolation. MIT Press, 1991.
L. Csató and M. Opper, “Sparse on-line gaussian processes,” Neural computation, vol. 14, no. 3, pp. 641–668, 2002.
K. Li and J. C. Principe, “Transfer learning in adaptive filters: The nearest instance centroid-estimation kernel least-mean-square algorithm,” IEEE Transactions on Signal Processing, vol. 65, no. 24, pp. 6520–6535, 2017.
G. Wahba, Spline models for observational data, vol. 59. SIAM, 1990.
J. Racine, “An efficient cross-validation algorithm for window width selection for nonparametric kernel regression,” Communications in Statistics-Simulation and Computation, vol. 22, no. 4, pp. 1107–1114, 1993.
E. Herrmann, “Local bandwidth choice in kernel regression estimation,” Journal of Computational and Graphical Statistics, vol. 6, no. 1, pp. 35–54, 1997.
B. W. Silverman, Density estimation for statistics and data analysis. Routledge, 2018.
Y. Gao and S.-L. Xie, “A variable step size lms adaptive filtering algorithm and its analysis,” Acta Electronica Sinica, vol. 29, no. 8, pp. 1094–1097, 2001.
Y. Qian, “A new variable step size algorithm applied in lms adaptive signal processing,” in 2016 Chinese Control and Decision Conference (CCDC), pp. 4326–4329, IEEE, 2016.
UPME, Plan de Expansión de Referencia Generación-Transmisión 2017 - 2031. Unidad de Planeación Minero Energética, 2017.
S. Sagiroglu and D. Sinanc, “Big data: A review,” in 2013 International Conference on Collaboration Technologies and Systems (CTS), pp. 42–47, IEEE, 2013.
N. Aronszajn, “Theory of reproducing kernels,” Transactions of the American mathematical society, vol. 68, no. 3, pp. 337–404, 1950.
C. J. Burges, “A tutorial on support vector machines for pattern recognition,” Data mining and knowledge discovery, vol. 2, no. 2, pp. 121–167, 1998.
B. Chen, L. Li, W. Liu, and J. C. Príncipe, “Nonlinear adaptive filtering in kernel spaces,” in Springer Handbook of Bio-/Neuroinformatics, pp. 715–734, Springer, 2014.
C. Cheng, A. Sa-Ngasoongsong, O. Beyca, T. Le, H. Yang, Z. Kong, and S. T. Bukkapatnam, “Time series forecasting for nonlinear and non-stationary processes: a review and comparative study,” IIE Transactions, vol. 47, no. 10, pp. 1053–1071, 2015.
B. Scholkopf and A. J. Smola, Learning with kernels: support vector machines, regularization, optimization, and beyond. MIT press, 2001.
K. I. Kim, M. O. Franz, and B. Scholkopf, “Iterative kernel principal component analysis for image modeling,” IEEE transactions on pattern analysis and machine intelligence, vol. 27, no. 9, pp. 1351–1366, 2005.
T.-T. Frieß and R. F. Harrison, “A kernel based adaline,” in ESANN, vol. 72, pp. 21–23, 1999.
S. An, W. Liu, and S. Venkatesh, “Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression,” Pattern Recognition, vol. 40, no. 8, pp. 2154–2162, 2007.
G. C. Cawley and N. L. Talbot, “Efficient leave-one-out cross-validation of kernel fisher discriminant classifiers,” Pattern Recognition, vol. 36, no. 11, pp. 2585–2592, 2003.
W. Härdle, Applied nonparametric regression. Cambridge: Cambridge University Press, 1990.
C. M. Bishop, Pattern recognition and machine learning. springer, 2006.
R. O. Duda, P. E. Hart, and D. G. Stork, Pattern classification. John Wiley & Sons, 2012.
A. K. Jain, “Data clustering: 50 years beyond k-means,” Pattern recognition letters, vol. 31, no. 8, pp. 651–666, 2010.
J. M. Keller, M. R. Gray, and J. A. Givens, “A fuzzy k-nearest neighbor algorithm,” IEEE transactions on systems, man, and cybernetics, no. 4, pp. 580–585, 1985.
K. Fu, Sequential methods in pattern recognition and machine learning, vol. 52. Academic press, 1968.
C. Richard, J. C. M. Bermudez, and P. Honeine, “Online prediction of time series data with kernels,” IEEE Transactions on Signal Processing, vol. 57, no. 3, pp. 1058–1067, 2008.
C. Henry and R. Williams, “Real-time recursive estimation of statistical parameters,” Analytica chimica acta, vol. 242, pp. 17–23, 1991.
C. G. Bezerra, B. S. J. Costa, L. A. Guedes, and P. P. Angelov, “A new evolving clustering algorithm for online data streams,” in 2016 IEEE Conference on Evolving and Adaptive Intelligent Systems (EAIS), pp. 162–168, IEEE, 2016.
A. G. Salman, Y. Heryadi, E. Abdurahman, and W. Suparta, “Single layer & multilayer long short-term memory (lstm) model with intermediate variables for weather forecasting,” Procedia Computer Science, vol. 135, pp. 89–98, 2018.
M. Yukawa and R.-i. Ishii, “On adaptivity of online model selection method based on multikernel adaptive filtering,” in 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, pp. 1–6, IEEE, 2013.
M. Yukawa, “Multikernel adaptive filtering,” IEEE Transactions on Signal Processing, vol. 60, no. 9, pp. 4672–4682, 2012.
F. A. Tobar, S.-Y. Kung, and D. P. Mandic, “Multikernel least mean square algorithm,” IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 2, pp. 265–277, 2013.
T. Ishida and T. Tanaka, “Multikernel adaptive filters with multiple dictionaries and regularization,” in 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, pp. 1–6, IEEE, 2013.
Q. Sun, L. Dang, W. Wang, and S. Wang, “Kernel least mean square algorithm with mixed kernel,” in 2018 Tenth International Conference on Advanced Computational Intelligence (ICACI), pp. 140–144, IEEE, 2018.
T. Burton, D. Sharpe, N. Jenkins, and E. Bossanyi, Wind energy handbook. John Wiley & Sons, 2001.
M. Elshendy, A. F. Colladon, E. Battistoni, and P. A. Gloor, “Using four different online media sources to forecast the crude oil price,” Journal of Information Science, 2017.
P. J. Brockwell and R. A. Davis, Introduction to time series and forecasting. Springer, 2016.
V. Kotu and B. Deshpande, “Chapter 12: Time series forecasting,” in Data Science, 2019.
Q. Song, X. Zhao, Z. Feng, and B. Song, “Recursive least squares algorithm with adaptive forgetting factor based on echo state network,” in 2011 9th World Congress on Intelligent Control and Automation, pp. 295–298, IEEE, 2011.
S. Wen, R. Hu, Y. Yang, T. Huang, Z. Zeng, and Y.-D. Song, “Memristor-based echo state network with online least mean square,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, no. 99, pp. 1–10, 2018.
W.-Y. Chang, “A literature review of wind forecasting methods,” Journal of Power and Energy Engineering, vol. 2, no. 4, 2014.
C. Voyant, G. Notton, S. Kalogirou, M.-L. Nivet, C. Paoli, F. Motte, and A. Fouilloy, “Machine learning methods for solar radiation forecasting: A review,” Renewable Energy, vol. 105, pp. 569–582, 2017.
G. Bontempi, S. B. Taieb, and Y.-A. Le Borgne, “Machine learning strategies for time series forecasting,” in European business intelligence summer school, pp. 62–77, Springer, 2012.
L. Yu, Y. Zhao, L. Tang, and Z. Yang, “Online big data-driven oil consumption forecasting with google trends,” International Journal of Forecasting, vol. 35, no. 1, pp. 213–223, 2019.
O. Schaer, N. Kourentzes, and R. Fildes, “Demand forecasting with user-generated online information,” International Journal of Forecasting, vol. 35, no. 1, 2019.
A. S. Weigend, Time series prediction: forecasting the future and understanding the past. Routledge, 2018.
F. Kaytez, M. C. Taplamacioglu, E. Cam, and F. Hardalac, “Forecasting electricity consumption: A comparison of regression analysis, neural networks and least squares support vector machines,” International Journal of Electrical Power & Energy Systems, vol. 67, pp. 431–438, 2015.
W. Liu, J. C. Principe, and S. Haykin, Kernel adaptive filtering: a comprehensive introduction, vol. 57. John Wiley & Sons, 2011.
dc.rights.spa.fl_str_mv Derechos reservados - Universidad Nacional de Colombia
dc.rights.coar.fl_str_mv http://purl.org/coar/access_right/c_abf2
dc.rights.license.spa.fl_str_mv Atribución-NoComercial 4.0 Internacional
dc.rights.spa.spa.fl_str_mv Acceso abierto
dc.rights.uri.spa.fl_str_mv http://creativecommons.org/licenses/by-nc/4.0/
dc.rights.accessrights.spa.fl_str_mv info:eu-repo/semantics/openAccess
rights_invalid_str_mv Atribución-NoComercial 4.0 Internacional
Derechos reservados - Universidad Nacional de Colombia
Acceso abierto
http://creativecommons.org/licenses/by-nc/4.0/
http://purl.org/coar/access_right/c_abf2
eu_rights_str_mv openAccess
dc.format.extent.spa.fl_str_mv 64
dc.format.mimetype.spa.fl_str_mv application/pdf
dc.publisher.department.spa.fl_str_mv Departamento de Ingeniería Eléctrica y Electrónica
dc.publisher.branch.spa.fl_str_mv Universidad Nacional de Colombia - Sede Manizales
institution Universidad Nacional de Colombia
bitstream.url.fl_str_mv https://repositorio.unal.edu.co/bitstream/unal/75985/4/1098671785.2019.pdf
https://repositorio.unal.edu.co/bitstream/unal/75985/5/license.txt
https://repositorio.unal.edu.co/bitstream/unal/75985/6/license_rdf
https://repositorio.unal.edu.co/bitstream/unal/75985/7/1098671785.2019.pdf.jpg
bitstream.checksum.fl_str_mv 92e56d7255eec3080e91702cd83bce8d
6f3f13b02594d02ad110b3ad534cd5df
42fd4ad1e89814f5e4a476b409eb708c
6e4260ade198218734a0938562338723
bitstream.checksumAlgorithm.fl_str_mv MD5
MD5
MD5
MD5
repository.name.fl_str_mv Repositorio Institucional Universidad Nacional de Colombia
repository.mail.fl_str_mv repositorio_nal@unal.edu.co
_version_ 1814089252565155840