ERNEAD: Training of Artificial Neural Networks Based on a Genetic Algorithm and Finite Automata Theory


Authors:
Ruiz-Rangel, Jonathan
Ardila Hernandez, Carlos Julio
Maradei Gonzalez, Luis
Jabba Molinares, Daladier
Publication date:
2018
Institution:
Universidad Simón Bolívar
Repository:
Repositorio Digital USB
Language:
eng
OAI Identifier:
oai:bonga.unisimon.edu.co:20.500.12442/1863
Online access:
http://hdl.handle.net/20.500.12442/1863
Keywords:
Finite Deterministic Automaton
Artificial Neural Networks
Genetic Algorithm
EMODS
Backpropagation Algorithm
Conjugate Gradient Algorithm
Levenberg-Marquardt Algorithm
Rights
License
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License
Description:
This paper presents a variation of the EMODS (Evolutionary Metaheuristic of Deterministic Swapping) algorithm at the level of its mutation stage, in order to train artificial neural networks for each problem. The EMODS metaheuristic is a novel framework for the multi-objective optimization of combinatorial problems. The proposed training method for neural networks is named ERNEAD (training of Evolutionary Neural Networks through Evolutionary Strategies and Finite Automata). The process consists of five phases: initial population generation, forward feeding of the network, EMODS search, crossover and evaluation, and finally verification. Applying the process to neural networks generates sets of networks with optimal weights for a particular problem. ERNEAD was applied to two typical problems, breast cancer classification and flower classification, and its solutions were compared with those obtained by the classical Backpropagation, Conjugate Gradient, and Levenberg-Marquardt algorithms. The analysis of the results indicated that ERNEAD produced more accurate solutions than the classical algorithms.
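The five phases named in the abstract can be illustrated with a minimal evolutionary-training sketch. This record does not describe the internals of ERNEAD or EMODS, so everything below is an assumption for illustration: the single-neuron "network", the swap-and-jitter stand-in for the EMODS deterministic-swapping search, and the names `ernead_like_train`, `emods_style_mutation`, and `fitness` are all hypothetical, not the authors' implementation.

```python
import random

random.seed(42)

def forward(weights, x):
    # Phase 2: forward-feed a toy single-neuron network (weighted sum).
    return sum(w * xi for w, xi in zip(weights, x))

def fitness(weights, data):
    # Mean squared error over the training set (lower is better).
    return sum((forward(weights, x) - y) ** 2 for x, y in data) / len(data)

def emods_style_mutation(weights):
    # Phase 3 stand-in: EMODS mutates by deterministic swapping; here we
    # simply swap two weight positions and jitter one of them (assumption).
    w = list(weights)
    i, j = random.sample(range(len(w)), 2)
    w[i], w[j] = w[j], w[i]
    w[i] += random.uniform(-0.5, 0.5)
    return w

def crossover(a, b):
    # Phase 4: single-point crossover between two parent weight vectors.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def ernead_like_train(data, n_weights=2, pop_size=20, generations=60):
    # Phase 1: random initial population of candidate weight vectors.
    pop = [[random.uniform(-1, 1) for _ in range(n_weights)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Phase 4 (evaluation): keep the fitter half as survivors.
        pop.sort(key=lambda w: fitness(w, data))
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            children.append(emods_style_mutation(crossover(a, b)))
        pop = survivors + children
    # Phase 5: verification -- return the best network found.
    return min(pop, key=lambda w: fitness(w, data))

# Toy regression target: y = 2*x0 + 3*x1 on a small grid.
data = [((x0, x1), 2 * x0 + 3 * x1)
        for x0 in range(-2, 3) for x1 in range(-2, 3)]
best = ernead_like_train(data)
print(best, fitness(best, data))
```

The elitist loop (survivors carried over unchanged) guarantees the best fitness never worsens between generations, which is the property that makes the phase-5 verification step meaningful in this sketch.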
Publication details:
International Journal of Artificial Intelligence, Vol. 16, No. 1 (2018)
ISSN: 09740635
Publisher: Editorial Board
Resource type: article
Date available: 2018-03-13
Article URL: http://www.ceser.in/ceserp/index.php/ijai/article/view/5456