Boosted local dimensional mutation and all-dimensional neighborhood slime mould algorithm for feature selection


Authors:
Zhou, Xinsen
Chen, Yi
Wu, Zongda
Heidari, Ali Asghar
Chen, Huiling
Alabdulkreem, Eatedal
Escorcia-Gutierrez, José
Wang, Xianchuan
Resource type:
Research article
Publication date:
2023
Institution:
Corporación Universidad de la Costa
Repository:
REDICUC - Repositorio CUC
Language:
eng
OAI Identifier:
oai:repositorio.cuc.edu.co:11323/13556
Online access:
https://hdl.handle.net/11323/13556
https://repositorio.cuc.edu.co/
Keywords:
All-dimension neighborhood search
Classification
Feature selection
Local dimensional mutations
Meta-heuristic
Optimization
Slime mould algorithm
SMA
Rights:
openAccess
License:
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Description:
Summary: The slime mould algorithm (SMA) is a population-based optimization algorithm that mimics the foraging behavior of slime moulds and has a simple structure with few hyperparameters. However, SMA has limitations, such as becoming trapped in local optima when dealing with multimodal or combinatorial functions. To overcome these limitations and improve the algorithm's exploration and exploitation abilities, this work introduces LASMA, a variant of SMA that combines a local dimensional mutation strategy with an all-dimensional neighborhood search strategy. To evaluate the performance of LASMA, experiments were conducted on 30 benchmark functions from the CEC2014 competition, and the results were compared against up to 27 peer algorithms. The experimental results were aggregated and assessed with the Wilcoxon signed-rank test. The results showed that LASMA outperformed the other algorithms in solution accuracy, stability, and convergence speed, with at least a 53.3% improvement in optimization performance on the 30 tested functions. Moreover, to demonstrate the applicability of LASMA to feature selection problems, a binary version called bLASMA was developed and compared with eight binary classification algorithms on 18 datasets from the UCI repository. The experiments showed that bLASMA not only converged faster and more accurately on optimization problems but also performed well in feature selection applications. Thus, LASMA is a promising optimization tool for global and binary optimization problems, and its binary version, bLASMA, can be used for feature selection tasks. By addressing the limitations of SMA and strengthening its exploration and exploitation abilities, LASMA provides a robust and effective solution for a wide range of optimization problems.
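The abstract does not spell out how bLASMA maps continuous slime-mould positions to feature subsets, but binary metaheuristics for feature selection commonly pair a transfer function with a wrapper fitness. The Python sketch below illustrates that general pattern only; the sigmoid transfer function, the KNN wrapper, and the weight alpha are assumptions made for illustration, not the paper's actual settings.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def binarize(position, rng):
    """Map a continuous position vector to a 0/1 feature mask (sigmoid transfer, assumed)."""
    probs = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(position.shape) < probs).astype(int)

def fitness(mask, X, y, alpha=0.99):
    """Weighted trade-off between classification error and subset size (weights assumed)."""
    if mask.sum() == 0:  # empty subsets are invalid
        return 1.0
    clf = KNeighborsClassifier(n_neighbors=5)
    acc = cross_val_score(clf, X[:, mask == 1], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
candidate = rng.normal(size=X.shape[1])  # stand-in for one SMA search agent
mask = binarize(candidate, rng)
print(f"selected {mask.sum()}/{mask.size} features, fitness = {fitness(mask, X, y):.4f}")

In a full optimizer, this fitness would guide the SMA update equations; here a single random candidate is scored simply to show the binarization and evaluation steps.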
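The Wilcoxon signed-rank test mentioned above is a standard non-parametric way to compare two paired sets of benchmark results (for example, best errors achieved by two algorithms on the same functions). A minimal sketch with SciPy, using made-up error values rather than the paper's data:

from scipy.stats import wilcoxon

# Paired best-error values for two optimizers on the same benchmark functions
# (illustrative numbers only, not taken from the paper).
lasma_errors = [0.12, 0.03, 1.45, 0.88, 0.002, 0.31, 2.10, 0.07]
sma_errors = [0.25, 0.09, 1.62, 1.14, 0.004, 0.55, 2.40, 0.06]

# alternative="less" tests whether the first algorithm's errors are significantly lower
stat, p_value = wilcoxon(lasma_errors, sma_errors, alternative="less")
print(f"W = {stat:.1f}, p = {p_value:.4f}")

A small p-value indicates that the paired differences are systematically in favor of the first algorithm, which is how such tests are typically summarized in win/tie/loss tables.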