Aquila Optimizer for Hyperparameter Metaheuristic Optimization in ELM

Vasquez-Iglesias, Philip; Zabala-Blanco, David; Pizarro, Amelia E.; Fuentes-Concha, Juan; Gonzalez, Paulo

Abstract

This paper introduces the adaptation of the Aquila Optimizer (AO) metaheuristic to optimize the hyperparameters of the Extreme Learning Machine (ELM). The AO algorithm is a swarm-intelligence metaheuristic that optimizes the objective function by simulating the hunting behavior of Aquila eagles. The ELM belongs to the family of single-hidden-layer feed-forward networks, in which the hidden-layer weights are randomly initialized and training reduces to computing the Moore-Penrose pseudoinverse. It is known for faster convergence than traditional methods, providing promising performance with minimal programmer intervention. The proposed method focuses on optimizing the number of hidden neurons of the ELM by maximizing the most popular performance metrics, namely Accuracy and G-Mean. This method offers an alternative to the classic grid search by avoiding the need to evaluate every possible combination in search of the optimal value. We evaluated the approach on three typical datasets and found that it reaches, on average, 99% of the global maximum found by grid search while requiring only about 20% of the search time. In other words, our method can achieve performance close to the global maximum in a fraction of the time required by the brute-force methodology.
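
As a rough illustration of the pipeline the abstract describes, the sketch below trains an ELM with randomly initialized hidden weights and output weights obtained from the Moore-Penrose pseudoinverse, then searches over the number of hidden neurons using validation accuracy as the fitness. For brevity the search samples candidates uniformly at random instead of applying the Aquila Optimizer's exploration/exploitation update rules, and every name and parameter value here (train_elm, search_hidden_neurons, the 10-500 neuron range, the budget of 20 evaluations) is an illustrative assumption, not the authors' implementation.

    import numpy as np

    def train_elm(X, y_onehot, n_hidden, rng):
        """Random hidden weights; output weights via Moore-Penrose pseudoinverse."""
        n_features = X.shape[1]
        W = rng.standard_normal((n_features, n_hidden))   # random input-to-hidden weights
        b = rng.standard_normal(n_hidden)                  # random hidden biases
        H = np.tanh(X @ W + b)                             # hidden-layer activations
        beta = np.linalg.pinv(H) @ y_onehot                # closed-form output weights
        return W, b, beta

    def predict_elm(X, W, b, beta):
        H = np.tanh(X @ W + b)
        return np.argmax(H @ beta, axis=1)

    def accuracy(y_true, y_pred):
        return float(np.mean(y_true == y_pred))

    # Hypothetical stand-in for the metaheuristic: evaluates random candidate
    # neuron counts; a real AO run would move candidates with the Aquila
    # Optimizer's position-update equations instead of uniform sampling.
    def search_hidden_neurons(X_tr, y_tr, X_va, y_va, n_classes,
                              budget=20, low=10, high=500, seed=0):
        rng = np.random.default_rng(seed)
        y_onehot = np.eye(n_classes)[y_tr]                 # integer labels -> one-hot targets
        best_n, best_acc = None, -1.0
        for _ in range(budget):
            n_hidden = int(rng.integers(low, high + 1))    # candidate hyperparameter
            W, b, beta = train_elm(X_tr, y_onehot, n_hidden, rng)
            acc = accuracy(y_va, predict_elm(X_va, W, b, beta))  # fitness = validation accuracy
            if acc > best_acc:
                best_n, best_acc = n_hidden, acc
        return best_n, best_acc

Accuracy is used as the fitness here only because it needs no extra code; the paper also considers G-Mean, which could be substituted by changing the fitness function.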

More information

SCOPUS ID: SCOPUS_ID:85210231164
Journal title: Lecture Notes in Computer Science
Volume: 15369 LNCS
Publisher: Springer, Cham
Publication date: 2025
Start page: 244
End page: 258
DOI: 10.1007/978-3-031-76604-6_18

Notes: SCOPUS