ISSN: 3048-6815

Adaptive Hybrid Metaheuristic Optimization Framework for Enhanced Feature Selection and Classification in High-Dimensional Biomedical Datasets

Abstract

The analysis of high-dimensional biomedical datasets presents significant challenges due to the curse of dimensionality, leading to increased computational complexity and reduced classification accuracy. Feature selection, a crucial preprocessing step, aims to identify a subset of relevant features, thereby mitigating these issues. This paper proposes an Adaptive Hybrid Metaheuristic Optimization Framework (AHMOF) for feature selection and classification in high-dimensional biomedical datasets. AHMOF synergistically integrates the strengths of Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) with an adaptive control mechanism to dynamically adjust the balance between exploration and exploitation. The framework employs a novel fitness function that considers both classification accuracy and the number of selected features. Experimental results on several benchmark biomedical datasets demonstrate that AHMOF consistently outperforms traditional feature selection methods and standalone metaheuristic algorithms in terms of classification accuracy, feature subset size, and computational efficiency. The adaptive nature of AHMOF allows it to effectively navigate the complex search space, leading to robust and generalizable feature subsets for improved biomedical data analysis.
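The abstract describes a fitness function that trades classification accuracy against the number of selected features, and an adaptive mechanism that rebalances exploration (GA operators) and exploitation (PSO updates). The exact formulas are not given here, so the sketch below assumes a common weighted-sum fitness and a linear, stagnation-driven mixing rule; all names (`subset_fitness`, `adaptive_ga_share`, `alpha`) are illustrative, not the paper's notation.

```python
def subset_fitness(accuracy, n_selected, n_total, alpha=0.9):
    """Score a candidate feature subset (higher is better).

    Assumed weighted-sum form: reward classification accuracy,
    penalize large subsets via the fraction of features retained.
    """
    if n_selected == 0:
        return 0.0  # an empty subset cannot drive a classifier
    size_term = 1.0 - n_selected / n_total  # 1.0 when few features are kept
    return alpha * accuracy + (1.0 - alpha) * size_term


def adaptive_ga_share(stagnation, max_stagnation=10):
    """Illustrative adaptive control: as the swarm stagnates (no fitness
    improvement for `stagnation` iterations), raise the share of GA
    crossover/mutation to push exploration; otherwise favor PSO updates."""
    return min(1.0, stagnation / max_stagnation)


# Two subsets with equal accuracy: the smaller one scores higher.
print(subset_fitness(0.95, 10, 100))  # 0.945
print(subset_fitness(0.95, 40, 100))  # 0.915
```

Under this assumed form, `alpha` close to 1 prioritizes accuracy, while smaller values press harder toward compact feature subsets; the stagnation counter would be reset whenever the global best improves.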


How to Cite

Akash Verma (2025). Adaptive Hybrid Metaheuristic Optimization Framework for Enhanced Feature Selection and Classification in High-Dimensional Biomedical Datasets. JANOLI International Journal of Artificial Intelligence and its Applications, Issue 3.