Hybrid Binary Dragonfly Algorithm with Simulated Annealing for Feature Selection
File version
Author(s)
Chantar, H
Tubishat, Mohammad
Essgaer, Mansour
Mirjalili, Seyedali
Griffith University Author(s)
Primary Supervisor
Other Supervisors
Editor(s)
Date
Size
File type(s)
Location
License
Abstract
Various fields are affected by the growth of data dimensionality. The major problems resulting from high-dimensional data include high memory requirements, high computational cost, and poor machine learning classifier performance. Proper selection of relevant features from the set of available features, together with the removal of irrelevant ones, alleviates these problems. To solve the feature selection problem, an improved version of the Dragonfly Algorithm (DA) is proposed by combining it with Simulated Annealing (SA); the improved algorithm is named BDA-SA. To overcome the local optima problem of DA and enhance its ability to select the best subset of features for classification problems, Simulated Annealing is applied to the best solution found by the Binary Dragonfly Algorithm in an attempt to improve its accuracy. A set of frequently used datasets from the UCI repository was utilized to evaluate the performance of the proposed FS approach. Results show that the proposed hybrid approach, BDA-SA, has superior performance compared with wrapper-based FS methods, including a feature selection method based on the basic version of the Binary Dragonfly Algorithm.
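To illustrate the refinement step described in the abstract, the sketch below applies a simple simulated annealing pass to a binary feature mask using a wrapper (classifier-in-the-loop) fitness. It is a minimal illustrative example rather than the authors' implementation: the KNN wrapper, the accuracy/feature-count weighting, the geometric cooling schedule, and the scikit-learn breast cancer dataset are all assumptions chosen to keep the example self-contained. In the actual BDA-SA hybrid, the starting mask would be the best solution produced by the binary dragonfly search; here a random mask stands in for it.

    # Illustrative sketch of SA refinement of a binary feature mask (not the authors' code).
    # Assumptions: KNN wrapper fitness, geometric cooling, scikit-learn breast cancer data.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(42)
    X, y = load_breast_cancer(return_X_y=True)

    def fitness(mask):
        """Wrapper fitness: cross-validated KNN accuracy on the selected features,
        lightly rewarded for keeping fewer features (weights are assumed)."""
        if not mask.any():
            return 0.0
        acc = cross_val_score(KNeighborsClassifier(5), X[:, mask], y, cv=3).mean()
        return 0.99 * acc + 0.01 * (1.0 - mask.mean())

    def simulated_annealing(mask, t0=1.0, cooling=0.95, steps=100):
        """Refine a binary feature mask by flipping one bit per step and accepting
        worse moves with a temperature-dependent probability."""
        best, best_fit = mask.copy(), fitness(mask)
        cur, cur_fit, t = best.copy(), best_fit, t0
        for _ in range(steps):
            cand = cur.copy()
            cand[rng.integers(len(cand))] ^= True  # flip one randomly chosen feature bit
            cand_fit = fitness(cand)
            if cand_fit >= cur_fit or rng.random() < np.exp((cand_fit - cur_fit) / t):
                cur, cur_fit = cand, cand_fit
                if cur_fit > best_fit:
                    best, best_fit = cur.copy(), cur_fit
            t *= cooling  # geometric cooling schedule (assumed)
        return best, best_fit

    # In a BDA-SA style hybrid, `initial` would be the best solution found by the
    # binary dragonfly search; a random mask is used here purely for illustration.
    initial = rng.random(X.shape[1]) < 0.5
    refined, score = simulated_annealing(initial.copy())
    print(f"selected {refined.sum()} of {len(refined)} features, fitness {score:.3f}")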
Journal Title
SN Computer Science
Conference Title
Book Title
Edition
Volume
2
Issue
4
Thesis Type
Degree Program
School
Publisher link
Patent number
Funder(s)
Grant identifier(s)
Rights Statement
Item Access Status
Note
Access the data
Related item(s)
Subject
Artificial intelligence
Dragonfly algorithm
Feature selection
Optimization
Simulated annealing algorithm
Persistent link to this record
Citation
Chantar, H; Tubishat, M; Essgaer, M; Mirjalili, S, Hybrid Binary Dragonfly Algorithm with Simulated Annealing for Feature Selection, SN Computer Science, 2021, 2 (4), pp. 295