<?xml version="1.0" encoding="UTF-8"?>
<record>
  <title>A Hybrid Feature Selection Algorithm Based on Information Gain and Sequential Forward Floating Search</title>
  <journal>Journal of Intelligent Computing</journal>
  <author>Jianli Ding, Liyang Fu</author>
  <volume>9</volume>
  <issue>3</issue>
  <year>2018</year>
  <doi></doi>
  <url>http://www.dline.info/jic/fulltext/v9n3/jicv9n3_1.pdf</url>
  <abstract>As an important pre-processing step in machine learning, feature selection eliminates data redundancy and
effectively reduces feature dimensionality and computational time complexity. To further reduce the number of iterations
of the feature selection algorithm and improve classification accuracy, a new hybrid feature selection algorithm is proposed
that combines a filter algorithm based on information gain with a wrapper algorithm based on Sequential
Forward Floating Search (SFFS) and a Decision Tree (DT). An optimal candidate feature subset is quickly found by ranking
the features by information gain. To avoid the nesting effect among features, the SFFS algorithm is then used to further reduce
the dimensionality of the optimal candidate feature subset. Experiments show that the ratio of removed features
to initial features reaches up to 92.86%. Compared with other feature selection algorithms, the number of iterations
decreases by up to about 67.8%, and classification accuracy increases by up to about 10.5%. These results
show that the hybrid algorithm achieves higher computational efficiency and classification accuracy.</abstract>
</record>
