Review On Feature Selection Techniques in Data Mining

Authors

  • S Ramadass Department of Computer Science, Government Arts College, Dharmapuri, India
  • M Gunasekaran Department of Computer Science, Government Arts College, Dharmapuri, India

DOI:

https://doi.org/10.26438/ijcse/v5i11.187191

Keywords:

Feature selection, PSO, ACO, GA, Data mining

Abstract

Feature selection is a data pre-processing technique used chiefly for classification problems. It aims to identify a minimal reduct, a subset containing as few features as possible, without reducing the classification accuracy of the data set. The goal is to choose a minimal subset of features according to some reasonable criterion such that the original task can be performed equally well, if not better. By selecting such a minimal subset, irrelevant and redundant features are removed. Rough set theory is one technique that has been applied to feature selection. It is used to discover the essential relationships in noisy data, applying a discretization strategy so that continuous-valued attributes can be handled alongside discrete-valued ones. It rests on forming equivalence classes within the given data: all data tuples that are indiscernible with respect to the attributes describing the data belong to the same equivalence class. Several rough-set-based approaches, such as quick reduct, relative reduct, and entropy-based reduct, are able to identify a reduct set. This paper presents a survey of various feature selection methods and techniques along with their advantages and disadvantages.
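The rough-set machinery the abstract mentions (indiscernibility/equivalence classes, dependency degree, and the greedy quick reduct) can be sketched briefly. This is an illustrative Python sketch, not code from the surveyed papers; the function names and the toy data set are assumptions made for the example.

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices into equivalence (indiscernibility) classes
    over the given attribute indices: rows with identical values on
    `attrs` land in the same block."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def dependency(rows, attrs, decision):
    """Rough-set dependency degree gamma(attrs, decision): the fraction
    of rows whose equivalence class is consistent with respect to the
    decision attribute (the positive region)."""
    pos = 0
    for block in partition(rows, attrs):
        if len({rows[i][decision] for i in block}) == 1:
            pos += len(block)
    return pos / len(rows)

def quick_reduct(rows, cond_attrs, decision):
    """Greedy quick reduct: repeatedly add the attribute that most
    increases gamma until it matches gamma of the full attribute set."""
    target = dependency(rows, cond_attrs, decision)
    reduct = []
    while dependency(rows, reduct, decision) < target:
        best = max((a for a in cond_attrs if a not in reduct),
                   key=lambda a: dependency(rows, reduct + [a], decision))
        reduct.append(best)
    return reduct

# Toy decision table: columns 0-2 are conditional attributes,
# column 3 is the decision attribute.
rows = [(0, 0, 1, 'y'),
        (0, 1, 1, 'n'),
        (1, 0, 0, 'y'),
        (1, 1, 0, 'n')]
print(quick_reduct(rows, [0, 1, 2], 3))  # attribute 1 alone determines the decision
```

Here attribute 1 by itself yields equivalence classes that are pure with respect to the decision, so the quick reduct discards attributes 0 and 2 as redundant, which is exactly the reduct idea the abstract describes.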

References

[1] Bin Hu, Yongqiang Dai, Yun Su, Philip Moore, Xiaowei Zhang, Chengsheng Mao, Jing Chen, Lixin Xu, "Feature Selection for Optimized High-dimensional Biomedical Data using an Improved Shuffled Frog Leaping Algorithm", IEEE 1545-5963, 2016.

[2] Jiye Liang, Feng Wang, Chuangyin Dang, Yuhua Qian, "A Group Incremental Approach to Feature Selection Applying Rough Set Technique", IEEE Transactions on Knowledge and Data Engineering, Vol. 26, No. 2, February 2014.

[3] Xiaohui Lin, Huanhuan Song, Meng Fan, Weijie Ren, Lishuang Li, Weihong Yao, "The Feature Selection Algorithm Based on Feature Overlapping and Group", IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 2016.

[4] Guoqing Cui, Jie Yang, Masoumeh Zareapoor, Jiechen Wang, "Unsupervised Feature Selection Algorithm Based on Sparse Representation", 3rd International Conference on Systems and Informatics (ICSAI 2016), 2016.

[5] Hong Wang, Xingjian Jing, Ben Niu, "Bacterial-Inspired Feature Selection Algorithm and Its Application in Fault Diagnosis of Complex Structures", IEEE, 2016.

[6] Hossam M. Zawbaa, E. Emary, B. Parv, Marwa Sharawi, "Feature Selection Approach based on Moth-Flame Optimization Algorithm", IEEE Congress on Evolutionary Computation (CEC), 2016.

[7] Qian Guo, Yanpeng Qu, Ansheng Deng, Longzhi Yang, "A New Fuzzy-rough Feature Selection Algorithm for Mammographic Risk Analysis", 12th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD), 2016.

[8] Sun Jiongjiong, Liu Jun, Wei Xuguang, "Feature Selection Algorithm Based on SVM", 35th Chinese Control Conference, July 27-29, 2016.

[9] Chunyong Yin, Luyu Ma, Lu Feng, Jin Wang, Zhichao Yin, "A Hybrid Feature Selection Algorithm", 4th International Conference on Advanced Information Technology and Sensor Application, 2015.

[10] Kilho Shin, Tetsuji Kuboyama, Takako Hashimoto, "Super-CWC and Super-LCC: Super Fast Feature Selection Algorithms", IEEE International Conference on Big Data (Big Data), 2015.

[11] H. Hannah Inbarani, Ahmad Taher Azar, G. Jothi, "Supervised hybrid feature selection based on PSO and rough sets for medical diagnosis", Computer Methods and Programs in Biomedicine, 113 (2014), pp. 175-185.

Published

2025-11-12

How to Cite

[1]
S. Ramadass and M. Gunasekaran, “Review On Feature Selection Techniques in Data Mining”, Int. J. Comp. Sci. Eng., vol. 5, no. 11, pp. 187–191, Nov. 2025.