Improved Genetic Particle Swarm Optimization and Feature Subset Selection for Extreme Learning Machine
DOI: https://doi.org/10.26438/ijcse/v6si1.4854

Keywords: Feature Subset Selection Problem, Pattern Classification Problem, Particle Swarm Optimization, Extreme Learning Machine

Abstract
Particle Swarm Optimization (PSO) is a heuristic global optimization method that is widely used for the feature subset selection problem. However, PSO requires the number of features to select as a fixed input, and it is difficult to determine in advance how many relevant, non-redundant features a given dataset contains. To address this problem, this paper proposes an Improved Genetic PSO (IG-PSO) algorithm for the Extreme Learning Machine (ELM) that returns both the optimal features and the optimal number of features. IG-PSO is evaluated on six benchmark medical classification datasets, where the selected feature subsets improve classification accuracy. The simulation results also demonstrate that the IG-PSO algorithm can handle optimization, dimensionality reduction, and supervised binary classification problems.
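The general idea described in the abstract, a swarm-based search over feature subsets whose fitness is the accuracy of an ELM trained on the candidate subset, can be illustrated with a minimal sketch. The snippet below is not the authors' IG-PSO (it omits the genetic operators and any multilevel parameter tuning); it uses a plain binary PSO, a basic single-hidden-layer ELM, synthetic data in place of the UCI medical datasets, and assumed parameter values, relying only on NumPy.

```python
# Hedged sketch: binary-PSO feature subset selection wrapped around a basic ELM.
# NOT the paper's IG-PSO; parameters, penalty weight, and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)


def elm_train_predict(X_tr, y_tr, X_te, n_hidden=50):
    """Train a basic ELM (random hidden layer + least-squares output) and predict."""
    d = X_tr.shape[1]
    W = rng.normal(size=(d, n_hidden))             # random input weights (never tuned)
    b = rng.normal(size=n_hidden)                  # random hidden biases
    H_tr = 1.0 / (1.0 + np.exp(-(X_tr @ W + b)))   # sigmoid hidden activations
    beta = np.linalg.pinv(H_tr) @ y_tr             # analytic output weights
    H_te = 1.0 / (1.0 + np.exp(-(X_te @ W + b)))
    return (H_te @ beta > 0.5).astype(int)         # binary decision


def fitness(mask, X_tr, y_tr, X_va, y_va, alpha=0.05):
    """ELM accuracy on the selected features, minus a small penalty on subset size."""
    if mask.sum() == 0:                            # an empty subset is invalid
        return 0.0
    pred = elm_train_predict(X_tr[:, mask == 1], y_tr, X_va[:, mask == 1])
    acc = (pred == y_va).mean()
    return acc - alpha * mask.mean()               # favours smaller subsets


def binary_pso_feature_selection(X_tr, y_tr, X_va, y_va,
                                 n_particles=20, n_iter=30, w=0.7, c1=1.5, c2=1.5):
    d = X_tr.shape[1]
    vel = rng.normal(scale=0.1, size=(n_particles, d))
    pos = (rng.random((n_particles, d)) < 0.5).astype(int)
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X_tr, y_tr, X_va, y_va) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((n_particles, d)), rng.random((n_particles, d))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # sigmoid of velocity gives the probability of selecting each feature
        pos = (rng.random((n_particles, d)) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
        fit = np.array([fitness(p, X_tr, y_tr, X_va, y_va) for p in pos])
        improved = fit > pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest                                   # the mask itself encodes how many features


# Toy usage on synthetic data (stand-in for the benchmark medical datasets).
X = rng.normal(size=(300, 20))
y = (X[:, 0] + X[:, 3] - X[:, 7] > 0).astype(int)  # only three features are informative
best_mask = binary_pso_feature_selection(X[:200], y[:200], X[200:], y[200:])
print("selected features:", np.flatnonzero(best_mask))
```

The small penalty on subset size in the fitness function is one simple way to let the search itself decide how many features to keep, which is the gap in standard fixed-size PSO feature selection that IG-PSO is designed to close.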