Designing a Classifier Using Unsupervised Learning and Rough Set Theory

Authors

  • Gurusamy V, Department of Computer Applications, School of IT, Madurai Kamaraj University, Madurai, India
  • K Nandhini, Technical Support Engineer, Concentrix India Pvt Ltd, Chennai, India

DOI:

https://doi.org/10.26438/ijcse/v5i10.226230

Keywords:

Inconsistency, Rough Set, Unsupervised Neural Network

Abstract

Datasets collected from multiple sources are often inconsistent, producing different decision labels for the same conditional attribute values. A method for handling this inconsistency is proposed here using a Kohonen self-organizing neural network, an unsupervised learning approach. After the inconsistency is removed, minimal subsets of attributes, called reducts, are selected using Rough Set Theory, which effectively reduces the dimensionality of the dataset. Unlike most existing reduct-generation algorithms, which examine all attributes, the proposed method does not require evaluating every attribute, and its time complexity is therefore considerably improved. In the next step, taking the core attribute as the root node of a decision tree, all possible rules are generated and then pruned based on information entropy and the coverage of the rule set. The classifier built from the reduced rule set demonstrates results comparable to a classifier that uses all attributes.
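
For illustration only, the sketch below (not the authors' implementation) shows the rough-set computations the abstract relies on: the dependency degree gamma, the core, and a greedy reduct over a small toy decision table. All function names and the toy data are assumptions made for this example; the SOM-based inconsistency removal and the entropy/coverage rule pruning described above are omitted. Starting the greedy search from the core loosely mirrors the abstract's use of the core attribute as the starting point of the decision tree.

from collections import defaultdict

def partition(rows, attrs):
    # Group row indices into equivalence classes by their values on `attrs`.
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return blocks.values()

def positive_region(rows, decisions, attrs):
    # Rows whose equivalence class (w.r.t. attrs) carries a single decision label.
    pos = set()
    for block in partition(rows, attrs):
        if len({decisions[i] for i in block}) == 1:
            pos.update(block)
    return pos

def dependency(rows, decisions, attrs):
    # gamma(attrs) = |POS_attrs(d)| / |U|
    return len(positive_region(rows, decisions, attrs)) / len(rows)

def core(rows, decisions, all_attrs):
    # Attributes whose removal lowers the dependency degree of the full set.
    gamma_full = dependency(rows, decisions, all_attrs)
    return [a for a in all_attrs
            if dependency(rows, decisions, [b for b in all_attrs if b != a]) < gamma_full]

def greedy_reduct(rows, decisions, all_attrs):
    # Start from the core and add the attribute that raises gamma most,
    # until the dependency of the full attribute set is reached.
    reduct = core(rows, decisions, all_attrs)
    gamma_full = dependency(rows, decisions, all_attrs)
    while dependency(rows, decisions, reduct) < gamma_full:
        best = max((a for a in all_attrs if a not in reduct),
                   key=lambda a: dependency(rows, decisions, reduct + [a]))
        reduct.append(best)
    return reduct

# Toy decision table (assumption for this example): three conditional
# attributes (indices 0, 1, 2) and one decision label per row.
U = [(1, 0, 1), (1, 0, 1), (0, 1, 1), (0, 1, 0), (1, 1, 0)]
d = ['yes', 'yes', 'no', 'no', 'yes']
attrs = [0, 1, 2]

print("core  :", core(U, d, attrs))          # -> [0]
print("reduct:", greedy_reduct(U, d, attrs)) # -> [0]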

Published

2025-11-12

How to Cite

[1] V. Gurusamy and K. Nandhini, “Designing a Classifier Using Unsupervised Learning and Rough Set Theory”, Int. J. Comp. Sci. Eng., vol. 5, no. 10, pp. 226–230, Nov. 2025.

Issue

Vol. 5, No. 10

Section

Research Article