Analyzing Machine Learning Algorithms for Predicting the Accuracy of Meteorological Data
DOI:
https://doi.org/10.26438/ijcse/v6i10.895899
Keywords:
Random Forest, C4.5, C4.5 with Bootstrap Aggregation, Meteorological Data, Accuracy, Time Efficiency
Abstract
Meteorological data analysis in the form of data mining is concerned with extracting knowledge that can be used to predict weather conditions. Making accurate predictions is one of the challenges meteorologists face in surveying weather conditions efficiently, and decision tree algorithms are well suited to analyzing meteorological behaviour. This work evaluates three decision tree algorithms, Random Forest, C4.5, and C4.5 with bootstrap aggregation, comparing their classification accuracy and time efficiency when operating on trained weather data from selected locations. The locations are chosen according to monsoon conditions across India.
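The sketch below illustrates this kind of comparison. It assumes scikit-learn and a hypothetical weather_observations.csv of labelled observations; scikit-learn's decision tree implements CART rather than C4.5, so an entropy-based tree is used here only as a stand-in, and the bagging ensemble plays the role of C4.5 with bootstrap aggregation. It is a minimal illustration of the evaluation setup, not the authors' implementation.

```python
# Minimal sketch: compare Random Forest, a single entropy-based decision tree
# (stand-in for C4.5), and a bagged decision tree on accuracy and training time.
import time

import pandas as pd
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical file: meteorological features plus a "condition" class label.
data = pd.read_csv("weather_observations.csv")
X = data.drop(columns=["condition"])
y = data["condition"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=42),
    "Decision tree (C4.5-like)": DecisionTreeClassifier(
        criterion="entropy", random_state=42
    ),
    # "estimator" is the keyword in scikit-learn >= 1.2 (older versions use "base_estimator").
    "Decision tree with bagging": BaggingClassifier(
        estimator=DecisionTreeClassifier(criterion="entropy", random_state=42),
        n_estimators=100,
        random_state=42,
    ),
}

for name, model in models.items():
    start = time.perf_counter()
    model.fit(X_train, y_train)              # training time -> "time efficiency"
    elapsed = time.perf_counter() - start
    accuracy = model.score(X_test, y_test)   # held-out classification accuracy
    print(f"{name}: accuracy={accuracy:.3f}, training time={elapsed:.2f}s")
```

Reporting both held-out accuracy and wall-clock training time for each model mirrors the two criteria (accuracy and time efficiency) on which the paper compares the algorithms.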
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors contributing to this journal agree to publish their articles under the Creative Commons Attribution 4.0 International License, allowing third parties to share their work (copy, distribute, transmit) and to adapt it, under the condition that the authors are given credit and that in the event of reuse or distribution, the terms of this license are made clear.
