Evolution of Machine Learning Methods for Mammography Classification
DOI: https://doi.org/10.26438/ijcse/v6i3.499502
Keywords: Deep learning, Machine Learning, Revisiting SVM, SVM
Abstract
In the healthcare and biomedical sectors, data volumes are growing rapidly, and analysing such medical data accurately benefits disease detection and early diagnosis. Mammography is the process of using low-energy X-rays to examine the human breast for diagnosis and screening. The objective of mammography is the early detection of breast cancer, ordinarily through recognition of characteristic masses or microcalcifications. The low positive predictive value of mammograms leads to many unnecessary biopsies with benign outcomes, so the accuracy and reliability of the prediction mechanism are important for reducing the number of biopsies. In this paper, we examine different machine learning algorithms with the goal of comparing their prediction accuracy. From this comparison, it is concluded that the deep learning algorithm and Revisiting SVM achieve the highest prediction accuracy among the algorithms studied. Experimental results show that this prediction approach is effective.
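A minimal sketch of the kind of accuracy comparison the abstract describes is shown below. It assumes the UCI Mammographic Mass dataset saved locally as "mammographic_masses.data" and a small set of scikit-learn classifiers standing in for the methods compared; the file name, feature columns, model choices, and hyperparameters are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch: compare cross-validated accuracy of several classifiers on
# mammographic mass features (assumed dataset and settings, not the paper's).
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Assumed column layout of the UCI Mammographic Mass data file.
cols = ["birads", "age", "shape", "margin", "density", "severity"]
df = pd.read_csv("mammographic_masses.data", names=cols, na_values="?").dropna()
X = df[cols[:-1]].values
y = df["severity"].astype(int).values  # 0 = benign, 1 = malignant

models = {
    "SVM (RBF kernel)": SVC(kernel="rbf", C=1.0, gamma="scale"),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "k-nearest neighbours": KNeighborsClassifier(n_neighbors=7),
    "Neural network (MLP)": MLPClassifier(hidden_layer_sizes=(32, 16),
                                          max_iter=2000, random_state=0),
}

for name, clf in models.items():
    # Scale features so margin- and distance-based models are comparable.
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=10, scoring="accuracy")
    print(f"{name:>22}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Reporting mean and standard deviation over 10-fold cross-validation keeps the comparison on equal footing across models; swapping in other classifiers or a deeper network follows the same pattern.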
License

This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors contributing to this journal agree to publish their articles under the Creative Commons Attribution 4.0 International License, allowing third parties to share their work (copy, distribute, transmit) and to adapt it, under the condition that the authors are given credit and that in the event of reuse or distribution, the terms of this license are made clear.
