Student Performance Prediction Using A Cascaded Bi-level Feature Selection Approach
DOI: https://doi.org/10.30564/jcsr.v3i3.3534
Abstract: Features in educational data are often ambiguous, which leads to noisy features and the curse of dimensionality. These problems are addressed via feature selection. Existing feature selection models were built using single-level embedded, wrapper-based, or filter-based methods. However, single-level filter-based methods ignore feature dependencies and do not interact with the classifier. Embedded and wrapper-based feature selection methods do interact with the classifier, but they can only select the optimal subset for a particular classifier, so their selected features may perform worse with other classifiers. Hence, this research proposes a robust Cascade Bi-Level (CBL) feature selection technique for student performance prediction that minimizes the limitations of using a single-level technique. The proposed CBL technique applies the Relief technique at the first level and Particle Swarm Optimization (PSO) at the second level. The proposed technique was evaluated on the UCI student performance dataset. Compared with single-level feature selection, the proposed technique achieved an accuracy of 94.94% on the binary classification task, better than the 93.67% achieved by single-level PSO. These results show that CBL can effectively predict student performance.
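The cascade described in the abstract can be sketched as two stages: a Relief-style filter scores and prunes features first, then a PSO wrapper searches subsets of the survivors against a classifier. The sketch below is an illustrative assumption, not the paper's exact configuration: it uses a small synthetic dataset, a simplified ReliefF scorer, a thresholded continuous PSO (rather than the sigmoid binary variant), a k-NN classifier as the wrapped model, and arbitrary settings (top-10 filter cutoff, 10 particles, 15 iterations).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for the student performance data (assumption).
X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           n_redundant=2, random_state=0)

def relieff_scores(X, y, n_neighbors=5):
    """Simplified ReliefF: reward features that differ between classes
    (nearest misses) and agree within a class (nearest hits)."""
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0) + 1e-12
    scores = np.zeros(d)
    for i in range(n):
        diff = np.abs(X - X[i]) / span            # per-feature normalised distance
        dist = diff.sum(axis=1)
        dist[i] = np.inf                          # exclude the instance itself
        same, other = np.where(y == y[i])[0], np.where(y != y[i])[0]
        hits = same[np.argsort(dist[same])[:n_neighbors]]
        misses = other[np.argsort(dist[other])[:n_neighbors]]
        scores += diff[misses].mean(axis=0) - diff[hits].mean(axis=0)
    return scores / n

# Level 1 (filter): keep the top-10 Relief-scored features.
keep = np.argsort(relieff_scores(X, y))[-10:]
Xf = X[:, keep]

def fitness(mask):
    """Wrapper fitness: cross-validated k-NN accuracy on the masked subset."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(KNeighborsClassifier(n_neighbors=3),
                           Xf[:, mask], y, cv=3).mean()

# Level 2 (wrapper): PSO over the filtered features; positions in [0,1]
# are thresholded at 0.5 to form a binary feature mask.
n_p, d = 10, Xf.shape[1]
pos = rng.random((n_p, d))
vel = rng.normal(scale=0.1, size=(n_p, d))
pbest, pbest_fit = pos.copy(), np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()
for _ in range(15):
    r1, r2 = rng.random((2, n_p, d))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    fit = np.array([fitness(p > 0.5) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()].copy()

best_mask = gbest > 0.5
print("selected", int(best_mask.sum()), "of", d,
      "filtered features, cv accuracy %.3f" % fitness(best_mask))
```

The filter stage keeps the wrapper search space small, which is the practical point of cascading: PSO only explores subsets of features that already passed the cheap Relief screen.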