Refining Student Marks based on Enrolled Modules’ Assessment Methods using Data Mining Techniques

Authors

  • M. A. Alsuwaiket Department of Computer Science and Engineering Technology, Hafar Al Batin University, Saudi Arabia
  • A. H. Blasi Department of Computer Information Systems, Mutah University, Jordan
  • K. Altarawneh Department of Computer Science, Mutah University, Jordan
Volume: 10 | Issue: 1 | Pages: 5205-5210 | February 2020 | https://doi.org/10.48084/etasr.3284

Abstract

Choosing the right and effective way to assess students is one of the most important tasks in higher education. Many studies have shown that students tend to receive higher marks in modules that are assessed entirely by coursework, or by a combination of coursework and exams, than in modules assessed by exams alone. Many Educational Data Mining (EDM) studies pre-process data through traditional data mining preparation techniques. In this paper, we propose a different data preparation process by investigating more than 230,000 student records in order to prepare students’ marks. The data were processed through several stages in order to extract a categorical factor through which students’ module marks are refined during the data preparation stage. The results of this work show that students’ final marks should not be isolated from the nature of the enrolled modules’ assessment methods; rather, these assessment methods must be investigated thoroughly and considered during EDM’s data pre-processing stage. More generally, educational data should not be prepared in the same way as ordinary data, due to differences in data sources, applications, and error types. The effect of the Module Assessment Index (MAI) on the prediction process using the Random Forest and Naive Bayes classification techniques was investigated. It was shown that considering MAI as an attribute increases the accuracy of predicting students’ second-year averages based on their first-year averages.
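The comparison described above can be sketched as follows, assuming a scikit-learn setup: Random Forest and Naive Bayes classifiers predict a banded second-year average from the first-year average, once without and once with an MAI attribute. The dataset is synthetic, and the column names, MAI encoding (exam only / mixed / coursework only), and grade bands are illustrative assumptions, not the paper's actual data or features.

```python
# Illustrative sketch only: the paper does not publish its dataset or code, so the
# feature names, MAI encoding, and grade banding below are assumptions for demonstration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000

# Hypothetical student records: first-year average (0-100) and a categorical
# Module Assessment Index describing how the enrolled modules were assessed
# (0 = exam only, 1 = mixed, 2 = coursework only).
first_year_avg = rng.uniform(30, 95, n)
mai = rng.integers(0, 3, n)

# Simulated second-year average, loosely dependent on both attributes,
# then discretised into bands so it can serve as a classification target.
second_year_avg = first_year_avg + 3 * mai + rng.normal(0, 8, n)
y = np.digitize(second_year_avg, bins=[40, 50, 60, 70])

X_without_mai = first_year_avg.reshape(-1, 1)
X_with_mai = np.column_stack([first_year_avg, mai])

for name, model in [("Random Forest", RandomForestClassifier(random_state=0)),
                    ("Naive Bayes", GaussianNB())]:
    for label, X in [("without MAI", X_without_mai), ("with MAI", X_with_mai)]:
        acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
        print(f"{name} {label}: mean accuracy = {acc:.3f}")
```

On this synthetic data the MAI column carries genuine signal by construction, so both classifiers score higher with it; the paper reports the analogous comparison on real student records.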

Keywords:

EDM, data mining, machine learning, Naïve Bayes, random forest, module assessment


References

J. T. E. Richardson, “Coursework versus examinations in end-of-module assessment: a literature review”, Assessment & Evaluation in Higher Education, Vol. 40, No. 3, pp. 439-455, 2015 DOI: https://doi.org/10.1080/02602938.2014.919628

P. Bridges, A. Cooper, P. Evanson, C. Haines, D. Jenkins, D. Scurry, H. Woolf, M. Yorke, “Coursework marks high, examination marks low: Discuss”, Assessment & Evaluation in Higher Education, Vol. 27, No. 1, pp. 35-48, 2002 DOI: https://doi.org/10.1080/02602930120105045

J. Heywood, Assessment in higher education: Student learning, teaching programmes and institutions, Jessica Kingsley, 2000

G. Gibbs, C. Simpson, “Conditions under which assessment supports students’ learning”, Journal of Learning and Teaching in Higher Education, Vol. 1, pp. 3-31, 2005

M. Alsuwaiket, A. H. Blasi, R. A. Al-Msie'deen, “Formulating module assessment for improved academic performance predictability in higher education”, Engineering, Technology & Applied Science Research, Vol. 9, No. 3, pp. 4287-4291, 2019 DOI: https://doi.org/10.48084/etasr.2794

C. Romero, S. Ventura, M. Pechenizkiy, R. S. J. D. Baker, Handbook of educational data mining, CRC Press, 2010 DOI: https://doi.org/10.1201/b10274

C. Romero, S. Ventura, “Educational data mining: A survey from 1995 to 2005”, Expert Systems with Applications, Vol. 33, pp. 135-146, 2007 DOI: https://doi.org/10.1016/j.eswa.2006.04.005

R. S. J. D. Baker, K. Yacef, “The state of educational data mining in 2009: A review and future visions”, Journal of Educational Data Mining, Vol. 1, pp. 3-17, 2009

The Dearing Report, Higher education in the learning society: Main report, Crown, 1997

G. Gibbs, “Using assessment strategically to change the way students learn”, in: Assessment matters in higher education, McGraw-Hill Education, 1999

P. E. Morris, C. O. Fritz, “Conscientiousness and procrastination predict academic coursework marks rather than examination performance”, Learning and Individual Differences, Vol. 39, pp. 193-198, 2015 DOI: https://doi.org/10.1016/j.lindif.2015.03.007

W. Hamalainen, M. Vinni, “Classifiers for educational data mining”, in: Handbook of educational data mining, CRC Press, 2011

J. M. Hellerstein, Quantitative data cleaning for large databases, United Nations Economic Commission for Europe, 2008

H. Hsu, P. A. Lachenbruch, “Paired t Test”, in: Wiley Encyclopedia of Clinical Trials, John Wiley & Sons, 2007 DOI: https://doi.org/10.1002/9780471462422.eoct969

B. K. Bhardwaj, S. Pal, “Data mining: A prediction for performance improvement using classification”, available at: https://arxiv.org/ftp/arxiv/papers/1201/1201.3418.pdf, 2012

M. Alghobiri, “A comparative analysis of classification algorithms on diverse datasets”, Engineering, Technology & Applied Science Research, Vol. 8, No. 2, pp. 2790-2795, 2018 DOI: https://doi.org/10.48084/etasr.1952

S. Kotsiantis, K. Patriarcheas, M. Xenos, “A combinational incremental ensemble of classifiers as a technique for predicting students’ performance in distance education”, Knowledge-Based Systems, Vol. 23, No. 6, pp. 529-535, 2010 DOI: https://doi.org/10.1016/j.knosys.2010.03.010


How to Cite

[1] Alsuwaiket, M.A., Blasi, A.H. and Altarawneh, K. 2020. Refining Student Marks based on Enrolled Modules’ Assessment Methods using Data Mining Techniques. Engineering, Technology & Applied Science Research. 10, 1 (Feb. 2020), 5205–5210. DOI: https://doi.org/10.48084/etasr.3284.

Metrics

Abstract Views: 742
PDF Downloads: 434
