ITEM ANALYSIS FOR TEST TO EXAMINE THE EFFECT OF E-MODULE ON THE ACADEMIC PERFORMANCE OF 7TH CLASS SCIENCE STUDENTS IN ISLAMABAD


Arshad Mehmood Qamar, Dr. Wajiha Kanwal, Hamid Ali Nadeem

Abstract

The main aim of this research was to investigate the quantitative characteristics (difficulty level, discrimination index, and distractor efficiency) of a research tool covering the first four units of 7th class general science, to be used as the pretest in a study on the effect of e-modules on the academic performance of 7th class general science students in Islamabad. The researchers collected data from 210 randomly selected students from both urban and rural schools of Islamabad Capital Territory. Following the extreme-groups procedure, the top 33% of scorers (70 high achievers) and the bottom 33% (70 low achievers) were selected, so the difficulty level (Diff I) and discrimination index (DI) were computed on the responses of these 140 students, while the distractor analysis (DE) was carried out on the data of all 210 students. Descriptive statistics were used to analyse the data. On the basis of the difficulty-level and discrimination-index analyses, two test items (16 and 17) were eliminated from the tool because of very low Diff I (<29) and DI (<20) values. Four items were difficult and were revised and improved, 17 items were moderately difficult and were retained, and only one item was easy. The discrimination index of 22 items was satisfactory, and these items were retained after minor improvements. Distractor analysis determines the functionality of distractors: distractors that few students or respondents select are considered for revision. In this research, only one distractor had a DE value below 5%, and three distractors had DE values below 7%. It was recommended that the distractor with a DE value below 5% be eliminated and replaced, and that the distractors with DE values below 7% be revised and improved. It was also found that, except for the two eliminated test items, all items were valid and reliable. It was further concluded that item analysis is a very useful technique for establishing the reliability and validity of a tool.
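
To make the three indices concrete, the short Python sketch below computes them under the study's design (70 high achievers, 70 low achievers, 210 examinees in total). The function names, the worked item numbers, and the 0-1 scale used for DI are illustrative assumptions, not the authors' actual computation.

# A minimal sketch of the classical item-analysis indices described above,
# assuming dichotomously scored multiple-choice responses. Group sizes follow
# the study design; all worked numbers are hypothetical.

def difficulty_index(high_correct: int, low_correct: int, group_size: int = 70) -> float:
    """Percentage of the combined high/low groups answering the item correctly.
    Roughly: below 30% is difficult, 30-70% moderate, above 70% easy."""
    return 100 * (high_correct + low_correct) / (2 * group_size)

def discrimination_index(high_correct: int, low_correct: int, group_size: int = 70) -> float:
    """Difference between the high and low groups' proportions correct (0-1 scale).
    Items below about 0.20 discriminate poorly and are removal candidates."""
    return (high_correct - low_correct) / group_size

def distractor_efficiency(option_counts: dict[str, int], key: str, n_students: int = 210) -> dict[str, float]:
    """Percentage of all examinees choosing each wrong option (distractor).
    A distractor chosen by fewer than 5% of students is non-functional."""
    return {opt: round(100 * n / n_students, 1)
            for opt, n in option_counts.items() if opt != key}

# Hypothetical item: 48 high and 20 low achievers answer correctly;
# option counts over all 210 students, with 'B' as the keyed answer.
print(round(difficulty_index(48, 20), 1))      # 48.6 -> moderately difficult
print(round(discrimination_index(48, 20), 2))  # 0.4  -> acceptable discriminator
print(distractor_efficiency({"A": 40, "B": 110, "C": 52, "D": 8}, key="B"))
# {'A': 19.0, 'C': 24.8, 'D': 3.8} -> option 'D' falls below the 5% cut-off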

