Reusing Written Test Items

A case study based on an analysis of 132 five-option multiple-choice questions (MCQs, 528 distractors in total) developed at MUW and included in five in-house exams. The questions analyzed were repeated at least twice during the period considered and constituted 42.4% of all MCQs. Each MCQ was assessed using three indicators: the difficulty index (DI), the discrimination power (DP), and the number of non-functioning distractors (N-FD). Changes in the psychometric indicators of repeated test items were assessed with Krippendorff's alpha coefficient (αk).
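As an illustration of how these indicators are typically derived, here is a minimal sketch (not taken from the study) that computes DI, DP and N-FD for a single item from examinee responses. The 27% upper/lower group split for DP and the 5% response threshold for a "functioning" distractor are common conventions assumed for this sketch only.

```python
import numpy as np

def item_indicators(item_correct, item_choice, total_score, key,
                    options=("A", "B", "C", "D", "E"),
                    group_frac=0.27, distractor_cutoff=0.05):
    """DI, DP and N-FD for one five-option MCQ.

    item_correct -- 0/1 array, 1 if the examinee answered this item correctly
    item_choice  -- array of the option letter each examinee picked
    total_score  -- array of total exam scores (used to form DP groups)
    key          -- the correct option letter
    The 27% group split and the 5% functioning-distractor threshold are
    conventional choices assumed here, not values reported in the study.
    """
    item_correct = np.asarray(item_correct, dtype=float)
    n = item_correct.size

    # Difficulty index: proportion of examinees answering the item correctly.
    di = item_correct.mean()

    # Discrimination power: difference in proportion correct between the
    # upper and lower groups ranked by total exam score.
    k = max(1, int(round(group_frac * n)))
    order = np.argsort(total_score)
    lower, upper = order[:k], order[-k:]
    dp = item_correct[upper].mean() - item_correct[lower].mean()

    # Non-functioning distractors: wrong options chosen by fewer than
    # `distractor_cutoff` of all examinees.
    item_choice = np.asarray(item_choice)
    nfd = sum(
        1 for opt in options
        if opt != key and (item_choice == opt).mean() < distractor_cutoff
    )
    return di, dp, nfd
```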

With each repetition of an MCQ, the number of questions that maintained a DI value analogous to the initial level of easiness decreased. Even so, the level of DI agreement remained high over five consecutive repetitions (αk for the consecutive repetitions was 0.90, 0.85, 0.78 and 0.75). Agreement for N-FD in consecutive repetitions also remained at a satisfactory level (good to very good), although it dropped noticeably once an item had been repeated three or more times (αk of 0.80, 0.69, 0.66 and 0.65, respectively). By contrast, the agreement for DP across consecutive repetitions was markedly lower than that noted for DI and distractor efficiency (DE): the DP αk values were 0.28, 0.23, 0.25 and 0.10, respectively.
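For readers who want to reproduce this kind of agreement analysis, the sketch below implements Krippendorff's alpha for interval-level data with two observations per unit, where each "unit" is one repeated MCQ and the two observations are an indicator's value (e.g. DI) at the first administration and at a later repetition. This pairing, and the assumption of complete data, are simplifications for illustration and not a description of the study's exact computation.

```python
import numpy as np

def krippendorff_alpha_interval(values_a, values_b):
    """Krippendorff's alpha, interval metric, two observations per unit,
    no missing values. values_a and values_b are the indicator values of
    the same items at two administrations (an assumption of this sketch)."""
    a = np.asarray(values_a, dtype=float)
    b = np.asarray(values_b, dtype=float)

    # Observed disagreement: mean squared difference within each unit.
    d_observed = np.mean((a - b) ** 2)

    # Expected disagreement: mean squared difference over all ordered pairs
    # of values, pooling both observations of every unit.
    pooled = np.concatenate([a, b])
    diffs = pooled[:, None] - pooled[None, :]
    n_values = pooled.size
    d_expected = (diffs ** 2).sum() / (n_values * (n_values - 1))

    return 1.0 - d_observed / d_expected

# Hypothetical example: DI of the same items at two administrations.
di_first  = [0.62, 0.71, 0.55, 0.80, 0.48]
di_second = [0.65, 0.74, 0.50, 0.83, 0.52]
print(krippendorff_alpha_interval(di_first, di_second))
```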
