D2L - Quiz Item Analysis (UW-Madison)
UW-Madison is adopting Canvas as the single, centrally supported learning management system, and discontinuing support for Desire2Learn (D2L) and Moodle. Access to D2L and Moodle will permanently end June 1, 2018. For information about retaining materials from D2L or Moodle, please refer to this document.
Instructors can use the quiz reports function in D2L to gauge the difficulty of multiple choice/select questions and to see which answer options served as effective distracters (plausible incorrect answers).
To generate the question statistics report, click the statistics icon from the "action menu" dropdown. Select the desired tab to view graphical reports, and click Export to CSV file for a download of the data.
The D2L Question Statistics report displays average scores for the quiz as a whole as well as for individual questions. The question statistics section of this report shows the percentage of students who answered a particular question correctly (also known as a difficulty index).
Questions with a difficulty index of around .5 are typically considered ideal, as they create the greatest distribution of scores. Questions with an index below .2 may be too difficult, while an index of .7 or higher may be desirable if mastery of a topic is the goal. Questions that fall outside the desired difficulty range may need revision.
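The difficulty index described above is simply the fraction of students who answered an item correctly. As a rough sketch (the data and function names below are hypothetical, not part of D2L's export), the calculation and the flagging thresholds mentioned above look like this:

```python
# Difficulty index: the fraction of students who answered an item correctly.
# The response list below is hypothetical; in practice these values would
# come from the exported CSV report.

def difficulty_index(responses):
    """Fraction of correct responses (1 = correct, 0 = incorrect)."""
    return sum(responses) / len(responses)

def flag_question(p, mastery=False):
    """Flag a question whose difficulty falls outside the typical bounds.

    Below .2 may be too difficult; above .7 may be too easy unless
    mastery of the topic is the goal.
    """
    if p < 0.2:
        return "too difficult"
    if p > 0.7 and not mastery:
        return "too easy"
    return "ok"

answers = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # 10 students, 7 correct
p = difficulty_index(answers)
print(p)                  # 0.7
print(flag_question(p))   # "ok" -- .7 sits at the upper boundary
```

Applying a check like this to every question in the exported CSV is one way to surface items that may need revision.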
Note: exporting to a CSV file and sorting the "%" column in Excel can make these data easier to interpret by highlighting questions that fall outside the acceptable boundaries.
The question detail report is a visual summary of how students answered each question. From this view it is possible to roughly estimate which answer options were effective distracters, and whether any incorrect options were selected unacceptably often. Note that this is not the same as an "item discrimination" analysis, which requires further calculation not supported in D2L. A full "item discrimination" analysis would ensure that students who generally do well on a test are more likely to select the correct answer, and that students who do poorly are more likely to select an incorrect answer.
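Since D2L does not compute item discrimination, one common way to calculate it by hand is the upper-lower group method: compare how often the top-scoring and bottom-scoring students (conventionally the top and bottom 27%) answered the item correctly. The sketch below assumes you have paired each student's total quiz score with their correctness on one item; the function name and sample data are illustrative, not taken from D2L.

```python
# Item discrimination via the upper-lower group method, a calculation
# D2L does not provide. Each record pairs a student's total quiz score
# with whether they answered this item correctly (1 or 0).

def discrimination_index(records, group_frac=0.27):
    """D = (correct in upper group - correct in lower group) / group size.

    Ranges from -1 to 1; a positive D means high scorers were more
    likely than low scorers to answer the item correctly.
    """
    ranked = sorted(records, key=lambda r: r[0], reverse=True)
    n = max(1, round(len(ranked) * group_frac))
    upper = sum(correct for _, correct in ranked[:n])
    lower = sum(correct for _, correct in ranked[-n:])
    return (upper - lower) / n

# Hypothetical class of 10: high scorers got the item right,
# low scorers got it wrong, so the item discriminates well.
records = [(95, 1), (90, 1), (85, 1), (80, 1), (70, 0),
           (65, 1), (60, 0), (55, 0), (50, 0), (40, 0)]
print(discrimination_index(records))  # 1.0
```

A D near 0 (or negative) suggests the item does not separate stronger from weaker students and may warrant revision regardless of its difficulty index.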
For every multiple choice/select question, a bar graph and percentage are given for the number of times each answer option was selected. This shows the distribution of students' answers and may reveal which options served as effective distracters. It can also indicate topics that warrant more discussion to increase student understanding.
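The per-option percentages behind those bar graphs are a simple tally. As a sketch (the option labels and responses are hypothetical), the distribution for one question can be computed like this:

```python
# Tally how often each answer option was chosen for one question,
# mirroring the percentages shown in the question detail report.
# The responses below are hypothetical.
from collections import Counter

def answer_distribution(responses):
    """Return {option: percent of students who selected it}."""
    counts = Counter(responses)
    total = len(responses)
    return {opt: 100 * counts[opt] / total for opt in sorted(counts)}

responses = ["A", "C", "A", "B", "A", "C", "A", "D", "A", "C"]
print(answer_distribution(responses))
# {'A': 50.0, 'B': 10.0, 'C': 30.0, 'D': 10.0}
```

Here, if "A" is the key, option "C" at 30% is acting as a strong distracter, while "B" and "D" are drawing few responses and may not be pulling their weight.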
For more advanced item analysis options, please contact the DoIT Helpdesk for support.