USING RUBRICS AND CONTENT ANALYSIS FOR EVALUATING ONLINE DISCUSSION: A CASE STUDY FROM AN ENVIRONMENTAL COURSE
Abstract
This paper presents a case study of using course-specific rubrics combined with content analysis, together with instructor and student feedback, to assess learning via online discussion. Student feedback was gathered via Small Group Instructional Diagnosis, and instructor feedback was collected through formal interviews. Content analysis used emergent coding, with different assessment criteria for each phase of the online discussion. Student participation was high, and a number of students felt they learned beyond what was discussed in class. Some students, however, were overwhelmed by the large number of postings and by the repetitiveness of some phases of the discussion. The instructor was pleased to find that students who were quiet in class were active in the online discussion; however, he found that student contributions demonstrated insufficient reflection and critical thinking. Content analysis showed that students met, on average, 59-82% of the essential assessment criteria in their postings, and that their contributions improved significantly as the online discussion progressed. However, only a limited number of postings reflected critical thinking. The use of assessment criteria in online discussion is therefore recommended, as content analysis provided insight beyond student and instructor perceptions. The insights gleaned from the methodology indicate its usefulness for assessing online discussion activities more objectively and with respect to specific learning objectives.
Date: 2019-02-11
Type: info:eu-repo/semantics/article
URL: https://olj.onlinelearningconsortium.org/index.php/olj/article/view/1713
DOI: 10.24059/olj.v11i4.1713