M-CAFE 1.0: Motivating and Prioritizing Ongoing Student Feedback During MOOCs and Large on-Campus Courses using Collaborative Filtering
Author(s)
Zhou, Mo
Cliff, Alison
Krishnan, Sanjay
Nonnecke, Brandie
Crittenden, Camille
Uchino, Kanji
Goldberg, Ken
Keywords
Engineering
Physical Sciences and Mathematics
Social and Behavioral Sciences
Education
Course evaluation
Collaborative filtering
MOOC
Online Access
http://www.escholarship.org/uc/item/61c9d9tz
Abstract
During MOOCs and large on-campus courses with limited face-to-face interaction between students and instructors, assessing and improving teaching effectiveness is challenging. In a 2014 study on course-monitoring methods for MOOCs [30], qualitative (textual) input was found to be the most useful. Two challenges in collecting such input for ongoing course evaluation are ensuring student confidentiality and developing a platform that incentivizes and manages input from many students. To collect and manage ongoing ("just-in-time") student feedback while maintaining student confidentiality, we designed the MOOC Collaborative Assessment and Feedback Engine (M-CAFE 1.0). This mobile-friendly platform encourages students to check in weekly to numerically assess their own performance, provide textual ideas about how the course might be improved, and rate ideas suggested by other students. For instructors, M-CAFE 1.0 displays ongoing trends and highlights potentially valuable ideas based on collaborative filtering. We describe case studies with two edX MOOCs and one on-campus undergraduate course. This report summarizes data and system performance on over 500 textual ideas with over 8,000 ratings. Details at http://m-cafe.org.
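The abstract does not describe the specific collaborative filtering method M-CAFE 1.0 uses to surface valuable ideas. Purely as an illustration of the underlying task, the Python sketch below ranks student-submitted ideas from sparse peer ratings using a simple Bayesian-average score rather than collaborative filtering proper; the function names, the prior_weight parameter, and the sample data are hypothetical, not taken from the paper.

    # Illustrative sketch only (not the authors' algorithm): rank student ideas
    # from sparse peer ratings with a Bayesian average, so ideas rated by only a
    # few students are shrunk toward the global mean instead of dominating.
    from collections import defaultdict

    def rank_ideas(ratings, prior_weight=5.0):
        """ratings: iterable of (student_id, idea_id, score) tuples, score in [0, 1]."""
        totals = defaultdict(float)
        counts = defaultdict(int)
        for _student, idea, score in ratings:
            totals[idea] += score
            counts[idea] += 1

        # Global mean rating acts as the prior each idea is shrunk toward.
        global_mean = sum(totals.values()) / max(sum(counts.values()), 1)

        def bayes_avg(idea):
            return (totals[idea] + prior_weight * global_mean) / (counts[idea] + prior_weight)

        return sorted(counts, key=bayes_avg, reverse=True)

    if __name__ == "__main__":
        sample = [
            ("s1", "more office hours", 0.9),
            ("s2", "more office hours", 0.8),
            ("s3", "more office hours", 1.0),
            ("s1", "slower lecture pace", 1.0),  # single rating: pulled toward the mean
        ]
        print(rank_ideas(sample))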
Date
2015-09-01
Type
Article
Identifier
oai:qt61c9d9tz
http://www.escholarship.org/uc/item/61c9d9tz