Developing an instrument for evaluating the quality of E-Learning courses for nursing education
Abstract

The aim of this study was to develop an instrument for evaluating the quality of e-learning courses for nursing education. To achieve this aim, the following questions were answered: (1) What e-learning quality assurance tools exist? (2) Are these existing tools suitable for directly evaluating the quality of e-learning courses for nursing education? (3) What elements of e-learning courses for nursing education should be measured? (4) What questions should be asked to measure each element? and (5) How can the validity of the measurement instrument be tested? Colton and Covert’s (2007) framework of instrument development was selected to support the aim of this study. Thirty elements that should be measured to evaluate the quality of an e-learning course for nursing education were identified through a comprehensive literature review and dimensional analysis. These elements were: motivation, computer skills, time issues, inexperience with e-learning, accessibility, self-paced study, written communication, computer equipment and Internet connection, privacy issues, qualification, accessibility, responsiveness, students per teacher, accuracy, update, relevancy, content presentation, the format of text, pedagogical format, interaction, discussion, navigation, technical support, self-help service, advising and counseling service, download and printout service, technology requirements, time zone issues, and security. These elements were categorized into the six dimensions of the framework of e-learning quality assurance for nursing education: learner, teacher, content, delivery mode, service, and technology. Based on this framework, a questionnaire instrument for measuring these quality elements was developed. To ensure that the instrument provides credible and accurate information, it was tested for face validity and content validity through consultation with ten experts.
However, the instrument should not be employed until further validity and reliability testing has been conducted.