While most faculty stick with the tried-and-true quiz and paper assessment strategies for their online courses, the wide range of technologies available today offers a variety of assessment options beyond the traditional forms. But what do students think of these different forms?
Scott Bailey, Stacy Hendricks, and Stephanie Applewhite of Stephen F. Austin State University experimented with different assessment strategies in two online courses in educational leadership, and surveyed students afterward on their impressions of each one. The students were asked to score the strategies on three criteria: 1) enjoyment, 2) engagement with the material, and 3) transferability of the knowledge gained to practice. The resulting scores allowed the investigators to rank the strategies from least to most preferred by students.
Interestingly, scores for the three criteria were remarkably consistent within each strategy: students who found an activity highly enjoyable generally also found it engaging and its knowledge highly transferable, and vice versa. Moreover, traditional forms of assessment tended to score near the bottom.
The rankings provide a guide for any faculty member looking to develop engaging online content. Below are the different strategies grouped from lowest to highest in preference.
Lowest
- Quizzes were by far the lowest-ranked assessments on the list. Very few students found the information transferable to other environments.
- Traditional papers between two and eight pages long scored higher than quizzes, but were still near the bottom of the list.
- Group projects were also ranked low on the list. Students were asked to collaborate on a series of tasks and submit a written paper on the outcome of their efforts. While faculty assign these projects to teach collaboration skills, students often see them as adding coordination work and inviting the free-rider problem.
Middle
- Audio recordings fell into a middle category. Students created recordings of themselves explaining course concepts as they would to parents, faculty, or other groups. While some students were initially hesitant about the technology, they quickly picked up the systems and generally enjoyed the activity.
- Open discussion involved students posting to the traditional course discussion forum on their LMS. Students generally valued open discussion, but it is important to structure it in a way that provides interesting and thought-provoking questions.
- Paired discussion was a variation on the traditional discussion in which students posted messages to course discussion boards in groups of two to five. Ratings for these small-group discussions were similar to those for course-wide discussions.
Highest
- Response to video was at the top of the list. Here students watched a video documentary and responded with a written analysis. Students found the video documentary inspiring and moving. They connected with it on a more emotional level than they would a reading, as it provided a real-world connection to the material. An instructor can find a wide range of excellent documentaries to include in courses from sites such as Free Documentaries TV.
- Twitter summaries came in as the second most preferred assessment activity. Students were required to summarize each chapter they read in a single tweet. The 140-character limit forced students to distill each chapter down to its central themes, an important skill for synthesizing the material.
- Screencasts were the next most highly ranked types of assessments. The students created mock presentations that they would give as new administrators to the faculty of a school. The screencasts included both the presentation material and a corner webcam video of the students themselves delivering the narration. Free systems such as Screencast-o-matic are ideal for creating screencasts that combine computer display with webcam videos. The basic format can be applied to a variety of subjects and assignments, such as students in a history course doing a mock presentation to a local historical society on a famous event.
- Field experiences involved students taking part in an experience related to the course content. This serves as a reminder that even online students can be given assignments that require some sort of fieldwork. It might be to catalog fauna in a local park for a biology class, or to report on local bridge structures for a civil engineering course.
- Interviews of local school administrators were also popular. Students interviewed two administrators and created a reflection on what they learned about the positions. Once again, these interviews connected the course material with practice.
- Work samples gave students a data set or scenario and asked them to produce a document similar to one they would create as practitioners in their field, such as a professional development plan for their faculty or a campus needs assessment. These assignments let students apply what they learned to a professional situation.
Themes
The student preferences suggest a few principles that can guide an instructor in choosing assessments for an online course. First, the mere fact that students were given something beyond the same old papers and quizzes created engagement; novelty itself can be a reason for choosing an assessment activity. Second, assignments that allow students to apply their knowledge to real or hypothetical scenarios are preferred over academic exercises that merely ask students to repeat what they know. These application activities are often called “authentic” assessments because they mimic how students will use their knowledge in the future. Third, an engaging assessment is often the result of engaging content: even good old-fashioned written assignments were viewed favorably when they were based on an engaging video.
Reference
Bailey, S., Hendricks, S., & Applewhite, S. (2015). Student perspectives of assessment strategies in online courses. Journal of Interactive Online Learning, 13(3), 112–125.
Reprinted from Online Classroom, 16.1 (2016): 1,4. © Magna Publications. All rights reserved.