I hope you won’t stop reading once you find out that the idea proposed here involves automating the feedback provided to students on papers, projects, and presentations. If you were to look through a graded set of papers and list the comments offered as feedback, how many of those comments have you written more than once? Is the answer many? If so, you should read on.
The author proposing this idea points out that rubrics have expedited the grading process for many faculty and clarified expectations for students. But when the paper is returned, the student gets the rubric with a check next to the quality level attained and maybe a few brief remarks squeezed into the small space provided for comments. This assumes that students will look at their paper and see why it merited that particular quality rating, an assumption that is questionable given students’ skill levels and their motivation to attend to feedback.
What the author has done is create a large collection of detailed comments that he imports into the grading rubric. When papers are returned, he doesn’t show students all the levels; they saw those when the rubric was distributed at the time the assignment was made. Instead, students see the level their assignment earned, followed by a detailed set of comments explaining why it earned that level and how they can reach a higher one on the next assignment.
It may take a while to develop the collection of comments, but you can start using them before the collection is complete. The quality of these comments can be significantly higher than those we dash off after a full day of teaching, cleaning up the kitchen, and helping the kids with homework; they can be written and revised when we aren’t tired. Once the collection gets large enough, comments can be categorized, and any given comment may exist in several versions. The author categorizes according to the levels that appear on the rubric, so if an assignment meets the top criteria, he has a collection of top-level comments to draw from. He recommends storing the comments in an Excel spreadsheet.
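For readers curious about the mechanics, here is a minimal sketch, not the author’s own tool, of what retrieving comments by rubric level might look like in Python. The file name, the two-column layout (level, comment), and the level label are illustrative assumptions; the article specifies only that the comments live in an Excel spreadsheet organized by rubric level.

```python
from collections import defaultdict
from openpyxl import load_workbook  # third-party reader for .xlsx files

def load_comment_bank(path):
    """Read an assumed two-column sheet (rubric level, comment) into a
    dict mapping each level to its list of reusable comments."""
    bank = defaultdict(list)
    sheet = load_workbook(path).active
    for level, comment in sheet.iter_rows(min_row=2, values_only=True):
        if level and comment:
            bank[str(level).strip()].append(str(comment).strip())
    return bank

# Hypothetical file and level name: pull up every stored comment
# for the level a given paper earned.
bank = load_comment_bank("comments.xlsx")
for remark in bank.get("Exceeds expectations", []):
    print(remark)
```

With the bank loaded this way, grading a paper becomes a matter of picking the comments that fit rather than composing them from scratch.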
What if students figure out they are getting “canned” feedback? Many are already inclined not to pay much attention to our careful comments. Wouldn’t the fact that the comments weren’t written exclusively for them give them an excuse to ignore the feedback even more thoroughly? Technology makes it easy to personalize any comment. You can use the student’s name, insert an example pulled from their assignment, or treat the comment as a canned shell that you lightly revise each time you use it. All of a sudden, the feedback is personal. The author maintains that his students never figured out they were getting “canned” comments.
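One way to handle that personalization, sketched below under the assumption that each stored comment is written as a template with placeholders, is simple string substitution. The placeholder names, the sample wording, and the example sentence are all mine, not the author’s.

```python
def personalize(shell, name, example):
    """Fill a canned comment shell with details from one student's paper."""
    return shell.format(name=name, example=example)

# A hypothetical shell: the {name} and {example} slots get filled
# with this student's name and a sentence quoted from their draft.
shell = ("{name}, your thesis is clear, but notice how this sentence "
         "drifts away from it: \"{example}\" Tying each paragraph back "
         "to the claim would lift this to the next level.")

print(personalize(shell, "Jordan", "Rubrics also make grading faster."))
```

Because the quoted example comes straight from the student’s own draft, the rendered comment reads as individually written even though its shell is reused.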
This approach may not be for everyone, but with so much on our plates, we need to be open to time-saving possibilities. The author of the article referenced below was able to document some positive impacts on student work and attitudes with the system of automated comments he developed.
Reference: Czaplewski, A. J. (2009). Computer-assisted grading rubrics: Automating the process of providing comments and student feedback. Marketing Education Review, 19(1), 29-36.
Reprinted from “Expediting Feedback to Students,” The Teaching Professor, 25.4 (2011): 4.