The goal of any well-constructed test is to measure students’ expertise on a topic, not their test-taking skills. We need to eliminate as many flaws in our questions as we can to “provide a level playing field for testwise and not-so-testwise students. The probability of answering a question correctly should relate to an examinee’s expertise on the topic and should not relate to their expertise on test-taking strategies.” (NBME, 2001, p. 19)
In this article, we will examine seven common flaws in the construction of multiple-choice questions that students can exploit to help them select the correct answer based on their testwiseness rather than content knowledge. By recognizing these common flaws, you can learn to write better questions for your tests and quizzes.
Grammatical Cues
Issue: This occurs when one or more of the distractors don’t follow grammatically from the question stem. This often relates to the use of a/an, verbs/nouns, or singular/plural.
Solution: Check for grammatical consistency.
- When using “a” or “an,” make sure the article agrees with the wording of every option.
- Don’t ask for a noun answer and then have a verb as an option.
- Don’t ask for a plural answer and then have a singular option.
Distractor Length Cues: “too long to be wrong”
Issue: We often feel we should include more detail in the correct answer to ensure that it is clearly the best choice. This is called the “too long to be wrong” rule. Unfortunately, testwise students can use this to their advantage.
Solution: Keep the options similar in length and level of detail.
Logical Cues
Issue: This most often occurs when one of the options contains all of the other options. Testwise students will select among similar options, not outlier options. We often struggle to generate the last option, and that option frequently ends up unrelated to the other options and to the discrimination criteria provided in the question stem.
Solution: The question should ask students to discriminate among the options along a single dimension tied to the specific criteria described in the question stem. All incorrect answers should be reasonable responses that do not fully meet those criteria.
Repeating Words
Issue: Students look for words or phrases that are repeated in both the question stem and the options. The more repetition an option shares with the stem, the more likely students are to judge it the right answer.
Solution: Move repeated words into the stem, if possible, and review and revise questions to eliminate repeats between the stem and the correct answer.
Using Absolute Terms
Issue: Many higher-level ideas are not absolutes. “Never True” and “Always True” are often too absolute. Students are experts at arguing for the outlier case that falls outside such absolutes.
Solution: Avoid absolute words and substitute words that require more judgment, such as “which is best” or “which is least likely.” Try to use wording that lets students discriminate between the options using the criteria stated in the question stem.
Non-Random Option Order
Issue: The most common location for the correct answer is C or D. The trouble is that a testwise student who is reduced to guessing will guess C or D to maximize their chances. “In numeric options, the correct answer is more often the middle number than an extreme value.” (NBME, 2001, p. 22).
Solution: Review the entire test to ensure that C and D do not appear as the correct answer too often. Pay careful attention to numerical questions, where the correct answer should not routinely be a middle value.
Convergence Strategy
Issue: The idea here is that a testwise student will select the answer that combines the most elements of all the options. The correct answer is most likely the one that shares the greatest number of elements with the other options. If a word occurs in most of the options, the correct answer will most likely contain that word; if a term appears in only one option, that option is probably not the correct answer.
Solution: This flaw most often arises when we write the correct answer first and then rearrange it to create the options. Review your finished question to see whether the convergence strategy would let a student infer the correct answer.
You can use question-writing rules (like those found in the NBME manual) to create great questions, but you should also use rules like those outlined here to check your work. Questions that are clear to one person are often imprecise to another. Find a friend or colleague who is willing to review your questions. It is much less painful to discover a question is bad in the comfort of your office than in front of your students. The ironic thing about multiple-choice question writing is that with more experience, more time is required to write a question, because our standards for what constitutes a good question rise.
You can download a free copy of the National Board of Medical Examiners (NBME) manual “Constructing Written Test Questions for the Basic and Clinical Sciences” here >>
Jim Sibley is the director of the Centre for Instructional Support at the Faculty of Applied Science, University of British Columbia.