Multiple-choice exams in a liberal-arts curriculum (Part II)

Notwithstanding the urging of the speaker at the 1984 AALS workshop for new teachers (see Part I below), multiple-choice questions would seem, at first blush, to be inappropriate in courses that emphasize synthesis of highly abstract concepts and application of that knowledge to intentionally ambiguous facts. Consider, for example, this on-line post by a bemused law student:

I’ve never, ever been asked to take a multiple choice exam in law school. The closest I’ve come is a very few exams that have asked me to give yes/no answers followed by justifications of my answer in fewer than 50 words. If a professor gave a multiple choice exam, I’d immediately think less of him/her and less of the class. Isn’t one of the main points of law school to teach students that very few legal issues are black and white?

Put otherwise, we spend the semester trying to persuade our students that there are almost always arguments but no answers, and then we give them a test in which they are expected to identify the one answer out of four that is undeniably correct.

I imagine we can find students in every department in the humanities and most or all of the social sciences with the same complaint. Despite the criticism, however, perhaps a second blush is in order.

There is no doubt that the use of multiple-choice exams is on the rise in law schools (and, I suspect, in other parts of the university, as well). The experience of a recent UT Law grad was that “maybe three” classes used essay exams in his three years there. More and more of my colleagues here at SMU are using multiple-choice questions for all or some of their final exams. Some limit their use of multiple-choice questions to testing “black-letter law” (“Which of the following is the best statement of the doctrine of res ipsa loquitur?”), which at least addresses one of my quarrels with multiple-choice exams. If the instructor can break down the analysis of an essay question into its constituent parts (issue-spotting, black-letter law, analysis and evaluation of competing arguments), then students can get credit for what they know or can do well and lose credit for what they haven’t learned or cannot do well, and the multiple-choice form can transcend the all-or-nothing nature of most multiple-choice exams.

That last point really gets to the nub of the problem. Is it, in fact, possible to construct a series of multiple-choice questions that are a fair test of all the skills we think we are teaching and want to test for? Or at least a test that does a reasonably good job of testing for those skills, with the added advantage that the evaluation will be consistent from student to student and over time? Many in legal education believe so. Relying heavily on a 1984 essay by UT professors Marilla Svinicki and Bill Koch, Professor Stephen Bainbridge writes:

I first used a multiple choice exam when I was teaching at Illinois. The Associate Dean had assigned me two sections of Business Associations in the same semester, with a total enrollment of 230 students. Unable to face that many bluebooks, I worked up a set of objective questions and haven’t looked back since.

Professor Bainbridge goes on to praise multiple-choice exams for their flexibility, ability to produce a more comprehensive test of a semester’s worth of material, and reliability. Bainbridge admits that multiple-choice questions provide a poor test of a student’s ability to “think like a lawyer,” but he concludes that after the first year of law school, that is not terribly important. In the second and third years, he argues, “we are teaching – and testing – knowledge more than basic reasoning and communication skills.” I don’t know how many of my colleagues would agree with that statement. Thinking like a lawyer may mean different things to different law professors, and I for one believe that there are lower and higher orders of such thinking that I try to teach throughout all three years of law school and therefore want to evaluate on a final exam. It is interesting, though, that even an enthusiastic fan of multiple-choice exams perceives such a severe limitation.

The question remains: Is it possible to write a series of multiple-choice questions that test fairly for all the types of mental activity we believe we are testing for with our essay questions? (That assumes, by the way, that essay questions test fairly for all those types of mental activity. Setting aside the grading problems of subjectivity and inconsistency, that form of testing might have its own built-in flaws.) For now, I don’t have an answer. Unless something changes in a radical way, I will probably stick to MC + short essay as my preferred format. But I will also be consulting the enormous resources available to anyone who is, like me, looking for an answer. In my field, that quest will resume with Case SM, Donahue BE, Developing High-Quality Multiple-Choice Questions for Assessment in Legal Education, J Leg Educ, 2008;58:372-87 (available in the E-journal library maintained by our Central University Libraries). (I am willing to bet that comparable pieces exist in every one of our respective disciplines.) The following passage from this article gives me hope [emphasis added]:

The format selected for a test should be tied to the knowledge and skills one hopes to measure. An oral examination generally is inappropriate to assess reading skill. Similarly, a multiple-choice exam generally is inappropriate to measure writing skill. But where the goal is to measure an examinee’s knowledge of a particular field or to measure an examinee’s ability to apply legal reasoning to a variety of fact patterns, the multiple-choice format has several distinct advantages over other formats, most notably in content coverage, grading [e]ase, grading consistency, and reliability of scores.

Finally, a Google search (yes, I know . . . ) yields a treasure-trove of articles on constructing multiple-choice exams, as well as some useful guides to taking them, which, by reverse-engineering, also provide test-writers with some useful insights.
