Monthly Archives: September 2013

Testing the Test of Legal Problem Solving

A student preparing for the Multistate Bar Exam might liken the exam to facing 200 Rubik's-cube-style puzzles. Okay, the questions aren't as hard as solving really tough puzzles, but examinees may sometimes feel that the MBE is that difficult.

Bonner and D’Agostino (2012) in their research study on the Multistate Bar Exam (MBE) test the test by asking:
• How important is a test-taker’s knowledge of solving legal problems to performance on the MBE?
• To what degree is performance on the MBE dependent on general knowledge, which doesn’t have anything to do with the law or legal reasoning? For instance, are test-wise strategies and common-sense reasoning important to doing well on the MBE?


With thousands of examinees taking this exam annually since 1972, you would think that the answer is a resounding yes to the first question and a tepid yes to the second. Why else would most states rely on the test if it had not been proven valid? But surprisingly, there has been no published evidence that this high-stakes exam does in fact "assess the extent to which an examinee can apply fundamental legal principles and legal reasoning to analyze a given pattern of facts," as one article characterized the MBE — in other words, that the MBE tests skills in legal problem solving.

Validity is the worth of any test; it is not an all-purpose quality, for it has reference only to a test's purpose. Establishing validity is not uncommon in other fields such as medical licensure, where studies have established the validity of measures of medical clinical competence, the internal medicine in-training exam, and the family physician's examination.

A finding that common-sense reasoning and general test-wise strategies are important to MBE performance would in fact indicate that the test lacks validity as a measure of legal problem solving. There is indirect support that common sense and test-wise strategies aren't what matter. In his review of the research, Klein (1993) refers to a study in which law students outperformed college students, untrained in the law, on the MBE. But we can't conclude from this that legal training, rather than common sense and test-wise strategies, explains the law students' superior performance, since maturation effects (their simply being older and more experienced) cannot be ruled out.

The Study

In devising a study to measure the validity (construct and criterion) of the MBE, Bonner and D'Agostino (2012) drew on the very large body of research on novice-expert differences in problem solving. Expert-novice studies compare how people with different levels of experience in a particular field go about solving problems in that field, looking at the continuum of expertise between the two extremes and the spaces in between. By doing this, scientists hope to see what beginners and intermediates need to do to reach the next level. Cognitive scientists have examined these differences in many areas — mathematics, physics, chess playing, and medical diagnosis — but legal problem solving hasn't received much attention.

So what does this research tell us? Experts draw on a wealth of substantive knowledge to solve a problem in their area of expertise. They know more about the field and have a deeper understanding of its subtleties. An expert's knowledge is organized into complex schemas (abstract mental structures) that allow the expert to quickly home in on the relevant information. And the advantage isn't limited to the substance of the area (called declarative knowledge); experts also have better executive control of the processes by which they solve a problem. They have better "software" that runs subroutines to weed out bad paths to a solution. In his article on legal reasoning skills, Krieger (2006) found that legal experts engage in "forward-looking processing."

By contrast, novices and intermediates are rookies of varying degrees of experience in the domain in question. Intermediates are a little better off than novices because they have more knowledge of the substance of a field, but their knowledge structures — schemas — aren't as complex or accurate as experts'. Intermediates are also slower to weed out wrong solutions and less efficient at homing in on the right set of possible solutions.

Bonner and D'Agostino classified law graduates as intermediates and devised a study to peer into the processes involved in answering MBE questions. They anticipated that law school graduates would need an amalgam of general and domain-specific reasoning skills, along with thinking-about-thinking (metacognitive) skills, to do well on the MBE.

Bonner used a "think-aloud" procedure — online verbalizations made during the very act of answering an MBE question — to examine the students' mental processes. A transcript of the verbalizations was then laboriously coded into types, which in turn fell into broader categories: legal reasoning (classifying issues, using legal principles, and drawing early conclusions), general problem solving (referencing facts and using common sense), and metacognition (statements noting self-awareness of learning acts). The data were then analyzed with statistical procedures to see which behaviors were associated with choosing the correct answer on MBE questions.


Here are the results of Bonner’s and D’Agostino’s study:

• Using legal principles had a strong positive correlation (r = .66) with performance. When students organized their decisions by checking all options and marking the ones that were most relevant or irrelevant, they were more likely to use the correct legal principles.
• Using common sense and test-wise strategies had a negative, though not significant, correlation with performance. Using "deductive elimination, guessing, and seeking test clues was associated with low performance on the selected items."
• Among the metacognitive skills, organizing (reading through and methodically working through all of the options) had a strong positive correlation (r = .74) with performance. (Note: to understand the meaning of a correlation, square the r; the result is the degree to which the variation in one variable is explained by the other. In this example, 55% of the variation in MBE performance is accounted for by organizing.) When students are self-aware and monitor what they are doing, they increase their chances of picking the right answers.
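The r-squared arithmetic described in the note above can be sketched in a few lines of Python, using the correlations reported in the study (the helper name is my own, not from the paper):

```python
def variance_explained(r: float) -> float:
    """Square a correlation coefficient r to get the coefficient of
    determination: the proportion of variance in one variable
    accounted for by the other."""
    return r ** 2

# Correlations with MBE performance reported by Bonner and D'Agostino:
for label, r in [("using legal principles", 0.66), ("organizing", 0.74)]:
    print(f"{label}: r = {r}, variance explained = {variance_explained(r):.0%}")
```

Squaring r = .74 gives about .55, which is where the 55% figure for organizing comes from.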

The Takeaway

The bottom line is that the MBE does possess relevance to the construct of legal problem solving — at least with respect to the questions on which the think-aloud was performed. In short, Bonner and D'Agostino demonstrate that the MBE has validity for the purpose of testing legal problem solving.

Unsurprisingly, using the correct legal principle was strongly correlated with picking the right answer. This means a thorough understanding and recall of the legal principles are critical to performance. That puts a premium not only on knowing legal doctrine well but also on exposure to many different fact patterns, which helps students spot the legal rules and apply them to new facts. Research on analogical reasoning suggests that more diverse exposure to applications of the principles helps trigger memory and retrieval of the analogs that match a question's fact pattern. Students should take a credit-bearing, bar-related review course, and they should take this course seriously.

If you want to do well on the MBE, don't jump to conclusions and pick the first seemingly right answer. That answer could be a distractor, designed to trick impulsive test-takers, and because of the time constraints examinees are especially susceptible to picking it. But the MBE punishes impulsivity and rewards thoroughness, so examinees should go through all the possible choices and mark the good and bad ones. They shouldn't waste time on test-wise strategies: eliminating choices through pure common sense or deduction that has no reference to legal knowledge, or hunting for clues in the stem of the question. Remember, test-wise strategies unconnected with legal knowledge had a negative relationship to MBE performance.

Although errors in facts that result from poor comprehension were not prevalent among the participants in Bonner and D’Agostino’s study, that doesn’t mean that good reading comprehension is unimportant. If a student has trouble with reading comprehension, the student should start the practice of more active engagement with fact patterns and answer options as Elkin suggests in his guide.

Finally, more training in metacognitive skills should improve performance. Metacognition, which is often confused with cognitive skills such as study strategies, is practiced self-awareness. As Bonner and D'Agostino put it, "practice in self-monitoring, reviewing, and organizing decisions may help test-takers allocate resources effectively and avoid drawing conclusions without verification or reference to complete information."


Filed under Educational Psychology, legal education

Driven to Distraction?

From the back of any classroom in law school, you're likely to see students surfing the web, checking email, or texting while a lecture or class discussion is going on. A recent National Law Journal article reports a study's finding that 87% of the observed law students were using their computers for apparently non-class-related purposes (playing solitaire, checking Facebook, looking at sports scores, and the like) for more than 5 minutes during class. The study, by St. John's law professor Jeff Sovern, was published in a recent edition of the Louisville Law Review. 2Ls and 3Ls, not 1Ls, were the groups most likely to be engaged in non-class-related activities.


Conventional wisdom and common sense tell us that media multitasking is bad for learning and instruction. For stretches of time, students aren't paying attention to what's happening in class. But it's unclear whether the lack of engagement reflects reasons the students perceive as good ones; perhaps the class discussion or portion of the lecture was perceived as boring or irrelevant to what will be on a test.

Scientists have termed this student behavior "media multitasking," and it isn't confined to the classroom. Many students probably do it while studying. And media multitasking isn't just a student problem; we all do it. Psychologist Maria Konnikova writes in her New Yorker blog that the internet is like a "carnival barker" summoning us to take a look.

Konnikova points to a study that suggests a possible adverse side effect of heavy media multitasking. Ophir, Nass, and Wagner (2009) conducted an experiment where they asked the question: Are there any cognitive advantages or disadvantages associated with being a heavy media multitasker? This is one of the few studies that look at the impact of multitasking on academic performance.

In the experiment, participants were classified and assigned to two groups: heavy media multitaskers or light media multitaskers. Participants in both groups had to make judgments about visual data such as colored shapes, letters, or numbers. Ophir et al. found that when both groups were presented with distractors, heavy users were significantly slower to make judgments about the visual data than light users, and heavy users were more likely to make mistakes. Heavy users also had a more difficult time switching tasks than light users under the distractor condition. The researchers explained the results with the theory that light users exercised more top-down executive control in filtering in relevant information and thus were less prone to distraction by irrelevant stimuli. By contrast, heavy users took a more bottom-up approach of taking in all stimuli and thus were more likely to be led off track by irrelevant stimuli.

In short, the researchers concluded that being a heavy media multitasker has adverse aftereffects. If you’re a heavy user, you’re more likely to be taking in irrelevant stimuli than if you are a light user. In other words, heavy media multitasking promotes the trait of being over-attentive to everything, relevant or irrelevant. That could mean that if you’re a heavy user, your lecture or book notes are more likely to contain irrelevant information – irrespective of whether you’re actually online when you were taking notes. It’s as though heavy media use has infected your working memory with a bad trait. Because the effects are on working memory, heavy media multitaskers should be advised to go back and deliberate over their notes, and revise accordingly.

But a note of caution. The study's conclusions are a long way from being applicable to what actually happens in academic life. The study involved exposure to relevant and irrelevant visual stimuli based on shape, letter, or number recognition. So it's unclear whether the heavy user is equally distracted when presented with semantic information — textual or oral information that has meaning. And that's an important qualification of the study's conclusions, because semantic information constitutes the bulk of what we work on as students. As far as I can tell, there is no comparable study in which participants are asked to make judgments about semantic information.

Also the direction of causality is unclear. Does heavy multitasking make a person a poor self-regulator of attention? Or does heavy media multitasking happen because a person is a poor self-regulator?

Although the research is far from conclusive, we shouldn’t wait to act on what exists. Here are a few suggestions on what students and instructors can do.

• First, manage your distractions. Common sense dictates that being distracted is bad for studying, whatever the source of the distraction: the internet, romantic relationships, or television. Take a behavioral management approach. Refrain from using the internet until you've spent some quality, undistracted time studying or listening to what's happening in class, and then reward yourself with a limited amount of internet time after class.

• Second, assess your internet use. Monitoring your use alone will have a reactive effect. Just being aware of what you’re doing will cut down on the amount of irrelevant stimuli you’re exposed to.

• Third, if you identify yourself as a moderate to heavy internet consumer of non-academic content while studying or in class, be more vigilant about the quality of your lecture and study notes. These notes are more likely to contain a lot of irrelevant material, and you’ll need to excise irrelevancies.

• Fourth, instructors should take note: if students are tuning out in class, re-evaluate the lesson plan and classroom management to more fully engage the majority of students. Banning the internet from the lecture halls won't necessarily make students more engaged. If the class doesn't engage them, they will find other ways to tune out, with or without the internet.

1 Comment

Filed under Educational Psychology, legal education

George Miller's Magical Number 7, Novice Law Students, and Miller's Real Legacy

Today's NY Times Science section featured a story entitled "Seven Isn't the Magic Number for Short-Term Memory," on psychologist George Miller's classic paper on the limits of human short-term memory. His theory, established in a 1956 article, is that there is a numerical limit of about seven items that humans can retain in short-term memory. The gist of the NYT story is that the limit is quaint but outdated. What's interesting is that the same point was made nearly two decades ago by the prominent theorist Alan Baddeley (1994) in "The Magical Number Seven: Still Magic After All These Years?" His article's point is that there isn't strictly a limit of seven on short-term memory, and that there are many exceptions and qualifications to it.

Short-term memory, in contrast to long-term storage, is our workbench or working memory, which we use in the active processing of information. Short-term memory limits have a big impact on novices in a new field or endeavor, such as novice learners of the law. In fact, novice law students have greater difficulty reading case law and solving problems with it because they lack the automaticity in cognitive functioning that those with greater legal expertise have. Novice law students experience a kind of cognitive overload, as they must exert great effort on mentally intensive processes to read and interpret legal cases and to solve problems based on the cases they have read.

The NY Times article misses the point about Miller's legacy. The real legacy of his theory is that it helped move psychology toward information-processing models of human thinking. Miller's article was published in the mid-1950s, when the prevailing learning theory was behaviorism (learning based on stimulus and response). As Baddeley points out about Miller's theory,

“Miller pointed the way ahead for the information-processing approach to cognition.”

That’s the real enduring magical legacy of Miller.


Filed under Educational Psychology

The Evidence Behind Adding Civ Pro to the Multistate Bar Exam

In February 2013, the National Conference of Bar Examiners (NCBE) announced a major change to the Multistate Bar Exam (MBE). Starting with the February 2015 administration of the MBE, civil procedure will be added to the list of subject areas being tested. Currently, constitutional law, contracts, criminal law/criminal procedure, evidence, real property, and torts are tested on the MBE. Since 1972, the MBE has been a high-stakes licensing exam for attorneys and a large component of passing the bar in most U.S. jurisdictions. (Today, Louisiana and Puerto Rico are the only exceptions, now that Washington state adopted the exam this year.) The new MBE will still consist of 200 multiple-choice items (of which 190 are actually scored), but there will be fewer items for each of the current subjects in order to make room for civil procedure.

The addition of civil procedure comes as little surprise, for experimental civil procedure-like questions had appeared in recent administrations of the MBE. But the justification for the change was not apparent. A review of the NCBE's 2012 "job study" makes clear why civil procedure was added. The heart of the job study was a survey of recent law graduates with 1 to 3 years of law practice experience. Survey participants were asked to rate the significance of specified legal tasks, knowledge areas, and skills/abilities. The laws of civil procedure had the highest rating (3.08 on a scale from 1 to 4) among all the knowledge areas, and 86% of the survey participants said that civil procedure was both significant and frequently used in their work.

NCBE had contracted a private consultant to conduct the job study "to determine what new lawyers do, and what knowledge, skills, and abilities newly licensed lawyers believe that they need to carry out their work." The job study and its survey were part of NCBE's effort to establish "content validity" for current and future versions of the MBE — in other words, is the content of the MBE reflective of the actual knowledge and skills required of newly admitted attorneys?

Survey participants consisted of law graduates with 1 to 3 years of practice experience; their makeup reflected the racial composition of law graduates nationally (78% non-minority, 22% minority), and the majority (about 52%) worked in private law firms. The survey methodology for the job study is sound, and law schools should think about using the NCBE's survey results and methodology to evaluate their own curricula. Take a look at the full study and the summary of the results. However, the job study was only a first step toward establishing the MBE's test validity; it establishes only the relevance of the knowledge areas being tested — in other words, content validity.

But the bigger issue is that the evidence for the criterion and construct validity of the MBE hasn't been strong. These types of validity are essential qualities of a good test, and they raise the following questions: How well does the MBE predict whether an individual will engage in a specified level of legal thinking? What exactly are the constructs that the MBE seeks to measure, and how well does it measure them? Educational psychologist Sarah Bonner (2012) points out that "the existing criterion and construct-related validity evidence in support of the argument that the MBE is a measure of domain-specific legal thinking is not strong, suggesting a need for an inquiry into processes underlying performance." Bonner then proceeded to conduct a study on whether the MBE measures domain-specific and domain-general skills. But more about that study in a future blog post.


Filed under Educational Psychology