Cold Calling and Bad Pizza
Andrew Watson

When I was in grad school, a well-known professor announced that — given everything we know about the effects of stress — it is professional malpractice to “cold call” on students. (To “cold call” means to call on a student who hasn’t raised her hand.)

Imagine the cascade of bad results.

When cold-called, the student feels stress. Cortisol levels go up. Excess cortisol interferes with learning. In fact, long-term excess cortisol damages the hippocampus. (You can check out this video here.)

My professor’s claim struck me as shocking, because Doug Lemov argues so strongly for cold calling in his much admired Teach Like a Champion:

“If I was working with a group of teachers and had to help them make the greatest possible improvements in the rigor, ratio, and level of expectations in their classroom with one technique, the technique I’d choose might well be cold call.”

That is: if we want students themselves to be doing cognitive work — a.k.a. “active learning” — Lemov thinks cold calling is the way to go. It serves four key functions:

First, it lets the teacher check students’ understanding,

Second, it creates a culture of “engaged accountability,”

Third, it helps the teacher speed up or slow down the pace, and

Fourth, it supplements other teaching strategies, like “turn and talk.”

Little wonder Lemov champions it so heartily.

Breaking the Tie?

We’ve got an expert in the neurobiology of stress saying cold calling is professional malpractice. We’ve got an expert in classroom teaching saying that cold calling is professional best practice.

How do we decide?

On this blog, we try always to find relevant research. In this case, the best study I can find was undertaken by Dallimore, Hertenstein, and Platt.

Team Dallimore — aware of both sides of this debate — looked at 16 sections of a college accounting course, including well over 600 students.

They kept track of the professors’ discussion techniques: in particular, did they cold call or not?

And, they followed a number of variables: in particular, how much did students voluntarily participate? And, how comfortable were the students in class discussion? (In other words: what happened to those cortisol levels my professor worried about?)

If the answers to those questions show a clear pattern, that might help us decide to follow my prof’s guidance, or Lemov’s.

The Envelope Please

In brief: cold calling produced good thinking results, and lowered (apparent) stress levels.

That is: in classes with infrequent cold calling, students’ voluntary participation remained the same throughout the term. In classes with high cold calling, their voluntary participation rose from 68% to 86%.

Dallimore’s team saw the same results with the number of questions students volunteered to answer. That number remained flat in the low cold calling classes, and rose in the high cold calling classes.

And, how about stress?

When students were asked to report their comfort level with class discussion, that level remained constant in low cold calling sections. Comfort levels rose in high cold calling sections.

So: when teachers cold called, their students voluntarily participated more, and they felt more comfortable in class.

Always with the Limitations

Dallimore’s study — combined with Lemov’s insight, guidance, and wisdom — suggests that cold calling really can benefit students.

However, any good teaching technique can be used badly. If it’s possible to make a bad pizza, it’s possible to make a bad version of any great thing.

So, if we’ve got students who have experienced ongoing trauma, we should make reasonable accommodations. If a student has an IEP that warns against cold calling, we should — of course! — heed that warning.

Also, I should acknowledge the limitations of this research. The study I’ve described was published in 2012, and it’s the most recent one I have located. Simply put: we don’t have much research on the topic.

And: research done with accounting students — most of whom are college sophomores — might not apply to your students.

Of course, Lemov works mostly with K-12 students, especially those who attend schools that have relatively high poverty rates. In other words: Dallimore’s research + Lemov’s research shows a wide range of effectiveness for this technique.

In sum: I’m sure teachers can use cold calling techniques badly — resulting in raised stress and reduced learning. But, done well, this technique offers real benefits.

If we create a respectful, supportive, and challenging classroom climate — including cold call — students can learn splendidly. This video shows the technique in action.

Are “Retrieval Practice” and “Spacing” Equally Important? [Updated]
Andrew Watson

If you follow research in the world of long-term memory, you know you’ve got SO MANY GOOD STRATEGIES.

Agarwal and Bain’s Powerful Teaching, for instance, offers a delicious menu: spacing, interleaving, retrieval practice, metacognition.

Inquiring minds want to know: how do we best choose among those options? Should we do them all? Should we rely mostly on one, and then add in dashes of the other three? What’s the ideal combination?

One Important Answer

Dr. Keith Lyle and his research team wanted to know: which strategy has greater long-term impact in teaching college math: retrieval practice or spacing?

That is: in the long term, do students benefit from more retrieval? From greater spacing? From both?

To answer this really important question, they carefully designed weekly quizzes in a college precalculus class. Some topics, at “baseline,” were tested with three questions at the end of the week. That’s a little retrieval practice, and a few days of spacing.

Some topics were tested with six quiz questions at the end of the week. That’s MORE retrieval practice, but the same baseline amount of spacing.

Some topics were tested with three quiz questions spread out over the semester. That’s baseline retrieval practice, but MUCH GREATER spacing.

And, some topics were tested with six quiz questions spread out over the semester. That’s extra retrieval AND extra spacing.

They then measured: how did these precalculus students do when tested on those topics on the final exam? And — hold on to your hats — how did they do when tested a month later, when they started taking the follow-up class on calculus?

Intriguing Answers…

Lyle and Co. found that — on the precalculus final exam…

…extra retrieval practice helped (about 4 percentage points), and

…extra spacing helped (about 4 percentage points), and

…combining extra retrieval with extra spacing helped more (about 8 percentage points).

So, in the relatively short term, both strategies enhance learning. And, they complement each other.

What about the relatively longer term? That is, what happened a month later, on the pre-test for the calculus class? In that case…

…extra retrieval practice didn’t matter,

…extra spacing helped (about 4 percentage points), and

…combining extra retrieval with extra spacing produced no extra benefit (still about 4 percentage points).**

For enduring learning, then, extra spacing helped, but extra retrieval practice didn’t.

…Important Considerations

First: as the researchers note, it’s important to stress that this research comes from the field of math instruction. Math — more than most disciplines — already has retrieval practice built into it.

That is: when I do math homework, every problem I solve requires me (to some degree) to recall the math task at hand. (And, probably, lots of other relevant math info as well.)

But, when I do my English homework, the paper I’m writing about Macbeth might not remind me about Grapes of Wrath. Or, when I do my History homework, the time I spend studying Aztec civilization doesn’t necessarily require me to recall facts or concepts from the Silk Road unit. (It might, but might not.)

So, this study shows that extra retrieval practice didn’t help over and above the considerable retrieval practice the math students were already doing.

Second: notice that the “spacing” in this case was a special kind of spacing. It was, in fact, spacing of retrieval practice. Of course, that counts as spacing.

But, we have lots of other ways to space as well. For instance, Dr. Rachael Blasiman tested spacing by taking time in lectures to revisit earlier concepts. That strategy did create spacing, but didn’t include retrieval practice.

So, this research doesn’t necessarily apply to other kinds of spacing. It might, but we don’t yet know.

Practical Classroom Applications

Lyle & Co.’s study gives us three helpful classroom reminders.

First: as long as we’ve done enough retrieval practice to establish ideas (as math homework does almost automatically), we can redouble our energies to focus on spacing.

Second: Lyle mentions in passing that students do (very slightly) worse on quizzes that include spacing — because spacing is harder. (Regular readers know, we call this “desirable difficulty.”)

This reminder gives us an extra reason to be sure that quizzes with spacing are low-stakes or no-stakes. We don’t want to penalize students for participating in learning strategies that benefit them.

Third: In my own view, we can ask/expect our students to join us in retrieval practice strategies. Once they reach a certain age or grade, they should be able to make flashcards, use Quizlet, or test one another.

However, I think spacing requires a different perspective on the full scope of a course. That is: it requires a teacher’s perspective. We have the long view, and see how all the pieces best fit together.

For those reasons, I think we can (and should) ask students to do retrieval practice (in addition to the retrieval practice we create). But, we ourselves should take responsibility for spacing. We — much more than they — have the big picture in mind. We should take that task off their to-do list, and keep it squarely on ours.


** This post has been revised on 3/7/30. The initial version did not include the total improvement created by retrieval practice and spacing one month after the final exam.

Where Should Students Study?
Andrew Watson

We’ve got lots of advice for the students in our lives:

How to study: retrieval practice

When to study: spacing effect

Why study: so many answers

Where to study: …um, hold please, your call is very important to us…

As can happen, research provides counter-intuitive — and sometimes contradictory — answers to that last question.

I grew up hearing the confident proclamation that we should create a perfect study environment in one place, and always study there. (The word “library” was spoken in reverent tones.)

As I think about the research I’ve seen in the last ten years, my own recommendations to students have been evolving.

Classic Beginnings

In a deservedly famous study, Smith, Glenberg, and Bjork (1978) tried to measure the effect of environment on memory.

They found that, in the short run, I associate the words that I learn in this room with the room itself. That is: if I learn words in room 27, I’ll do better on a test of those words in room 27 than in room 52.

One way to interpret those findings is that we should teach in the place where students will be tested.

If the final exam, inevitably, is in the gym, I should teach my students in the gym. And they should study in the gym. This approach ensures that they’ll associate their new knowledge with the place they have to demonstrate that knowledge.

In this theory, students should learn and study in the place they’ll ultimately be tested.

Priority Fix #1

This interpretation of Smith’s work makes sense if — and only if — the goal of learning is to do well on tests.

Of course, that’s not my goal. I don’t want my students to think carefully about literature for the test; I want them to think carefully about literature for life.

I want them to have excellent writing skills now, and whenever in the future they need to write effectively and clearly.

We might reasonably worry that a strong association between the room and the content would limit transfer. That is: if I connect the material I’ve learned so strongly with room 27, or the gym, I might struggle to remember or use it anywhere else.

Smith worried about that too. And, sure enough, when he tested that hypothesis, his research supported it.

In other words, he found that students who study material in different locations can use it more flexibly elsewhere. Students who study material in only one location can’t transfer their learning so easily. (By the way: Smith’s research has been replicated. You can read about this in Benedict Carey’s How We Learn. Check out chapter 3.)

This finding leads to a wholly different piece of advice. Don’t do what my teachers told me to do when I was a student. Instead, study material in as many different places as reasonably possible. That breadth of study will spread learning associations as widely as possible, and benefit transfer.

That’s what I’ve been telling students for the last several years.

Voila. Generations of teaching advice overturned by research!

Priority Fix #2

Frequent readers have heard me say: “Researchers work by isolating variables. Schools work by combining variables.”

The longer I do this work, the longer I think that this “where to study” advice makes sense only if I focus exclusively on that one variable.

If I start adding in other variables, well, maybe not so much.

True enough, research shows that I’ll remember a topic better if I study it in different places … as long as all other variables are held constant. But, in life, other variables aren’t constant.

Specifically, some study locations are noisier than others. Starbucks is louder than the library: it just is. And, some locations are visually busier than others.

And, as you would expect, noise — such as music — distracts from learning. So, too, do visually busy environments.

So, a more honest set of guidelines for students goes like this:

You should review material in different places. But, you want each of those places to be quiet. And, you don’t want them to have much by way of visual distraction.

You know what that sounds like to me? The library.

I suppose it’s possible for students to come up with several different study locations that are equally quiet and visually bland. Speaking as a high school teacher, I think it’s unlikely they’ll actually do that.

So, unless they’ve got the bandwidth to manage all those demands even before they sit down to study, then I think the traditional advice (“library!”) is as good as anything.

Final Thoughts

People occasionally ask me where I am in the “traditional vs. progressive” education debate.

The honest answer is: I’m indifferent to it. I (try to) focus on practical interpretations of pertinent psychology and neuroscience research.

If that research leads to a seemingly innovative suggestion (“study in many locations!”), that’s fine. If it leads to a traditional position (“library”), that’s equally fine.

I think that, for the most part, having teams in education (prog vs. trad) doesn’t help. If we measure results as best we can, and think humbly and open-mindedly about the teaching implications, we’ll serve our students best.

“How We Learn”: Wise Teaching Guidance from a Really Brainy Guy
Andrew Watson

Imagine that you ask a neuro-expert: “What’s the most important brain information for teachers to know?”

The answer you get will depend on the expertise of the person you ask.

If you ask Stanislas Dehaene, well, you’ll get LOTS of answers — because he has so many areas of brain expertise.

He is, for example, a professor of experimental cognitive psychology at the Collège de France; and Director of the NeuroSpin Center, where they’re building the largest MRI gizmo in the world. (Yup, you read that right. IN THE WORLD.)

He has in fact written several books on neuroscience: neuroscience and reading, neuroscience and math, even neuroscience and human consciousness.

He’s also President of a newly established council to ensure that teacher education in all of France has scientific backing: the Scientific Council for Education. (If the United States had such a committee, we could expunge Learning Styles myths from teacher training overnight.)

If that’s not enough, Dehaene is interested in artificial intelligence. And statistics. And evolution.

So, when he writes a book called How We Learn: Why Brains Learn Better than Any Machine…for Now, you know you’re going to get all sorts of wise advice.

Practical Teaching Advice

Dehaene wants teachers to think about “four pillars” central to the learning process.

Pillar 1: Attention

Pillar 2: Active engagement

Pillar 3: Error feedback

Pillar 4: Consolidation

As you can see, this blueprint offers practical and flexible guidance for our work. If we know how to help students pay attention (#1), how to help them engage substantively with the ideas under discussion (#2), how to offer the right kind of feedback at the right time (#3), and how to shape practice that fosters consolidation (#4), we’ll have masterful classrooms indeed.

Learning, of course, begins with Attention: we can’t learn about things we don’t pay attention to. Following Michael Posner’s framework, Dehaene sees attention not as one cognitive process, but as a combination of three distinct cognitive processes.

Helpfully, he simplifies these processes into three intuitive steps. Students have to know:

when to pay attention

what to pay attention to, and

how to pay attention.

Once teachers start thinking about attention this way, we can see all sorts of new possibilities for our craft. Happily, he has suggestions.

Like other writers, Dehaene wants teachers to focus on active engagement (pillar #2). More than other writers, he emphasizes that “active” doesn’t necessarily mean moving. In other words, active engagement requires not physical engagement but cognitive engagement.

This misunderstanding has led to many needlessly chaotic classroom strategies, all in the name of “active learning.” So, Dehaene’s emphasis here is particularly helpful and important.

What’s the best way to create cognitive (not physical) engagement?

“There is no single miraculous method, but rather a whole range of approaches that force students to think for themselves, such as: practical activities, discussions in which everyone takes part, small group work, or teachers who interrupt their class to ask a difficult question.”

Error Feedback (pillar #3) and Consolidation (#4) both get equally measured and helpful chapters. As with the first two, Dehaene works to dispel myths that have muddled our approaches to teaching, and to offer practical suggestions to guide our classroom practice.

Underneath the “Four Pillars”

These four groups of suggestions all rest on a sophisticated understanding of what used to be called the “nature/nurture” debate.

Dehaene digs deeply into both sides of the question to help teachers understand both the brain’s adaptability (“nurture”) and the limits of that adaptability (“nature”).

To take but one example: research with babies makes it quite clear that brains are not “blank slates.” We come with pre-wired modules for processing language, numbers, faces, and all sorts of other things.

One example in particular surprised me: probability. Imagine that you put ten red marbles and ten green marbles in a bag. As you start drawing marbles back out of that bag, a 6-month-old will be surprised — and increasingly surprised — if you draw out green marble after green marble after green marble.

That is: the baby understands probability. They know it’s increasingly likely you’ll draw a red marble, and increasingly surprising that you don’t. Don’t believe me? Check out chapter 3: “Babies’ Invisible Knowledge.”
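The marble intuition is easy to check numerically. Here’s a quick back-of-the-envelope sketch (mine, not Dehaene’s) assuming draws without replacement from a bag of ten green and ten red marbles: each additional all-green draw becomes less likely, which is exactly why the run grows increasingly surprising.

```python
from fractions import Fraction

def p_all_green(draws, green=10, red=10):
    """Probability of drawing `draws` green marbles in a row,
    without replacement, from a bag of `green` green and `red` red marbles."""
    p = Fraction(1)
    total = green + red
    for i in range(draws):
        # After i green marbles are gone, green-i greens remain out of total-i.
        p *= Fraction(green - i, total - i)
    return p

# The first green draw is a coin flip (1/2); longer runs get steadily rarer.
for k in (1, 3, 5, 10):
    print(k, float(p_all_green(k)))
```

By the tenth consecutive green marble, the probability has fallen below one in 100,000 — even a 6-month-old’s implicit statistics has plenty to be surprised about.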

Of course, Dehaene has fascinating stories to tell about the brain’s plasticity as well. He describes several experiments — unknown to me — where traumatized rats were reconditioned to prefer the room where the traumatizing shock initially took place.

He also tells the amazing story of “neuronal recycling.” That is: the neural real-estate we train to read initially housed other (evolutionarily essential) cognitive functions.

Human Brains and Machine Learning

Dehaene opens his book by contemplating definitions of learning — and by contrasting humans and machines in their ability to do so.

By one set of measures, computers have us beat.

For instance, one computer was programmed with the rules of the game Go, and then trained to play against itself. In three hours, it became better at the game than the human Go champion. And, it got better from there.

However, Dehaene still thinks humans are the better learners. Unlike humans, machines can’t generalize their learning. In other words: that Go computer can’t play any other games. In fact, if you changed the size of the Go board even slightly, it would be utterly stumped.

And, unlike humans, it can’t explain its learning to anyone else.

And, humans need relatively little data to start learning. Machines do better than us when they can crank millions of calculations. But, when they calculate as slowly as we do, they don’t learn nearly as much as we do.

As his subtitle reassures us, brains learn better than any machine. (And, based on my conversation with him, it’s clear that “…for now” means “for the long foreseeable future.”)

Final Thoughts

At this point, you see what I mean when I wrote that Dehaene has an impressive list of brain interests, and therefore offers an impressive catalog of brain guidance.

You might, however, wonder if this much technical information ends up being a little dry.

The answer is: absolutely not.

Dehaene’s fascination with all things brain is indeed palpable in this book. And, his library of amazing studies and compelling anecdotes keeps the book fresh and easy-to-read. I simply lost track of the number of times I wrote “WOW” in the margin.

This has been a great year for brain books. Whether you’re new to the field, or looking to deepen your understanding, I recommend How We Learn enthusiastically.

https://www.youtube.com/watch?time_continue=62&v=23KWKoD8xW8&feature=emb_logo

An Unexpected Strategy to Manage Student Stress
Andrew Watson

School includes lots of stress. And, sometimes that stress interferes with academic life.

It might make it harder for students to encode new information. It might make it harder for them to show what they know — on tests, for example.

So, how can we help students manage their stress?

We’ve got some research suggesting that mindfulness helps. Can we do anything else?

Rethinking Our First Instinct

Imagine that a student comes to me and says, “Whoa! I’m really stressed out about this test…”

My gut instinct might be to say something reassuring: “No worries — you totally got this. Just stay calm and I’m sure you’ll do fine.”

This instinct, however, has a built-in problem. An anxious student experiences well-known physiological symptoms: a racing heart, sweaty palms, dry mouth, etc.

My student might try to persuade himself that he’s calm. But, all that physiological evidence reminds him — second by second — that he really isn’t calm.

Researcher Alison Wood Brooks wondered: could she encourage students to adopt a positive emotional framework with those same physiological signs?

Rather than encouraging a student to “be calm,” Brooks thought she might encourage him to “get excited.” After all, the bodily signs of excitement are a lot like those of stress. And, whereas stress feels mostly negative, excitement is (obviously) positive.

Testing (and Retesting) the Hypothesis

Brooks tested out this hypothesis in an impressive variety of stressful situations.

She started by having participants sing in a karaoke contest. One group prepped by saying “I am anxious.” A second group said “I am excited.” A third didn’t say either of those things.

Sure enough, the “excited” group sang their karaoke song considerably more accurately (81%) than their “anxious” peers (53%).

She then tried the ultimate in stress-inducing situations: public speaking.

Half of the speakers prepped by declaring themselves “calm” (which was my go-to suggestion above). The other half declared themselves “excited.”

As Brooks expected, independent judges rated the “excited” speakers superior to the “calm” speakers in persuasiveness, competence, and confidence.

One more approach may be most interesting to classroom teachers: a math test.

When getting ready for a “very difficult” test including eight math questions, students were told either “try to remain calm” or “try to get excited.”

You know how this story ends.

The students instructed to “get excited” scored, on average, about 1/2 point higher than their “calm” peers.

Every way that Brooks could think to measure the question, the advice to “get excited” proved more beneficial than the traditional advice to “remain calm.”

Not Persuaded Yet?

Perhaps this video, which neatly recaps Brooks’s study, will persuade you. Check out the handy graphic at 1:30.

https://www.youtube.com/watch?v=1rRgElTeIqE


Balancing Direct Instruction with Project-Based Pedagogies
Andrew Watson

A month ago, I wrote about a Tom Sherrington essay proposing a truce between partisans of direct instruction and those of project-based learning (and other “constructivist pedagogies”).

In brief, Sherrington argues that both pedagogical approaches have their appropriate time in the learning process.

EARLY in schema formation, direct instruction helps promote learning for novices.

LATER in schema formation, project-based pedagogies can apply, enrich, and connect concepts for experts.

Today’s Update

At the time I wrote about Sherrington’s essay, it was available in a book on Education Myths, edited by Craig Barton.

I do recommend that book–several of its essays offer important insights. (See this post on Clare Sealy’s distinction between autobiographical and semantic memory.)

If you’d like to read Sherrington’s essay right away, I have good news: he has published it on his website.

Happily, his contribution to the debate is now more broadly available.

A Final Note

Like other thinkers in this field, Sherrington proposes the novice/expert divide as the most important framework for understanding when to adapt pedagogical models.

In my own thinking, I’m increasingly interested in understanding and defining the transition points from one to the other.

That is: how can we tell when our novices have become experts?

What are the signs and symptoms of expertise? How can we describe those signs and symptoms so that 3rd grade teachers and 7th grade teachers can make sense of them?

Or, science teachers and history teachers?

Or, soccer coaches as well as dance instructors?

In other words: I agree with Sherrington’s framework, but I think it’s incomplete without clearer guidance about the novice/expert continuum.

Concrete + Abstract = Math Learning
Andrew Watson

Early math instruction includes daunting complexities.

We need our students to understand several sophisticated concepts. And, we need them to learn a symbolic language with which to represent those concepts.

Take, for example, the concept of equivalence. As adults, you and I can readily solve this problem: 3 + 4 = 4 + __

Early math learners, however, can easily stumble. Often, they take the equals sign to mean “add up all the numbers,” and believe the correct answer to that question is “10.”

How can we help them through this stage of understanding?

Strategy #1: Switch from Abstract to Concrete

The first answer to the question seems quite straightforward. If the abstract, symbolic language of math (“3+4=___”) confuses students, let’s switch to a more concrete language.

For instance: “If my frog puppet has three oranges, and your monkey puppet has four oranges, how many oranges do they have together?”

It just seems logical: the switch from abstract to concrete ought to help.

Alas, those concrete examples have a hidden downside.

As Dan Willingham argues in Why Don’t Students Like School?, humans naturally focus on surface features of learning.

When children see monkeys and frogs and oranges, they associate the lesson with those specific entities–not with the underlying mathematical properties we want them to learn.

In edu-lingo, concrete examples can inhibit transfer. Students struggle to transfer a lesson about oranges and puppets to anything else.

Strategy #2: “Fade” from Concrete to Abstract

Taking their cue from Jerome Bruner, psychology researchers wondered if they could start with concrete examples and then, over time, switch to more abstract examples.

For instance, students might start learning about mathematical equivalence by using a balance. When they put an equal number of tokens on both sides, the balance is level.

In the second step, they do practice problems with pictures of a balance and tokens.

And, in the final step, they see abstract representations: 2 + 5 = 5 + __.

They describe this technique as “concreteness fading.”

And, sure enough, it worked. In this case, “worked” meant that students who learned equivalence through a concreteness fading method transferred their knowledge to different–and more difficult–problems.

They did so better than students who learned in a purely abstract way. And, better than students who learned in a purely concrete way. (And even, as a control condition, better than students who started with an abstract representation, and then switched to concrete.)

By the way: these researchers tested their hypothesis both with students who had a relatively low level of knowledge in this area, and those who had a high level of knowledge. They got (basically) the same results both times.

An Essential Detail

When we teachers try to incorporate psychology research into our teaching, we can sometimes find that it conflicts with actual experience.

In this case, we might find that our young math learners just “get it” faster when we use frog puppets. Given that experience, we might hesitate to fade over to abstract teaching.

This research shows an intriguing pattern.

Sure enough, students who began with concrete examples made fewer mistakes on early practice problems. And, that finding was true for both the “concrete only” group and the “concreteness fading” groups.

In other words, the “abstract only” group did worse on the early practice problems than did those groups.

But…and this is a CRUCIAL “but”…the “concrete only” group didn’t do very well on the transfer test. Their raw scores were the lowest of the bunch.

However, the “concreteness fading” group did well on the early problems AND on the transfer test.

It seems that, as the researchers feared, too much concrete instruction reduced transfer.


In sum: “concreteness fading” gives young math learners both a helpfully clear introduction to math concepts and the abstract understanding that allows transfer.


Fyfe, E. R., McNeil, N. M., & Borjas, S. (2015). Benefits of “concreteness fading” for children’s mathematics understanding. Learning and Instruction, 35, 104–120.

When Good Classroom Assignments Go Bad
Andrew Watson

As an English teacher, I rather love this assignment for 9th graders reading Romeo and Juliet:

Choose a character from the play.

Write a short monologue–20 lines or so–exploring that character’s feelings about a particular moment, or another character.

Be sure to write in iambic pentameter.

This assignment lets my students explore a character’s point of view in thoughtful detail. It encourages empathy and imagination. And, it allows them to play with a poetic meter that’s been at the rhythmic heart of English literature since we had English literature.

So, again, as an English teacher I love it.

But as someone who knows from cognitive science, I fear it’s simply not going to work (for most 9th graders on the planet).

Good Intentions Meet Cognitive Limitations

Regular readers know that students use their working memory all the time to grok their classroom work.

Working memory is vital to all classroom learning. And, alas, we just don’t have very much of it.

And, this assignment (almost certainly) places far too great a demand on my students’ WM.

Students must use their WM to…

…choose among the characters of the play. (Yes: choices take up WM resources.)

…choose among the dramatic events their chosen character experiences.

…create a wisely empathetic response to a dramatic event. (Yes: creativity requires working memory.)

And, on top of that, to…

…express richly Shakespearean logic and emotion within a tightly structured, largely unpracticed poetic meter. (If you doubt that writing in iambic pentameter takes working memory, try rewriting this sentence in iambic pentameter. Your prefrontal cortex will be aching in no time.)

So much cognitive load will overwhelm all but the most inventive of students.

Solving the Problem

Given that this assignment could be so powerful, how might we adapt it to fit within working memory limitations?

Two strategies come quickly to mind.

First, redistribute the working memory demands. That is: don’t have students do all the WM work at the same time.

In this case, that suggestion can be easily implemented.

First night’s homework: choose the character, and describe or outline the dramatic moment.

Second night’s homework: write the monologue in modern English.

This approach spreads the working memory demands out over time. All the choosing, and some of the creativity, happens on the first night. The rest of the creativity happens on night #2.

Second, reduce the working memory demands. Unless your students have practiced with iambic pentameter A LOT more than my students have, they’re likely to struggle to compose 20 fresh lines.

My own teacherly instincts would be to have them experiment with existing poetry. For instance, a fun sonnet might serve as a scaffold for early, tentative work.

In sonnet 130, Shakespeare famously laments the use of extravagant metaphors to hyper-praise women:

My mistress’ eyes are nothing like the sun.

Coral is far more red than her lips’ red.

And yet, by heav’n, I think my love as rare

As any she belied with false compare.

Can my students devise their own version of these sentiments? And, can they preserve the meter?

My boyfriend’s eyes are not as blue as sky.

For reals, his abs just aren’t what you’d call “shredded.”

And yet, by heav’n, I think my guy as hott

As any bae that Beyoncé has got.

Of course, scaffolding is called “scaffolding” because we can take it down. So, once students can manage iambic pentameter with this level of support, we can prompt them to devise more and more free-form iambic creations.

With enough practice, they might–some day–be able to compose 20 fresh lines of their own.

Can Multiple-Choice Tests Really Help Students?
Andrew Watson

Multiple-choice tests have a bad reputation. They’re easy to grade, but otherwise seem…well…hard to defend.

After all, the answer is RIGHT THERE. How could the student possibly get it wrong?

Given that undeniable objection, could multiple-choice tests possibly be good for learning?

The Benefits of Distraction

A multiple-choice test includes one correct answer, and other incorrect answers called “distractors.” Perhaps the effectiveness of a multiple-choice question depends on the plausibility of the distractors.

So, a multiple-choice question might go like this:

“Who was George Washington’s Vice President?”

a) John Adams

b) Mickey Mouse

c) Tom Brady

d) Harriet Tubman

In this case, none of the distractors could possibly be true. However, I could ask the same question a different way:

“Who was George Washington’s Vice President?”

a) John Adams

b) Thomas Jefferson

c) Alexander Hamilton

d) James Madison

In THIS case, each of the distractors could reasonably have held that role. In fact, all three worked closely with–and deeply admired–Washington. One of the three (Jefferson) did go on to serve as vice president, another (Madison) became president, and the third (Hamilton) was killed by a VP.

Why would the plausibility of the distractor matter?

We know from the study of retrieval practice that pulling information out of my brain benefits memory more than repeatedly putting information into it.

So, we might hypothesize this way:

If the distractors are implausible, a student doesn’t have to think much to figure out the correct answer. No retrieval required.

But, if the distractors are plausible, then the student has to think about each one to get the answer right. That’s lots of retrieval right there.

In other words: plausible distractors encourage retrieval practice, and thereby might enhance learning.

Better and Better

This line of reasoning leads to an even more delicious possibility.

To answer that question about Washington’s VP, the student had to think about four people: Adams, Jefferson, Hamilton, Madison.

Presumably she’ll learn the information about Adams–who was the correct answer to the question.

Will she also learn more about the other three choices? That is: will she be likelier to answer a question about Alexander Hamilton correctly? (“Who created the first US National Bank as Washington’s Secretary of the Treasury?”)

If the answer to that question is YES, then one multiple-choice question can help students consolidate learning about several different facts or concepts.

And, according to recent research, the answer is indeed YES.

The research paradigm used to explore this question requires lots of complex details, and goes beyond the scope of a blog post. If you’re interested, check out the link above.

Classroom Implications

If this research holds up, we might well have found a surprisingly powerful tool to help students acquire lots of factual knowledge.

A well-designed multiple-choice question–that is: one whose plausible distractors require lots of careful thought–helps students learn four distinct facts or concepts.

In other words:

“Multiple-choice questions…

a) are easy to grade

b) help students learn the correct answer

c) help students learn information about the incorrect answers

or

d) all of the above.”

Me: I’m thinking d) sounds increasingly likely…

Andrew Watson

Earlier this month, I wrote about the distinction between autobiographical memory and semantic memory.

Both kinds help us live meaningful lives.

But, schools focus on semantic memory: we want our students to know facts and skills over the long term.

We don’t really need them to remember the class or the exercise (or even the teacher) who taught them those facts and skills. That’s autobiographical memory.

That blog post was inspired by Clare Sealy’s recent essay ironically entitled “Memorable Experiences Are the Best Way to Help Children Remember Things.”

Happily, Sealy is the guest on a recent EdNext podcast: you can hear her in-depth explanation.

Equally happily, that podcast page includes Sealy’s essay itself.

To understand Sealy’s argument, and its full implications, you can both have a look and have a listen.