As we gear up for the start of a new school year, we’re probably hearing two words over and over: retrieval practice.
That is: students have two basic options when they go back over the facts, concepts, and procedures they’ve learned.
Option 1: they could review it; that is, reread a passage, or rewatch a video, or review their notes.
Option 2: they could retrieve it; that is, ask themselves what they remember about a passage, a video, or a page of notes.
Well, the research verdict is clear: lots of research shows that OPTION 2 is the winner. The more that students practice by retrieving, the better they remember and apply their learning in the long term.
This clear verdict, however, raises lots of questions.
How, exactly, should we use retrieval practice in classrooms?
Does it work in all disciplines and all grades?
Is its effectiveness different for boys and girls?
Does retrieval practice help students remember material that they didn’t practice?
Do multiple choice questions count as retrieval practice?
And so forth.
Given that we have, literally, HUNDREDS of studies looking at these questions, we teachers would like someone to sort through all these sub-questions and give us clear answers.
Happily, a research team recently produced just such a meta-analysis. They looked at 222 studies including more than 48,000 students, and asked nineteen specific questions.
These numbers are enormous.
Studies often get published with a few dozen participants – which is to say, far fewer than 48,000.
Researchers often ask 2 or 3 questions – or even 1. I don’t recall ever seeing a study or meta-analysis considering nineteen questions.
As a result, we've got a lot to learn from this meta-analysis, and we can feel more confident than usual in its conclusions.
The Big Picture
For obvious reasons, I won’t discuss all nineteen questions in detail. Instead, I’ll touch on the big-picture conclusions, highlight some important questions about practical classroom implementation, and point out a few surprises.
The high-level findings of this meta-analysis couldn’t be more reassuring.
YES: retrieval practice enhances long-term memory.
YES: in fact, it enhances memory of facts and concepts, and improves subsequent problem solving. (WOW.)
YES: it benefits students from kindergarten to college, and helps in all 18 (!!) disciplines that the researchers considered.
NO: the student’s gender doesn’t matter. (I was honestly a little surprised they studied this question, but since they’ve got an answer I’m reporting it here.)
I should note that these statistical results mostly fall in the "medium effect size" range: a Hedges' g of something like 0.50. Because I'm commenting on so many findings, I won't comment on statistical values unless they're especially high or low.
So the easy headline here is: retrieval practice rocks.
Making Retrieval Practice Work in the Classroom
Once we teachers know that we should use retrieval practice, we've got some practical questions about putting it to work.
Here again, this meta-analysis offers lots of helpful guidance.
Does it help for students to answer similar questions over multiple days?
Yes. (Honestly, not really surprising – but good to know.)
More specifically: “There is a positive relationship between the number of [retrieval practice] repetitions and the [ultimate learning outcome], indicating that the more occasions on which class content is quizzed, the larger the learning gains.”
Don’t just use retrieval practice; REPEAT retrieval practice.
Is feedback necessary?
Feedback significantly increases the benefit of retrieval practice – but the technique provides benefits even without feedback.
Does the mode matter?
Pen and paper, clicker quizzes, online platforms: all work equally well.
Me: I write "do now" questions on the board and my students write down their answers. If you want to use Quizlet or mini-whiteboards, those strategies will work just as well.
Does retrieval practice help students learn untested material?
This question takes a bit of explaining.
Imagine I design a retrieval exercise about Their Eyes Were Watching God. If I ask my students to recall the name of Janie's first husband (Logan Killicks), that question will help them remember his name later on.
But: will it help them remember the name of her second husband? Or, her third (sort-of) husband?
The answer is: direct retrieval practice questions help more, but this sort of indirect prompt has a small effect.
In brief, if I want my students to remember the names Jody Starks and Vergible Woods, I should ask them direct questions about those husbands.
Shiver Me Timbers
So far, these answers reassure me, but they don’t surprise me.
However, the meta-analysis did include a few unexpected findings.
Does the retrieval question format matter? That is: is “matching” better than “short answer” or “free recall” or “multiple choice”?
To my surprise, “matching” and “fill-in-the-blank” produce the greatest benefits, and “free recall” the least.
This finding suggests that the popular "brain dump" approach ("write down everything you remember about our class discussion yesterday!") produces the smallest benefits.
I suspect that “brain dumps” don’t work as well because, contrary to the advice above, they don’t directly target the information we want students to remember.
Which is more effective: a high-stakes or a low-stakes format?
To my astonishment, both worked (roughly) equally well.
So, according to this meta-analysis, you can grade or not grade retrieval practice exercises. (I will come back to this point below.)
Should students collaborate or work independently on retrieval practice answers?
The studies included in the meta-analysis suggest no significant difference between these approaches. However, the researchers note that they don’t have all that many studies on the topic, so they’re not confident about this answer. (For a number of reasons, I would have predicted that individual work helps more.)
Beyond the Research
I want to conclude by offering an opinion that springs not from research but from experience.
Historically, "retrieval practice" went by a different name. Believe it or not, it was initially called "the testing effect." (In fact, the authors of this meta-analysis use this term.)
While I understand why researchers use it, I think we can agree that “the testing effect” is a TERRIBLE name.
No student anywhere wants to volunteer for more testing. No teacher anywhere either.
And – crucially – the benefits have nothing to do with "testing." We don't need to grade these exercises. Students don't need to study for them. The retrieving itself IS the studying.
For that reason, I think teachers and schools should focus as much as possible on the “retrieval” part, and as little as possible on the “testing.”
No, HONESTLY, students don’t need to be tested/graded for this effect to work.
Retrieval practice — in almost any form — helps almost everybody learn, remember, and use almost anything.
As long as we don’t call it “testing,” schools should employ retrieval strategically and frequently.
Yang, C., Luo, L., Vadillo, M. A., Yu, R., & Shanks, D. R. (2021). Testing (quizzing) boosts classroom learning: A systematic and meta-analytic review. Psychological Bulletin, 147(4), 399.