Beyond Retrieval Practice: The Benefits of Student-Generated Questions

Retrieval Practice has gotten a lot of press in recent years — especially at our conference last fall on Deeper Learning.

The short version: students don’t benefit much from simple review — say, rereading a passage. But, they benefit a lot from actively trying to recall information — say, answering questions about that passage.

Dr. Pooja Agarwal puts it this way: Students should practice not by trying to put information into their brains, but by trying to take information out.

(She and Patrice Bain have written a great book on the topic: Powerful Teaching.)

We have LOTS of research showing that retrieval practice yields great benefits. Can other strategies match it?

Here’s an idea: maybe instead of having students answer questions (retrieval practice), we should have them create questions to be answered. Just perhaps, generating questions might boost learning more than simple review. Or — let’s get crazy: maybe generating questions boosts learning as much as retrieval practice? Even more?

Generating Research

Over the years, the “generation effect” has been studied occasionally — alas, not as much as retrieval practice. Often, research in this area includes a training session where students learn how to ask good questions. That step makes sense … but it might discourage teachers from adopting this strategy. Who has the time?

Researchers in Germany had three groups of college students read slides from a lecture about infant developmental psychology.

The first group practiced the information by rereading it. Specifically, they were instructed to memorize the content of those slides.

Group two practiced by answering questions on each slide. If they couldn’t remember an answer, they were allowed to go back and review the slide. In effect, this was “open-book retrieval practice.”

In group three,

“students were instructed to formulate one exam question in an open response format for the content of each slide [,] and also to provide an answer to that question.”

That is: they generated questions.

So, here’s the big question: when they took a surprise quiz, how did students in each group do?

Drum Roll Please…

First: Students who generated questions scored ~10% higher on that surprise quiz than those who tried to memorize information.

Second: Students who generated questions did as well as those who used retrieval practice.

Third: Questioners got these benefits even without explicit training in how to ask good questions.

Fourth: Question generators (and retrieval practicers) scored higher than mere reviewers on both factual questions and transfer questions.

Fifth: Researchers got these impressive results even though the surprise quiz took place one week later. (In research like this, those quizzes often happen right away. Of course, a week’s delay looks a lot more like genuine learning.)

We could hardly ask for better results than these. In this research paradigm, question generation worked as well as retrieval practice — which works better than almost anything else we know of to help students learn.

Explaining Amazing Results

Why would this be? Why does generating questions help students as much as answering them?

This study doesn’t answer that question directly, but it suggests a rough-n-ready answer.

Both common sense and lots o’ research tell us: students learn more when they think hard about something. (Obvi.)

If we increase the challenge of the thinking task, we prompt students to think harder and therefore to learn better.

Psychologists talk about “desirable difficulties”: a level of mental challenge that forces students to work their synapses but doesn’t overtax them.

In this case, we can reasonably hypothesize that students who must create a question on a topic have to think hard about it. To come up with a good question, they have to think at least as hard as students answering questions on that topic.

And, they have to think considerably harder than students who simply reread a passage.

Voila! Generating questions helps students learn.

A Few Caveats

As always, research provides teachers with helpful guidance. But: we need to adapt it to our own circumstances.

First: this study took place with college students. We should take care that our students can — in fact — come up with good questions.

For instance, I’m a high-school English teacher. I would use this technique with Their Eyes Were Watching God or Passing or Sula. But I don’t think I’d use it with The Scarlet Letter or Hamlet. My students struggle to understand the basics with those texts; I’m not sure they’d do a good job coming up with resonant exam questions.

More precisely: I’d structure those assignments quite differently. I suspect I could be open-ended with an assignment to create Passing questions, but would offer a lot more guidance for Scarlet Letter questions.

Second: yes, this study found that retrieval practice and question generation resulted in additional learning. And, we have a reasonable hypothesis about why that might be so.

But, we have MUCH more research about retrieval practice. Before we invest too heavily in question generation, we should keep our eyes peeled for more studies.

Third: In this paradigm, trying to memorize resulted in less learning. However, we shouldn’t conclude that students should never try to memorize. At times, “overlearning” is essential for reducing working memory load, which in turn facilitates learning.

As long as we keep these caveats in mind, we can be excited about trying out a new review technique.

And: this can work in online settings as well!
