prequestions – Education & Teacher Conferences
Answering Questions *Before* Reading: Can AI Make This Strategy Work?
Andrew Watson

2025 included MANY grand claims about transformational potential for AI:

  • “AI will enhance education (and civilization) in these magnificent ways,” or
  • “AI will destroy education (and civilization) in these grisly ways.”

Here’s a question that got less air time:

“Can you name me just one specific — even mundane — way that AI can make teaching and learning simpler and better?”

As of today, my answer to that question is YES.

Here’s the story.

Regular readers have heard me talk about the benefits of “prequestions” before: for example, here, here, and here.

The prequestion process goes like this:

  • Students try to answer questions about a topic even though they haven’t learned about it yet.
  • Unsurprisingly, they answer most of these questions incorrectly.
  • Students then read a passage on the topic.
  • Result: these students remember more about the topic than others who didn’t answer prequestions.

By the way: these prequestions can be very straightforward. For instance: “What distinguishes hydraulic brakes from mechanical brakes in automobiles?”

Researchers are still trying to figure out WHY prequestions help learning; they’ve got several theories. But we’ve got enough research on this strategy for me to conclude that it’s a thing — not a well-intentioned research-based fluke.

As we try to apply this finding in actual classrooms, we come across a few stark problems:

  • For teachers: writing prequestions takes time. (Boo!)
  • For students: how can they write their own prequestions when they haven’t studied the material? (Paradox!)

A research team — led by Dr. Steven Pan — wondered: can we use AI to generate effective prequestions? That is: do AI-created prequestions benefit learning the same way that human-generated prequestions do?

Researching Step by Step

Team Pan’s questions sound simple. But when researchers approach a topic like this, they face several demands.

First: they have lots of technical steps to follow: sample sizes and active control groups and intricate calculations and so forth. (In my estimation, they checked all these boxes.)

Second: when done well, research studies try to disprove their own hypotheses. Researchers don’t so much kick the tires as try to puncture them. (In my view, they explored plausible alternatives admirably.)

To meet these challenges, Pan’s crew undertook four related experiments. I won’t go through all the nitty-gritty, but the highlights make for encouraging reading.

In Pan’s most basic experiment, one group of students simply read a passage about different kinds of brakes: air brakes, mechanical brakes, hydraulic brakes — you get the idea. A second group answered two AI-generated prequestions about that passage before reading it. Sure enough: the students who answered the AI-created prequestions scored higher on a follow-up quiz than those who didn’t.

Unsurprisingly, they scored higher when answering the same questions that they initially saw as prequestions. They ALSO scored higher when answering novel questions. In other words: the benefits of answering prequestions go beyond the precise questions themselves to the passage overall.

Don’t Stop Now

To make sure they have a persuasive case, Pan’s team didn’t stop there.

They asked: do AI-generated prequestions provide as much benefit as human-generated prequestions?

Short answer: “yes.” Technically speaking, in some cases the human-generated prequestions led to slightly higher quiz scores — but the differences were tiny.

They asked: do AI-generated prequestions help more or less than previewing an AI-generated outline?

Short answer: “trying to answer the prequestions helped considerably more than previewing an outline.”

They asked: do detailed prompts produce better questions, or does a basic prompt work well too?

Short answer: “a basic prompt worked just fine.” In one study, the basic prompt led to more effective prequestions than the detailed one.
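For teachers who want to try this themselves, that finding is encouraging: you don’t need elaborate prompt engineering. Here’s a minimal sketch of what a “basic” prompt might look like — the function name and prompt wording below are my own illustration, not the actual prompt from Pan’s study.

```python
# A minimal sketch of a "basic" prequestion prompt a teacher might use.
# NOTE: the prompt wording here is an illustrative assumption, not the
# exact prompt used by Pan and colleagues.

def build_prequestion_prompt(passage: str, n_questions: int = 2) -> str:
    """Assemble a basic prompt asking an AI chatbot to write prequestions
    for a passage, without revealing the answers."""
    return (
        f"Write {n_questions} short factual questions that a student could "
        "try to answer BEFORE reading the following passage. "
        "Do not include the answers.\n\n"
        f"Passage:\n{passage}"
    )

# Example: paste the result into any chatbot (or send it via an API).
passage = (
    "Hydraulic brakes transmit force through fluid pressure, while "
    "mechanical brakes transmit force through cables and levers."
)
prompt = build_prequestion_prompt(passage, n_questions=2)
print(prompt)
```

The point of keeping the prompt this plain is exactly the study’s takeaway: a simple request for a couple of pre-reading questions appears to work about as well as a carefully engineered one.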

To summarize the good news in this study:

  • AI-generated prequestions help students learn from reading new information
  • They help as much as human-generated prequestions
  • Prequestions improve memory of the entire passage — not just the answers to the questions themselves
  • They help more than some other kinds of warm-up activities, like studying an outline
  • Even basic prompts work just fine

Good News…Bad News?

With all that good news, is there any bad news?

Honestly, I don’t see much “bad news” here. But — as always — I do see limitations.

  1. As far as I know, prequestions have been studied for learning from reading passages. I don’t know if we have evidence showing they benefit students who, say, listen to a discussion or a teacher presentation. For that reason, this strategy isn’t obviously road-tested for younger grades.
  2. This study focused on adult learners; the average participant age for these studies was in the low 30s. We don’t know if AI-generated prequestions will help in 8th grade — although it’s not obvious to me why they wouldn’t.
  3. This study, like most research into prequestions, tests memory after a few minutes. Will it help over longer periods of time? I don’t think we know.
  4. Prequestions aren’t a panacea. We don’t need to use them all the time. They should be one strategy that we use judiciously, not a hard-and-fast requirement.

Once we acknowledge those limitations, I think we have a compelling case. We know that — under the right circumstances — prequestions can help students learn. And, thanks to Pan and his colleagues, we know that AI-generated prequestions provide the benefits we want.

At the top of this post, I asked this question: “Can you name me just one specific — even mundane — way that AI can make teaching and learning simpler and better?”

Team Pan answers: “AI can help both students and teachers generate prequestions. And those AI-created questions help learning.”

Perhaps you’ll begin the new school year with a prequestion or two.


Pan, S. C., Schweppe, J., Teo, A. Z. J., Indrajaya, A., & Wenzel, N. (2025). Using ChatGPT-generated prequestions to improve memory and text comprehension. Journal of Applied Research in Memory and Cognition. Advance online publication. https://dx.doi.org/10.1037/mac0000254

How to Reduce Mind-Wandering During Class
Andrew Watson

I recently wrote a series of posts about research into asking questions. As noted in the first part of that series, we have lots of research that points to a surprising conclusion.

Let’s say I begin class by asking students questions about the material they’re about to learn. More specifically: because the students haven’t learned this material yet, they almost certainly get the answers wrong.


Even more specifically — and more strangely — I’m actually trying to ask them questions that they won’t answer correctly.

In most circumstances, this way of starting class would sound…well…mean. Why start class by making students feel foolish?

Here’s why: we’ve got a good chunk of research showing that these questions — questions that students will almost certainly get wrong — ultimately help them learn the correct answers during class.

(To distinguish this particular category of introductory-questions-that-students-will-get-wrong, I’m going to call them “prequestions.”)

Now, from one perspective, it doesn’t really matter why prequestions help. If asking prequestions promotes learning, we should probably ask them!

From another perspective, we’d really like to know why these questions benefit students.

Here’s one possibility: maybe they help students focus. That is: if students realize that they don’t know the answer to a question, they’ll be alert to the relevant upcoming information.

Let’s check it out!

Strike That, Reverse That, Thank You

I started by exploring prequestions; but we could think about the research I’m about to describe from the perspective of mind-wandering.

If you’ve ever taught, and ESPECIALLY if you’ve ever taught online, you know that students’ thoughts often drift away from the teacher’s topic to…well…cat memes, or a recent sports upset, or some romantic turmoil.

For obvious reasons, we teachers would LOVE to be able to reduce mind-wandering. (Check out this blog post for one approach.)

Here’s one idea: perhaps prequestions could reduce mind-wandering. That is: students might have their curiosity piqued — or their sense of duty highlighted — if they see how much stuff they don’t know.

Worth investigating, no?

Questions Answered

A research team — including some real heavy hitters! — explored these questions in a recent study.

Across two experiments, they had students watch a 26-minute video on a psychology topic (“signal detection theory”).

  • Some students answered “prequestions” at the beginning of the video.
  • Others answered those questions sprinkled throughout the video.
  • And some (the control group) solved unrelated algebra problems.

Once the researchers crunched all the numbers, they arrived at some helpful findings.

First: yes, prequestions reduced mind-wandering. More precisely, students who answered prequestions reported that they had given more of their attention to the video than those who solved the algebra problems.

Second: yes, prequestions promoted learning. Students who answered prequestions were likelier to answer correctly on a final test after the lecture than those who didn’t.

Important note: this benefit applied ONLY to the questions that students had seen before. The researchers also asked students new questions — ones that hadn’t appeared as prequestions. The prequestion group didn’t score any higher on those new questions than the control group did.

Third: no, the timing of the questions didn’t matter. Students benefitted from prequestions asked at the beginning as much as those sprinkled throughout.

From Lab to Classroom

So, what should teachers DO with this information?

I think the conclusions are mostly straightforward.

A: The evidence pool supporting prequestions is growing. We should use them strategically.

B: This study highlights their benefits in reducing mind-wandering, especially for online classes or videos.

C: We don’t need to worry about the timing. Whether we ask all prequestions up front or sprinkle them throughout the class, either strategy (according to this study) gets the job done.

D: If you’re interested in specific suggestions on using and understanding prequestions, check out this blog post.

A Final Note

Research is, of course, a highly technical business. For that reason, most psychology studies make for turgid reading.

While this one certainly has its share of jargon-heavy, data-laden sentences, its explanatory sections are unusually easy to read.

If you’d like to get a sense of how researchers think, check it out!


Pan, S. C., Sana, F., Schmitt, A. G., & Bjork, E. L. (2020). Pretesting reduces mind wandering and enhances learning during online lectures. Journal of Applied Research in Memory and Cognition, 9(4), 542-554.

Starting Class with “Prequestions”: Benefits, Problems, Solutions
Andrew Watson

We’ve known for many years now that retrieval practice works.


That is: after we have introduced students to a topic, we might REVIEW it with them the next day. However, they’ll remember it better if we ask them to try to RETRIEVE ideas and procedures about it.

As Dr. Pooja Agarwal and Patrice Bain write, we want students to “pull information out of their brains” (retrieve) not “put information back into their brains” (review).

Sadly, we know that students’ intuition contradicts this guidance. They really want to reread or review their notes, rather than ask themselves questions.

In this (very sad) study, for instance, Dr. Nate Kornell and Dr. Lisa Son found that students think review works better than retrieval even when they do better on quizzes following retrieval!

Yes, even the experience of learning more doesn’t persuade students that they learned more.

YIKES.

The More Things Change…

Let’s take this retrieval practice idea one step further.

I wrote above that answering questions helps students learn AFTER they have been introduced to a topic.

But: does answering questions help students learn a topic even BEFORE they study it?

On the one hand, this suggestion sounds very strange. Students can’t get these “prequestions” right, because they haven’t yet studied the topic.

On the other hand, we’ve got research showing that this strategy works!

In one of my favorite studies ever, Dr. Lindsay Richland found that “prequestions” help students learn. And, she then worked really hard to disprove her own findings. When she couldn’t explain away her conclusions, she finally accepted them. *

Similarly, a more recent study suggests that learning objectives framed as questions (“Where are mirror neurons located?”) help students learn more than LOs framed as statements (“You will learn where mirror neurons are located.”).

Although this prequestion strategy hasn’t been studied as much as retrieval practice, I do think it has enough research behind it to merit teachers’ respectful attention.

However, I do think this approach has a practical classroom problem…

Sustaining Motivation

For the most part, my high-school students are an amiable lot. If I ask them to do something … say, answer retrieval practice questions … they’ll give it a go.

And, they almost certainly want to get those questions right.

In a class discussion about Their Eyes Were Watching God, for instance, we might compare Janie’s three “husbands.” If I ask a student the following day to list some points of comparison from memory (retrieval practice!), they’ll feel that they ought to remember an answer or two.

Let’s try this logic with prequestioning.

Imagine I ask my students this prequestion: “Why do you think the novel’s protagonist will have the nickname ‘Alphabet’?”

My students will gamely try some answers.

However, I worry that – over time – they’ll start losing interest.

They almost never get these answers right.

And, there’s no “penalty” for getting them wrong, or reward for getting them right. (We don’t want students to focus on rewards and penalties, but schools typically work this way…)

From the student perspective, in other words, the whole prequestion strategy feels like an exercise in futility.

Why should they bother to think seriously about these unanswerable questions? They feel like wasted mental effort…

Two Solutions

First: I’ve tried in the past to solve this problem by using the strategy infrequently.

If my students don’t experience this quirky frustration too often, I hope, they won’t mind participating in this odd ritual.

Recent research, however, offers a second solution – a more honorable solution than mine.

In this study, by Dr. Steven Pan and Dr. Michelle Rivers, prequestions consistently helped students learn.

However, students didn’t really notice the benefit of prequestions – even when they learned more from answering them. (This result sounds a lot like the Kornell and Son study about retrieval practice; students don’t register the benefits they experience.)

So, Pan and Rivers tried several solutions. Specifically, they found benefits to a multi-step approach:

Step 1: have students learn some info with prequestions, and some without.

Step 2: give them a no-stakes quiz on the info.

Step 3: let them see that they remembered information better after prequestions.

Step 4: next time, ask students to recall how well they remembered after answering prequestions.

In other words: students need to experience the benefits and to have them repeatedly pointed out. This combination, probably, helps students believe that prequestions really do help.

This insight (probably?) helps with the motivation problem that has troubled me in the past.

In other words: students who believe that prequestions will help are much likelier to participate in the curious mental exercise of trying to answer questions whose answer they can’t yet know.

TL;DR

When students answer questions about information they’re about to learn, they remember that information better – even if they get the answers wrong.

This strategy might be effective in the short term, but hamper motivation over time. After all, why should students even try to answer questions if they’re unlikely to know the answer?

To counteract this motivational problem, take students through Pan & Rivers’s procedure for them to experience and remember the benefits that prequestions provide.

We don’t have LOTS of research on this strategy, but we do have enough to make it a plausible approach.


* Sadly, the “prequestion” strategy has frequently been called “pretesting.” Of course, the presence of the stem “test” both confuses the strategy (there’s no testing!) and discourages people from participating (who wants more testing?).

So, let me emphasize: “prequestions” are simply questions. They’re not a test.

BTW: I’ve recently seen the word “pretrieval” as a way to avoid the “pretest” moniker. You might like it better than “prequestions.”


Agarwal, P. K., & Bain, P. M. (2019). Powerful teaching: Unleash the science of learning. John Wiley & Sons.

Kornell, N., & Son, L. K. (2009). Learners’ choices and beliefs about self-testing. Memory, 17(5), 493-501.

Pan, S. C., & Rivers, M. L. (2023). Metacognitive awareness of the pretesting effect improves with self-regulation support. Memory & Cognition, 1-20.

Richland, L. E., Kornell, N., & Kao, L. S. (2009). The pretesting effect: Do unsuccessful retrieval attempts enhance learning? Journal of Experimental Psychology: Applied, 15(3), 243.

Sana, F., Forrin, N. D., Sharma, M., Dubljevic, T., Ho, P., Jalil, E., & Kim, J. A. (2020). Optimizing the efficacy of learning objectives through pretests. CBE—Life Sciences Education, 19(3), ar43.