Beyond Retrieval Practice: The Benefits of Student-Generated Questions
Andrew Watson

Retrieval Practice has gotten a lot of press in recent years — especially at our conference last fall on Deeper Learning.

The short version: students don’t benefit much from simple review — say, rereading a passage. But, they benefit a lot from actively trying to recall information — say, answering questions about that passage.

Dr. Pooja Agarwal puts it this way: Students should practice not by trying to put information into their brains, but by trying to take information out.

(She and Patrice Bain have written a great book on the topic: Powerful Teaching.)

We have LOTS of research showing that retrieval practice yields great benefits. Can other strategies match it?

Here’s an idea: maybe instead of having students answer questions (retrieval practice), we should have them create questions to be answered. Just perhaps, generating questions might boost learning more than simple review. Or — let’s get crazy: maybe generating questions boosts learning as much as retrieval practice? Even more?

Generating Research

Over the years, the “generation effect” has been studied occasionally — alas, not as much as retrieval practice. Often, research in this area includes a training session where students learn how to ask good questions. That step makes sense … but it might discourage teachers from adopting this strategy. Who has the time?

Researchers in Germany had three groups of college students read slides from a lecture about infant developmental psychology.

The first group practiced the information by rereading it. Specifically, they were instructed to memorize the content of those slides.

Group two practiced by answering questions on each slide. If they couldn’t remember an answer, they were allowed to go back and review the slide. In effect, this was “open-book retrieval practice.”

In group three,

“students were instructed to formulate one exam question in an open response format for the content of each slide [,] and also to provide an answer to that question.”

That is: they generated questions.

So, here’s the big question: when they took a surprise quiz, how did students in each group do?

Drum Roll Please…

First: Students who generated questions scored ~10% higher on that surprise quiz than those who tried to memorize information.

Second: Students who generated questions did as well as those who used retrieval practice.

Third: Questioners got these benefits even without explicit training in how to ask good questions.

Fourth: Question generators (and retrieval practicers) scored higher than mere reviewers on both factual questions and transfer questions.

Fifth: Researchers got these impressive results even though the surprise quiz took place one week later. (In research like this, those quizzes often happen right away. Of course, a week’s delay looks a lot more like genuine learning.)

We could hardly ask for better results than these. In this research paradigm, question generation worked as well as retrieval practice — which works better than almost anything else we know of to help students learn.

Explaining Amazing Results

Why would this be? Why does generating questions help students as much as answering them?

This study doesn’t answer that question directly, but it suggests a rough-n-ready answer.

Both common sense and lots o’ research tell us: students learn more when they think hard about something. (Obvi.)

If we increase the challenge of the thinking task, we prompt students to think harder and therefore to learn better.

Psychologists talk about “desirable difficulties”: a level of mental challenge that forces students to work their synapses but doesn’t overtax them.

In this case, we can reasonably hypothesize that students who must create a question on a topic have to think hard about it. To come up with a good question, they have to think at least as hard as students answering questions on that topic.

And, they have to think considerably harder than students who simply reread a passage.

Voila! Generating questions helps students learn.

A Few Caveats

As always, research provides teachers with helpful guidance. But: we need to adapt it to our own circumstances.

First: this study took place with college students. We should take care that our students can — in fact — come up with good questions.

For instance, I’m a high-school English teacher. I would use this technique with Their Eyes Were Watching God or Passing or Sula. But I don’t think I’d use it with The Scarlet Letter or Hamlet. My students struggle to understand the basics with those texts; I’m not sure they’d do a good job coming up with resonant exam questions.

More precisely: I’d structure those assignments quite differently. I suspect I could be open-ended with an assignment to create Passing questions, but would offer a lot more guidance for Scarlet Letter questions.

Second: yes, this study found that retrieval practice and question generation resulted in additional learning. And, we have a reasonable hypothesis about why that might be so.

But, we have MUCH more research about retrieval practice. Before we invest too heavily in question generation, we should keep our eyes peeled for more studies.

Third: In this paradigm, trying to memorize resulted in less learning. However, we shouldn’t conclude that students should never try to memorize. At times, “overlearning” is essential for reducing working memory load — which facilitates learning.

As long as we keep these caveats in mind, we can be excited about trying out a new review technique.

And: this can work in online settings as well!

An Exciting Event In Mindfulness Research [Repost]
Andrew Watson

I’ve been reviewing old posts, looking for information that might be particularly helpful in today’s strange times.

This post — from September — gives us greater confidence that mindfulness helps reduce stress.

It’s particularly persuasive research because it studies both mental behavior (psychology) and neural behavior (neuroscience) at the same time.

And, we could all use a little stress reduction today…


Let’s imagine a GREAT study on the benefits of mindfulness.

As school people, we’re happy that mindfulness might be helpful at home or at work, but we really want it to be helpful to students. So, we’d love for this study to take place at school.

We’d like the study to show that mindfulness changes mental processes. For instance, we’d love to know that it helps students feel less stress.

And, we’d like the research to look at brains as well as minds. That is: we’d like to have some fMRI data showing relevant changes in brain regions.

At the same time that students report they feel less stress (that’s the mind), we might see neural modulation typical of less stress (that’s the brain).

Finally, the study’s methodology would hold up to scrutiny. It would, for instance, include a plausible control group. (I’ve written about problems with control groups, including this study about mindfulness.)

Lo and Behold

Sure enough, this study exists!

Working with 6th graders at a school outside Boston, Clemens Bauer randomly assigned half to a mindfulness program and half to a coding training program.

Both groups devoted 45 minutes, four times a week, to this effort for 8 weeks. And, by the way, students in both groups enjoyed this time equally. (So: AT LAST we’ve got a plausible and active control group.)

Bauer’s team had students fill out a stress survey before and after this 8-week stretch. (Sample question: “In the last month, how often have you been upset because of something that happened unexpectedly?”)

And, he performed fMRI scans on them before and after as well.

When looking at those scans, Bauer’s team had a specific prediction. High stress responses typically include elevated amygdala activation. Often, we can manage that stress response by using the prefrontal cortex, the part of the brain right behind your forehead.

If mindfulness helps manage stress, we would expect to see…

…greater connectivity between the prefrontal cortex and the amygdala, and

…concomitantly reduced activity in the amygdala.

That is, we’d be able to see that mindfulness strengthened connections between self-control systems in the prefrontal cortex. In turn, this increase in self-control would help mitigate stress responses in the amygdala.

Of course, I’m offering a very simplified version of a fantastically complex neural story. Books have been written on these connections, and it’s not a blog-friendly kind of information.

Results, Please

If you’re a fan of mindfulness, you’re going to LOVE these results.

Students who practiced mindfulness reported less stress than those in the control group.

They showed higher levels of prefrontal cortex connectivity with the amygdala.

They showed lower levels of amygdala activity when they looked at angry faces.

So: both in their mental activity (reported stress level) and in the neural activity (in the amygdala, between the amygdala and the prefrontal cortex), eight weeks of mindfulness led to beneficial results for these students.

Technically speaking, that’s a home run.

What’s Next

First: to repeat, this study is powerful and persuasive. We can simply revel in its conclusions for a while.

Second: as teachers, we’re glad that student stress levels are lower. The next question is: do students learn more? We can assume they do, but we should measure as well. (To be clear: I think lower stress is an important goal on its own, whether or not it leads to more learning.)

Third: as the study’s authors acknowledge, the sample size here is relatively small. I hope they get funding to repeat it on a much larger scale.

As noted in this study, there’s a disappointing history in the world of mindfulness research. Small studies — often lacking random assignment or a control group — come to promising conclusions. But, the bigger the study — and the better the methodology — the smaller the results.

So: now that we’ve gotten strong effects with a randomized study and a plausible control group, I hope to see these same results at a much larger scale.

I might go sit quietly for a while, and try to clear my mind of extraneous thoughts.

Dr. Kurt Fischer: A Tribute
Andrew Watson

Professor Kurt Fischer changed my professional life. If you’re reading this blog, odds are good he helped change yours as well.

Throughout most of the 20th century, teachers, psychologists, and neuroscientists had little to say to one another.

Even psychology and neuroscience — two fields that might seem to have many interests in common — eyed each other suspiciously for decades. Certainly teachers weren’t a welcome part of any wary conversation that might take place.

As we all know, and Dr. Fischer helped us see, these fields have so much to learn from each other.

Today’s growing consensus that these disciplines — and several others — should be in constant conversation results in large measure from his insight, effort, generosity, and wisdom. So: he’s changed our lives, and greatly benefited our students.

Since I heard of his death, I’ve been thinking how Dr. Fischer’s great skill was to keep the bigger picture in mind. He did so in at least two essential ways.

Creating Interdisciplinary Institutions

Academic disciplines exist for good reasons. And yet — despite all the good that they do — they can create barriers and restrict conversations.

To foster inter-disciplinary and multi-disciplinary conversations, Dr. Fischer knew we needed institutional systems. In our field, he helped start all the essential ones.

He helped create the Mind, Brain, and Education strand at Harvard’s Graduate School of Education. It was, I believe, the first such program in the world.

He helped found the International Mind Brain Education Society (Imbes.org), which works “to facilitate cross-cultural collaboration in biology, education and the cognitive and developmental sciences.”

He helped found the Mind Brain Education Journal, which publishes vital interdisciplinary research.

And, of course, he helped organize the very first Learning and the Brain conference — to ensure that these conversations took place not simply in academic institutions, but with classroom teachers as well.

In starting all these institutions and starting all these conversations, Dr. Fischer created a generation of leaders — those who now champion the work we do every day.

That’s the bigger picture he could see from the beginning.

Understanding Brains in Context

Dr. Fischer saw the bigger picture in his teaching life as well.

As part of his work at Harvard’s School of Education, he taught a course on “Cognitive Development, Education, & the Brain.”

Over those weeks, he returned frequently to an especially damaging fallacy, which he called “brain in a bucket.”

That is, he wanted his students not to think about individual brains operating in some disembodied ether. Instead, he wanted us to think constantly about context:

How does the brain interact with the body?

In what ways is it shaped by development?

How do family interactions shape self? Social interactions? Cultural interactions?

How should we think about hormones, and about ethics, and about evolution, and about genetics?

In other words: neuroscience teaches us a lot about brains. But we should always think about the bigger picture within which that brain functions, and about the forces that created it in the first place.

Never focus on “a brain in a bucket,” because that brain makes no sense without the context that surrounds and shapes it.

In Conclusion

So for me, that’s Dr. Fischer’s legacy. He helped create the context that shaped so many of our brains:

Graduate programs in Mind, Brain, Education,

Learning and the Brain conferences (55 and going strong),

Professional associations and journals,

The scholars and conversations that inspire teachers and improve teaching.

The world is better because he lived, and a poorer place now that he’s gone. Happily for us, he left great wisdom and greater understanding behind.

Pure Inquiry, Guided Inquiry, and PISA
Andrew Watson

Because scientists work by inquiring, it makes rough-n-ready sense that we should teach science through the process of inquiry. Indeed, “inquiry-based learning,” like “problem-based” and “project-based” learning, has emphasized students’ construction of their own understanding.

According to a well-known definition, this pedagogy focuses on students…

… asking questions,

… planning and conducting investigations,

… using appropriate tools and techniques to gather data,

… thinking critically and logically about relationships between evidence and explanations,

… constructing and analyzing alternative explanations,

And so forth.

Of course, we should also inquire: does inquiry-based learning in fact help students learn? This question leads to lots of controversy…

Many Methods

We can explore that question in several ways.

We might, for instance, have one group of students learn a topic through inquiry learning, and a control group learn it through direct instruction. When we test them later, we’ll get a good sense of who learned the material better.

That method — if we do everything right — gives us a clear answer.

But: it focuses on a small group of people learning only one thing. Who knows if that clear answer applies in other circumstances?

Or, we might look at large groups of people who studied many things. If we can find out what method their teachers used, and measure how well they learned, we’ve got another useful strategy for answering our question.

Of course, we’ll be less certain about the quality of the teaching than in the highly-controlled environment. Who knows if the inquiry-based teaching was, in fact, well done?

Following this second approach, researchers in the UK looked at PISA data (PISA = Programme for International Student Assessment), and aligned it with high-stakes testing scores in England: the GCSE (General Certificate of Secondary Education).

The PISA data help here because students report how much time they spent in various inquiry-learning practices: “every lesson, most lessons, some lessons, never.” For instance, students rate how often they are “allowed to design their own experiments.”

So: by linking PISA data about teaching practices with GCSE scores, those researchers can draw some conclusions about the effectiveness of inquiry learning.
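The linking step described above can be sketched in a few lines of pandas. To be clear: this is a hypothetical illustration, not the researchers’ actual analysis — the real study’s variables, coding scheme, and statistical models are not given in this post, so every column name and value below is invented.

```python
import pandas as pd

# Hypothetical student-level survey responses: how often each student
# experienced an inquiry-learning practice (0 = never ... 3 = every lesson).
pisa = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "inquiry_frequency": [0, 1, 2, 3],
})

# The same students' (invented) science exam scores.
gcse = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "science_score": [55, 62, 60, 48],
})

# Link the two datasets on the student identifier...
linked = pd.merge(pisa, gcse, on="student_id")

# ...then compare average attainment across inquiry-frequency levels.
by_inquiry = linked.groupby("inquiry_frequency")["science_score"].mean()
print(by_inquiry)
```

The real analysis would of course involve thousands of students and controls for confounding variables; the sketch only shows the basic link-then-compare logic.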

What did they find?

Negotiating Our Own Biases

Before I answer that question, let’s acknowledge a problem.

Many teachers already have opinions — strong opinions — about inquiry learning.

Those opinions bias our responses to new information.

If (for example), I don’t think inquiry learning works, and this research shows that it does, I’m inclined to dismiss the study.

“Well, look at all of these methodological problems!”

Yet (the contrary example) if I’m an inquiry-learning champion, research showing its effectiveness automatically seems wise and well-designed.

“Gosh: it’s hard to think of a fairer way to answer this question! Now we KNOW it works…”

So, here’s my suggestion: decide right now — before you know what the researchers concluded — whether or not you’re confident in this study design.

All study designs have some flaws. This one, for instance, relies on student self-report. And, as noted above, it doesn’t have any way to control for the quality of inquiry-learning practices.

You might reasonably conclude those flaws are too great. In that case, you don’t get to cite this study even if it ends up confirming your beliefs.

Or, you might reasonably conclude that — on average — errors in self-report and quality control will balance themselves out; and this research method gives a big-enough data pool to draw meaningful conclusions. In that case, you have to pay attention to the study even if it contradicts your beliefs.

So: push yourself to decide now…

The Envelope Please

Now that you’ve decided to give credence to this methodology (or not to do so), here’s what they found.

“In summary, neither high inquiry with low guidance, nor high guidance with low inquiry are related to improved science attainment.” (emphasis added)

That is: pure inquiry learning doesn’t result in more learning than plain-old explanation. And (as Professor Christian Bokhove notes), it doesn’t result in less learning either.

But:

“There is, however, some tentative evidence that moderate levels of inquiry delivered in conjunction with high guidance may have a small positive impact upon science achievement.” (emphasis added)

That is: lots of plain-old guidance PLUS a moderate amount of inquiry gives a little boost.

By the way: what do the researchers count as “guidance”? Here, they’re looking at PISA questions about teachers’ suggestions for improving performance, meeting learning goals, extra help, and so forth.

Teaching Implications

As I noted above, the “direct instruction vs. inquiry” debate generates A LOT of passion.

I think this study invites us to step back and get some perspective. It seems that — at least in this research paradigm — a healthy dose of teacher explanation and support helps students learn. And, some degree of independent inquiry enhances that result. (“Small positive impact.”)

But: inquiry learning neither yields a whole new level of scientific understanding, nor prevents students from learning much at all.

I suspect that, ultimately, we need to ask more granular questions.

Does inquiry give a greater lift in later grades than earlier ones? Perhaps it helps when scientific findings confirm our prior knowledge, but not when they contradict it? Does the teacher’s experience level matter?

Research into these questions might give us specific and practical classroom guidance.

The Neuroscience of Retrieval Practice
Andrew Watson

What’s the best way for students to practice? Should they review information or procedures? Or, should they try to remember or enact them?

We’ve got scads of research showing that retrieval practice helps brains learn.

That is: if I want to learn the definition of a word I’ve studied, I should try to recall it before I look it up again. (For a handy review, check out RetrievalPractice.org.)

So, we know that retrieval practice works. But: why? What’s happening in the brain that makes it work?

Two Possibilities

We’ve got several possible answers, but let’s focus conceptually on two of them.

Increased neural connections

Reduced neural connections

That is: when I engage in retrieval practice, I push myself to remember X. But it takes me a while to get to X. I might start with S, and then wonder about Y. Perhaps I’ll take a detour to gamma. Eventually, I figure out X.

During this mental work, I both remember X and connect X to all those other (rejected) possibilities: S and Y and gamma. By increasing connections among all these topics, I make it easier to remember X later on. If I accidentally think about S, I can quickly get to X.

Or, maybe the opposite process happens.

The first time I try to remember X, I waste mental time with S and gamma. But, the next time, I’ve gotten better at remembering X, and so I take less time to get there. I can “prune away” extraneous mental connections and thereby simplify the remembering process.

In this account, by reducing the steps involved in remembering X, I see the benefits of retrieval practice.

We Have a Winner (?)

A research team in Europe took on this question, and looked at several studies in this field.

Whenever you start looking at neuroscience research, you should brace yourself for complexity. And, this research is no exception. It’s REALLY complicated.

The short version goes like this. Van den Broek and colleagues identify several brain regions associated with memory formation and retrieval. You might have heard of the angular gyrus. You might not have heard of the inferior parietal lobe. Anyway, they’ve got a list of plausible areas to study.

They then asked: did retrieval practice produce more activity in those regions (compared to review)? If yes, that finding would support the “increased connection” hypothesis.

Or, did retrieval practice result in less activity in those regions? That finding would support the “reduced connection” hypothesis.

The answer? Less activity. At least in the studies van den Broek’s team analyzed, the “reduced connection” hypothesis makes better predictions than the “increased connection” hypothesis.

To be clear: I’ve left out a few other explanations they consider. And: I’ve simplified this answer a bit. If you’re intrigued, I encourage you to look at the underlying review: it’s FASCINATING.

To Sum Up

We have at least a tentative idea about why retrieval practice works.

And: we have SUPER PERSUASIVE evidence that retrieval practice works.

Even though we’re not 100% sure about the why, we should — as teachers — give our students as many opportunities as we can to retrieve.

Beyond “Tricks-n-Tips”: What does Cog Sci Tell Us About Online Learning?
Andrew Watson

In our early scramble to get teaching online, it’s easy to focus on the immediately practical: how to auto-mute on Zoom, how to use Dropbox links, how to find the best online resources.

This emphasis on tricks and tips makes good sense in the short term.

Once we’ve gotten a few days’ experience in this new teaching world, we can take a mental step back and ask about the bigger learning picture.

What can cognitive science tell us about teaching and learning online?

As is so often the case, the answer to that question boils down to these words: “don’t just do this thing. Instead, think this way.”

In other words: research can give lots of very specific advice. But it’s probably most useful when it suggests broad, flexible principles that teachers can adapt to our own specific circumstances.

One Place to Start

Regular readers know that working memory is essential for learning. It allows us to hold and combine ideas, bits of information, mental processes, and so forth.

When we successfully hold and combine — and practice doing so the right way — that’s when learning happens.

Alas, we don’t have much working memory.

This CRUCIAL bottleneck dooms many worthy teaching endeavors. But, if we manage it well, we show real expertise in our craft.

So, if the question is:

“what can cognitive science tell us about online learning?”

one answer is:

“As much as we can, we should recognize and mitigate the working memory demands of this new learning world.”

In other words: students are using working memory not only to learn our content, but also to manage the novel physical and mental space in which this learning should happen. As much as feasible, we should help.

A Simple Example

Over on Twitter, I’ve been following David Weston (@informed_edu) to get practical information about online teaching. (Some of those “tricks and tips.”)

For instance, he recently posted a video showing how teachers can share a PowerPoint presentation over Zoom.

For some of us, that’s immensely helpful information.

At the same time — depending on your prior knowledge — this video might require lots of heavy lifting in working memory.

You’ve got to use ALT+TAB (if you’re using a PC) or COMMAND+TAB (if you’re using a MAC). You’ve got to navigate one arrangement of buttons for PowerPoint, and a quite different arrangement for Zoom. You’ve got to determine whether or not you have to switch back-n-forth during the presentation to advance the PowerPoint slides.

If you know from PowerPoint and Zoom, then this combination of steps is probably quite easy to manage.

If, however, you’re a newbie to either, then you might struggle to process all those steps effectively. You’ll probably have to rewatch parts of the video. You’ll probably make several mistakes. You’ll probably get frustrated before you finally figure it out.

And — here’s my key point — our students are probably experiencing similar frustrations. They’re figuring out new systems. They’re adapting old learning models to new (bizarre) circumstances.

All that working memory stress comes on top of the working memory stress that learning always requires.

And so my advice is not “do this thing” (“Here’s how you can solve this problem…”). Instead, cognitive science encourages us to “think this way.”

We should develop the new mental habit of asking: how does this particular learning arrangement increase working memory load for me and my students? And, what can I do to fix the problem?

Two Important Points

First: almost certainly the solutions to the working memory problems will be…

… choose to slow down and practice the new/unfamiliar steps,

… use your teacherly instincts,

… be patient with your students and yourself.

That advice isn’t super specific. But: it’s really flexible. And, given what we know about working memory, it really will help.

Second: I’ve used Weston’s video as an example of potential working memory overload NOT because it’s badly done. Instead, Weston has created a video that will help most people; and, it will help even more if we pause to recognize its working memory demands.

That is: if technology just isn’t your thing, if you’ve never Zoomed before, if you’re not sure whether you have a PC or a Mac, assume that you’ll need to reduce working memory demands in one part of your teaching world to create some working memory headroom to deal with the technology.

That’s hard to do. But: it’s MUCH easier to do if we proactively think this way than if we try to solve working memory problems as they occur.

Cognitive science tells us that our brains work that way. We can use that knowledge to make online teaching and learning the best it can be.

Beyond the Mouse: Pointing in Online Learning [Repost]
Andrew Watson

As teachers across the country prepare to move our work online, I’ve been looking over previous posts that might offer practical guidance.

This post — from July of last year — asks a simple question: in online teaching, does pointing matter?

Happily, research by Richard Mayer points us in a useful direction.


You know, of course, that the right kind of movement can help students learn. The nascent field of “embodied cognition” works to explore the strategies that work most effectively.

Here’s a collection of resources.

And, here’s a recent blog post about kindergarteners moving to learn the number line.

You also know that online learners easily get distracted, often because they multitask. (I say “they” because you and I would never do such things.)

This recent post shows that even folding laundry — a harmless-seeming activity — reduces online learning.

What happens when we put these two research pools together?

Specifically: can movement reduce distraction, and increase learning, for online learners?

Benefits of Online Pointing?

Several researchers — including the estimable Richard Mayer — wanted to answer that question.

Specifically, they wanted to know: do pointing gestures made by the teacher help online students learn?

They had students watch an online lecture (about “neural transmission,” naturally).

For the first group of students, the teacher pointed at specific places on relevant diagrams.

For the second group, the teacher pointed generally toward the diagrams (but not at specific parts of them).

For the third, the teacher moved his hands about, without pointing specifically.

For the fourth, the teacher didn’t move his hands.

Do different pointing strategies help or hurt?

Benefits Indeed

Sure enough, pointing matters.

Students in the first group spent more time looking at the relevant parts of the diagrams.

They did better on a test that day.

And — most important — they did better than the other groups on a test a week later.

Now: remembering for a week isn’t exactly learning. We want our students to remember facts and concepts for months. (Preferably, forever.)

But, the fact that the memories had lasted a week suggests it’s MUCH likelier they’ll last longer still.

Practical Implications

If your classroom life includes online teaching, or teaching with videos, try to include specific pointing gestures to focus students on relevant information. At least with this student population, such gestures really helped.

By the way, this study doesn’t answer an interesting and important question: “does student movement as they watch online lectures help or hurt their learning?”

We know from the study cited above that irrelevant movement (like folding laundry) doesn’t help. But: should students mirror your gestures as they watch videos? Should you give them particular gestures to emulate?

We don’t know yet…but I hope future research helps us find an answer.

What Do Teachers Get Right About Cognitive Science?
Andrew Watson

Here’s a chance to test your knowledge about the teaching implications of cognitive science. Which answer would you pick to this question?

After teaching students the names of the branches of the US government and what each does, which would be the most effective way a teacher could help their students remember this information?

A) Have students reread the facts at the beginning of class for 10 days.

B) Have students copy the facts into a notebook where they can reference them as needed.

C) Have students take a once-a-week quiz for 10 weeks where they recall the facts from memory.

D) Have students participate in a review game where they have to recall the facts from memory several times in one class period.

As you think about that question — which I’ll answer later in the post — ask yourself: what basic principle of learning informs your choice?

How Can We Discover What Teachers Know?

For several years now, Deans for Impact have worked to improve teacher education. In particular, they want schools of education to emphasize well-established principles from cognitive science.

They have done lots of great work to further this mission — including publishing this invaluable resource on the science of learning. (Quick: download it now!)

Of course, if they — and we — are going to help teachers improve, we have to know what teachers already believe and do. If teachers don’t believe in learning styles theory, then we don’t have to debunk it. (Alas, lots of teachers do.)

To answer that question, Deans for Impact developed a 54-question assessment of teacher beliefs, and administered it to 1000+ teachers in the fall of 2019. The question you answered above is one of those 54.

Based on the answers they got, they now have a much better idea of typical beliefs and misunderstandings. As they note, however, these teachers are enrolled in education schools that are interested in cognitive science. So:

“the data generated from this assessment is more likely to overstate what most teacher-candidates know about learning science.”

With that caveat in mind, what did they learn?

What Do Teachers Know about Cognitive Science?

Unsurprisingly, D4I found a mixed bag.

In some categories, teachers-in-training did quite well. In particular, they understood the importance of feedback loops, and the right ways to build them.

That’s really good news, of course, because feedback is so important.

In general, teachers also had a clear understanding that prior knowledge matters a lot. When students lack relevant background knowledge, they struggle mightily to learn.

Sadly, teachers overestimated the possibility of generic critical thinking.

Of course we want our students to have strong critical thinking skills. But, for the most part, those skills don’t exist generically. That is: I must have a great deal of specific content knowledge before I can think critically about a particular topic.

If that claim seems surprising or suspect, try to answer this question: are Dreiser’s novels more like Wharton’s or Dos Passos’s? Unless you know A LOT about Dreiser and Wharton and Dos Passos (and novels), you’ll struggle to have much to say.

Needs Improvement

Alarmingly, teachers-in-training scored only 33% on questions relating to “practicing with a purpose.” We learn almost everything by practicing in the right way, so this finding should encourage us to focus quite emphatically on this research field.

To do that, let’s return to the question at the top of this post. What kind of practice would help students remember information about branches of the US government?

A) Have students read the facts for 10 days at the beginning of class.

This choice spaces practice out. That’s good. But, it doesn’t allow for active recall. As we know from the world of retrieval practice, recall creates more lasting memories than mere review.

B) Have students copy the facts into a notebook where they can reference them as needed.

This choice is a dud. It requires students to do minimal processing (“copying”!), and to do it once. Nothing to see here. Move along.

C) Have students take a once-a-week quiz for 10 weeks where they recall the facts from memory.

Choice C requires recall (a quiz). And, it includes spacing (over 10 weeks!). Spacing + retrieval looks great!

D) Have students participate in a review game where they have to recall the facts from memory several times in one class period.

This option sounds fun — it’s a game! And, it includes active recall. But, alas, active recall combined with fun isn’t as beneficial as active recall combined with spacing.

So, we might be tempted by option D — in fact, 60% of teachers-in-training chose it. Only 13% opted for choice C: the one best supported by cognitive science. (By the way: if you’re interested in combining retrieval practice with spacing, check out this research.)

In Sum

Generally speaking: keep Deans for Impact on your radar. They’re a GREAT (and greatly reliable) resource for our work.

Specifically speaking: this most recent report lets us know where we should focus most urgently as we help teachers improve our profession.

Overcoming Potential Perils of Online Learning [Repost]
Andrew Watson

In June of 2019, I wrote about Dr. Rachael Blasiman’s research into the effect of typical distractions on online learning.

Given the current health climate, I thought her work might be especially helpful right now.

The key take-aways here:

First: (unsurprisingly) distractions interfere with online learning, and

Second: (crucially) we can do something about that.

In brief, we should start our online classes by teaching students how to learn online…

Here’s the post from June.


Online learning offers many tempting — almost irresistible — possibilities. Almost anyone can study almost anything from almost anywhere.

What’s not to love?

A tough-minded response to that optimistic question might be:

“Yes, anyone can study anything, but will they learn it?”

More precisely: “will they learn it roughly as well as they do in person?”

If the answer to that question is “no,” then it doesn’t really matter that they undertook all that study.

Rachael Blasiman and her team wanted to know if common at-home distractions interfere with online learning.

So: can I learn online while…

…watching a nature documentary?

…texting a friend?

…folding laundry?

…playing a video game?

…watching The Princess Bride?

Helpful Study, Helpful Answers

To answer this important and practical question, Blasiman’s team first had students watch an online lecture undistracted. They took a test on that lecture, to see how much they typically learn online with undivided attention.

Team Blasiman then had students watch 2 more online lectures, each one with a distractor present.

Some students had a casual conversation while watching. Others played a simple video game. And, yes, others watched a fencing scene from The Princess Bride.

Did these distractions influence their ability to learn?

On average, these distractions lowered test scores by 25 percentage points.

That is: undistracted students averaged an 87% on post-video quizzes. Distracted students averaged a 62%.

Conversation and The Princess Bride were most distracting (they lowered scores by roughly 30 points). The nature video was least distracting, but it still lowered scores by 15 points.
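The gap between “percentage points” and a relative “percent” drop is worth a quick check. Here is a minimal sketch using only the two averages reported above (the 87% and 62% figures are the study’s; everything else is just arithmetic):

```python
# Average quiz scores reported in the study (percent correct).
undistracted = 87.0
distracted = 62.0

# Absolute drop, measured in percentage points.
points_drop = undistracted - distracted  # 25.0 points

# Relative drop: how much smaller the distracted score is,
# as a fraction of the undistracted score.
relative_drop = points_drop / undistracted  # about 0.287

print(f"{points_drop} points; {relative_drop:.1%} relative drop")
# prints "25.0 points; 28.7% relative drop"
```

Either framing is defensible; the point is simply to be explicit about which one you mean when summarizing the study.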

In case you’re wondering: men and women were equally muddled by these distractions.

Teaching Implications

In this case, knowledge may well help us win the battle.

Blasiman & Co. sensibly recommend that teachers share this study with their students, to emphasize the importance of working in a distraction-free environment.

And, they encourage students to make concrete plans to create — and to work in — those environments.

(This post, on “implementation intentions,” offers highly effective ways to encourage students to do so.)

I also think it’s helpful to think about this study in reverse. The BAD news is that distractions clearly hinder learning.

The GOOD news: in a distraction-free environment, students can indeed start to learn a good deal of information.

(Researchers didn’t measure how much they remembered a week or a month later, so we don’t know for sure. But: we’ve got confidence they had some initial success in encoding information.)

In other words: online classes might not be a panacea. But, under the right conditions, they might indeed benefit students who would not otherwise have an opportunity to learn.


I’ve just learned that both of Dr. Blasiman’s co-authors on this study were undergraduates at the time they did the work. That’s quite unusual in research world, and very admirable! [6-11-19]

Does Teaching HANDWRITING Help Students READ?
Andrew Watson

I recently saw a newspaper headline suggesting that teaching students HANDWRITING ultimately improves their READING ability.

As an English teacher, I was intrigued by that claim.

As a skeptic, I was … well … skeptical.

In this case, we have two good reasons to be skeptical. First, we should always be skeptical. Second, claims of transfer rarely hold up.

What is “transfer”?

Well, if you teach me calculus, then it’s likely I’ll get better at calculus. If you teach me to play the violin, it’s likely I’ll get better at playing the violin. But: if you teach me to play the violin, it’s NOT likely that this skill will transfer to another skill — like calculus. (And, no: music training in youth doesn’t reliably improve math ability later in life.)

In fact, most claims of transfer — “teaching you X makes you better at distantly-related-thing A” — end up being untrue.

So, is it true — as this newspaper headline implied — that handwriting skills transfer to reading skills?

The Research

This newspaper article pointed to research by Dr. Anabela Malpique, working in Western Australia.

Her research team worked with 154 six- to seven-year-olds around Perth. They measured all sorts of variables, including…

…the students’ handwriting automaticity (how well they can write individual letters),

…their reading skills (how accurately they read individual words),

…the amount of time the teachers reported spending on reading/writing instruction.

And, they measured handwriting automaticity and reading skills at the beginning and end of the year. For that reason, they could look for relationships among their variables over time. (As you can see, Malpique’s research focuses on many topics — not just the writing/reading question that I’m discussing in this post.)

Tentative Conclusions

To their surprise, Malpique’s team found that more fluent letter formation at the beginning of the year predicted more fluent word reading at the end of the year. In their words, this finding

suggest[s] that being able to write letters quickly and effortlessly in kindergarten facilitates pre-reading and decoding skills one year later.

In other words: this research allows the possibility that teaching writing does ultimately help students read single words.

However — and this is a big however — the researchers’ methodology does NOT allow for causal conclusions. They see a mathematical relationship between two measures, but they cannot say that writing ability caused the later reading ability.

They warn:

Experimental research is needed to confirm these findings[,] and systematically evaluate potential explanatory mechanism[s] of writing-to-reading effects over time in the early years.

They specifically note that they did NOT measure reading comprehension; they measured single word reading.

To put this in other words: we would like to know if

a) teaching letter writing leads to

b) improved letter writing fluency, which leads to

c) improved single word reading, which leads to

d) improved reading comprehension.

These findings make the b) to c) connection more plausible, but they certainly do not “prove” that a) leads to d).

Classroom Implications

This research doesn’t claim we should make big changes right away.

I do think it leads to this conclusion:

Some schools are replacing books with computers and tablets. I can imagine (although I haven’t heard this) that advocates might make this claim:

“In the future, no one will need to write by hand. Everything will be keyboarding, and so we need to get children typing as soon as possible. Let’s replace handwriting instruction with keyboarding instruction, to prepare our kids for the future!”

If we hear that argument, we can say:

“I have LOTS of objections to that logical chain. In particular, we have tentative reasons to believe that handwriting instruction improves reading. If that’s true — and we don’t yet know — we should be VERY wary of doing anything that slows our students’ ability to read. We might not be handwriting so much in the future, but we’ll be reading forever.”

In sum: I don’t think that newspaper article captured essential nuances. However, this research raises the intriguing possibility that transfer just might take place from writing instruction to single-word reading. We need more research to know with greater certainty.

But, given the importance of reading for school and life, we should be excited to find anything that can help students do better.