Getting the Order Just Right: When to “Generate,” When to “Retrieve”?
Andrew Watson

When teachers get advice from psychology and neuroscience, we start by getting individual bits of guidance. For instance…

… mindfulness meditation reduces stress, or

… growth mindset strategies (done the right way) can produce modest benefits, or

… cell phones both distract students and reduce working memory.

Each single suggestion has its uses. We can weave them, one at a time, into our teaching practices.

After a while, we start asking broader questions: how can we best combine all those individual advice bits?

For instance: might the benefits of growth mindset strategies offset the detriments of cell phones?

Happily, in recent years, researchers have started to explore these combination questions.

Retrieval Practice, Generative Learning

Long-time readers know about the benefits of retrieval practice. Rather than simply review material, students benefit when they actively try to recall it first.

So too, generative learning strategies have lots of good research behind them. When students have to select, organize, and integrate information on their own, this mental exercise leads to greater learning. (Check out a handy book review here.)

Now that we have those individual bits of guidance, can we put them together? What’s the best way to combine retrieval practice with generative learning?

A recent study explored exactly this question.

Researchers in Germany had college students study definitions of 8 terms in the field of “social attribution.”

So, for instance, they studied one-sentence definitions of “social norms” or “distinctiveness” or “self-serving bias.”

One group — the control group — simply studied these definitions twice.

A second group FIRST reviewed these words with retrieval practice, and THEN generated examples for these concepts (that’s generative learning).

A third group FIRST generated examples, and THEN used retrieval practice.

So, how well did these students remember the concepts — 5 minutes later, or one day later?

The Envelope Please

The researchers wanted to know: does the order (retrieval first? generation first?) matter?

The title of their study says it all: “Sequence Matters! Retrieval practice before generative learning is more effective than the reverse order.”

Both 5 minutes later and the next day, students who did retrieval practice first remembered more than those who came up with examples first (and, more than the control group).

For a variety of statistical reasons, I can’t describe how much better they did. That is: I can’t say “these students scored a B, and those scored a B-.” But they did “better enough” for statistical models to notice the difference.

And so, very tentatively, I think we teachers can plan lessons this way: first instruct, then have students practice with retrieval, then have them practice with generation.

Wait, Why “Tentatively”?

If the research shows that “retrieval first” helps students more than “generation first,” why am I being tentative?

Here’s why:

We can’t yet say that “research shows” the benefits of a retrieval-first strategy.

Instead, we can say that this one study with these German college students who learned definitions of words suggests that conclusion.

But: we need many more studies of this question before we can spot a clear pattern.

And: we’d like to see some 1st grade students in Los Angeles, some 8th grade students in Reykjavik, and some adult learners in Cairo before we start thinking of this conclusion as broadly applicable.

And: we’d like to see different kinds of retrieval practice, and different kinds of generative learning strategies, before we reach a firm conclusion.

After all, Garvin Brod has found that different generative learning strategies have different levels of effectiveness in various grades. (Check out this table from this study.)

To me, it seems entirely plausible that students need to retrieve ideas fluently before they can generate new ideas with them: hence, retrieval practice before generative learning.

But, “entirely plausible” isn’t a research-based justification. It’s a gut feeling. (In fact, for various reasons, the researchers had predicted the opposite finding.)

So, I think teachers should know about this study, and should include it in our thinking.

But, we shouldn’t think it’s an absolute conclusion. If our own students simply don’t learn well with this combination, we might think about switching up the order.

TL;DR

Students learn more from retrieval practice, and they learn more from generative learning strategies.

If we want to combine those individual strategies, we’ll (probably) help students more if we start with retrieval practice.

And: we should keep an eye out for future research that confirms — or complicates — this advice.


Roelle, J., Froese, L., Krebs, R., Obergassel, N., & Waldeyer, J. (2022). Sequence matters! Retrieval practice before generative learning is more effective than the reverse order. Learning and Instruction, 80, 101634.

Brod, G. (2020). Generative learning: Which strategies for what age? Educational Psychology Review, 1–24.

The Bruce Willis Method: Catching Up Post-Covid
Andrew Watson

In the third Die Hard movie, Bruce Willis and his unexpected partner Samuel L. Jackson need to get to Wall Street in a hurry. They commandeer a cab.

An experienced cab driver, Jackson suggests taking 9th Avenue south, but Willis insists on going through Central Park.

It turns out: he doesn’t mean taking the road that runs through Central Park, but driving through the park itself — across crowded lawns, through busy playgrounds, past famous fountains, down winding bike paths.

His desperate short-cut helps the team catch up.

In education these days, it seems that we need our very own Bruce Willis.

Because of Covid, our students are WAY BEHIND.

5th graders don’t know as much math as they used to. 2nd graders can’t read as well as they once could. 9th graders have lost even more social skills than 9th graders usually lose.

Because our students know less and can do less, we teachers want to help them CATCH UP.

And so we ask: what’s the educational analogue to driving through the park? How can we — like Bruce and Samuel — help our students learn faster?

Like lots of folks, I’ve been thinking about that question for a while now. I’ve got bad news, and worse news; and I’ve got good news.

The Bad News

The Bruce Willis Method does not exist in education.

We can’t “drive through the park.” We can’t, in other words, help students “learn the same amount, only faster.”

Here’s why I say so:

If we knew how to teach any faster, we would have been doing so already.

Seriously. Do you know any teacher who says, “I could have covered this curriculum in 10 weeks. But what the heck, I’m going to drag it out and take 12 or 13”?

I don’t. And I suspect you don’t either.

We have always been helping our students learn as best we could. If we knew better ways, we would have been using them.

Of course Willis can get through the park faster; it was a MOVIE! Alas, we can’t follow his example.

I am, in fact, quite worried about all the talk of “catching up.” In my mind, it creates two clear dangers:

First Danger:

If we try to catch up, we’ll probably — in one way or another — try to speed up. We will, for instance, explore a topic in 2 weeks instead of 3 weeks. We will combine 3 units into 1.

However, the decision to speed up necessarily means that students spend less time thinking about a particular topic.

As Dan Willingham has taught us: “memory is the residue of thought.” If students spend less time thinking about a topic, they will learn less about it.

The result: they won’t catch up. Instead, they will be further behind.

In other words: such efforts to help students recover from Covid learning muddle will — paradoxically — hinder their learning.

Second Danger:

If we believe that “catching up” is a realistic short-term possibility, we open ourselves up to inspiring-but-unfounded claims.

People who don’t work in schools will tell us that “you can’t solve problems with the same thinking that created those problems in the first place.”

Their claims might include words & phrases like “transformational” or “thinking outside the box” or “new paradigm” or “disrupt.”

These claims will almost certainly come with products to buy: new technology here, new textbooks there, new mantras yon.

They will sound uplifting and exciting and tempting and plausible.

But…

… any “research-based” claims will almost certainly extrapolate substantially beyond the research’s actual findings;

… these ideas won’t have been tested at scale in a realistic setting;

… such claims will defy core knowledge about cognitive architecture. (No, students can’t overcome working memory limitations simply because “they can look up everything on the internet.”)

In other words: because the goal (“catching up”) is so tempting, we might forget to be appropriately skeptical of inspiring claims (“your students can catch up if you only do THIS THING!”).

Now is the time to be more skeptical, not less skeptical, of dramatic claims.

The Good News

Despite all this gloomy news, I do think we have a very sensible and realistic option right in front of us.

I propose three steps for the beginning of the next school year.

Step 1: determine what our students already know.

In previous years, I could reasonably predict that my students knew this much grammar and this much about Shakespeare and this much about analyzing literature.

Well, they just don’t anymore. I need to start next year by finding out what they really do know. (Hint: it will almost certainly be less — maybe dramatically less — than students knew in the past.)

Step 2: plan a realistic curriculum building from that foundation.

If we meet our students where they are, they are much likelier to learn the new ideas and procedures we teach them.

In fact, they’re also likelier to strengthen and consolidate the foundation on which they’re building.

Yes, I might feel like my students are “behind.” But they’re behind an abstract standard.

As long as they’re making good progress in learning new ideas, facts, and procedures, they’re doing exactly the right cognitive work. They won’t catch up this year.

But if they make steady progress for several years, they’ll be well back on track.

Step 3: draw on the lessons of cognitive science.

In the paragraphs above, I’ve been highly skeptical of uplifting, simplistic quick-fix claims. (“If we revolutionize education with X, our students will learn calculus in 6th grade!”)

At the same time, I do think that teachers can make steady and coherent improvements in our work. When we understand the mental processes that lead to long-term memory formation, we can teach more effectively.

We should study…

… working memory function: the core mental bottleneck that both allows and impedes learning;

… the importance of desirable difficulties — spacing, interleaving, retrieval practice — in forming long-term memories;

… the sub-components of attention that add up to concentration and understanding;

… a realistic framework for understanding student motivation.

And so forth.

Understanding these topics will not “revolutionize education overnight.”

However, teachers who design lessons and plan syllabi with these insights in mind can in fact help their students consolidate ideas more effectively.

In other words: don’t follow Bruce Willis through the park.

Instead, we should learn how learning takes place in the brain. When our teaching is guided by that knowledge, our students have the best long-term chance of getting back on track.

Does a Teacher’s Enthusiasm Improve Learning?
Andrew Watson

Sometimes research confirms our prior beliefs.

Sometimes it contradicts those beliefs.

And sometimes, research adds nuance and insight to overly-broad generalizations.

Here’s the story:

Benefits of Enthusiasm

It seems too obvious to say that a teacher’s enthusiasm benefits learning. OF COURSE it would do that.

After all, what student wants a boring, unenthusiastic teacher?

But psychology is a science. We don’t just announce that our beliefs — even really obvious beliefs — are true.

Instead, we convert those beliefs into testable hypotheses. We run some experiments. We look at data.

IF the data from the experiment support the hypothesis, then we can start making (tentative) claims.

Once we start thinking scientifically about the effects of a teacher’s enthusiasm, we quickly run into difficult questions.

How, exactly, do we define “enthusiasm”?

One we’ve got a definition, how do we measure it?

What results are we looking for? Do we want enthusiasm to promote students’ attention? Their motivation? Their learning?

If we don’t have clear answers to those questions, we can’t proceed with a scientific answer. (We can, of course, have an answer based on personal experience. Those answers are important, but not the same thing as a scientific answer.)

Getting Specific

In a study published in 2021 — “Displayed enthusiasm attracts attention and improves recall” — several scholars took on those challenges directly.

They started by training teachers in behaviors that demonstrate high levels of enthusiasm (exuberant gestures, varied facial expression, excited & rapid speech, etc.) and low levels of enthusiasm (a few quiet gestures, fixed facial expression, vocal monotone).

Teachers then read two short passages to 4th and 5th grade public-school students. One passage was a story about a farmer; the other was a description of the habits and characteristics of dragonflies. (By the way: this distinction between the story and the description will turn out to be important.)

These passages together took about 3 minutes to read.

To measure the effect of high enthusiasm vs. low enthusiasm, researchers counted several variables, including…

… the number of seconds that students looked at the reader;

… the number of times that students smiled;

… and, the number of facts about the farmer story and dragonfly description that students recalled.

In other words: these researchers found ways to answer those scientific questions listed above. So far, so good.

Asking Tough Questions

At this point, we can ask some reasonable questions:

First, counting “number of seconds” seems like a basically plausible way of measuring attention. (We can quibble, and ask for other measures, and explain why that measure isn’t perfect, but it’s plausible on its face.)

However, I myself think that “counting smiles” seems unusually squishy for a research-based conclusion. Perhaps I’m being overly picky here, but “smiles” strike me as a highly amorphous unit of counting.

Second, the duration of the “enthusiasm” — all of 3 minutes — might not be a helpfully representative amount of time.

For instance: a teacher might delight students by telling jokes for a minute or two at the beginning of class. All that humor might get high ratings from students.

But: if that teacher keeps telling jokes, all that forced humor might get irritating after a while. So too, “high enthusiasm” might have one effect for 3 minutes and a very different effect after 30.

Third, the study measures how many facts students remember immediately after they heard the reading.

Of course, teachers don’t want students to remember just right away; we want them to remember for a long time. And the relationship between short-term and long-term memory gets really complicated.

Strategies that help immediate recall might not enhance long-term learning; Nick Soderstrom has the goods here.

Results?

So, what did the researchers find?

Any study that measures so many variables will produce LOTS of findings. Those findings aren’t easy to summarize.

The researchers summarize their findings in this sentence:

Our results confirm that displayed enthusiasm captures attention and that attention partially explains the positive effect of displayed enthusiasm on recall.

For the reasons listed above, I’m hesitant to accept that conclusion without several caveats. At a minimum, I wish it said “short-term recall.”

Even more important, I think this summary overlooks a crucial finding. Researchers found that “enthusiasm” enhanced short-term recall for the farmer story, but NOT for the dragonfly description.

This distinction leads to an important question: do you spend more time in your classroom telling (farmer-like) stories or providing (dragonfly-like) information and descriptions?

The answer to that question certainly varies from teacher to teacher, from grade to grade, from discipline to discipline, from culture to culture.

Even the most optimistic reading of this study suggests that high enthusiasm will help students remember the story, but not the information.

That’s an important distinction, one we should make clearly when offering advice to teachers.

The Bigger Picture

I myself have a hypothesis.

I suspect that a teacher’s consistent and genuine enthusiasm — not 3 minutes, not 3 weeks, but maybe 3 months or more — gradually creates a particular kind of classroom atmosphere.

That atmosphere — quietly, subtly, probably immeasurably — helps students appreciate the class work, the discipline, and the camaraderie/community.

For instance: a student recently described one of my colleagues this way: “Oh, Ms. So-and-So! She’s the ONLY reason I like English…” Knowing Ms. So-and-So’s enthusiasm for her subject, I can certainly understand why she would inspire a doubting high-school student.

And I suspect her enthusiasm ultimately means that this student learns more English.

As you can see, my hypothesis doesn’t stem from research. Heck: it’s so nebulous that I don’t think it could be researched.

In other words: do I think that a teacher’s enthusiasm ultimately enhances learning? I do. And: my belief springs not from research, but from experience and common sense. *


Moe, A., Frenzel, A. C., Au, L., & Taxer, J. L. (2021). Displayed enthusiasm attracts attention and improves recall. British Journal of Educational Psychology, 91(3), 911–927.


* To be clear: I haven’t done a comprehensive search for research on teacher enthusiasm. I did plug this study into ConnectedPapers.com, and quickly scanned the results. As far as I could tell from this very brief look, research in this field is pursuing lots of helpful and optimistic leads, but doesn’t yet have confident conclusions. If you know of persuasive research looking at this topic, I hope you’ll let me know!

When Analogies Go Wrong: The Benefits of Stress?
Andrew Watson

An amazing discovery becomes an inspiring analogy:

Researchers at Biosphere 2 noticed a bizarre series of events: their trees kept collapsing under their own weight.

Why on earth would trees collapse? It doesn’t happen outside Biosphere 2, so why would it happen inside?

And then the researchers figured it out. The BioSphere doesn’t have wind.

Trees react to the stress of wind by growing stronger. If they don’t get that beneficial stress, they can’t stand up when they become adult trees.

And here’s the heart-warming bit: that’s true for humans too.

As we grow and develop, we need some modest, reasonable stresses in our lives. Those small stressors make our emotional “tree trunks” strong, so we can manage the greater stresses of adult life.

I really want to make an uplifting poster right now — don’t you?

First Things First

This story that I’ve told begins with science: “Researchers at Biosphere 2…”

And so, when I read that story, I felt a small shudder of delight. I can use this story to explain to students — and parents, and teachers — the benefits of reasonable/modest stresses in their lives.

After all, it’s a GREAT story, and a great analogy.

Even better, I can share the research behind it. (That’s what I do for a living: share research with teachers, students, and parents.)

However, the website where I first read that story doesn’t link to any research.

Hmmm.

So, I started looking.

This trees-need-wind story (and its uplifting analogy) shows up frequently on the interwebs. In fact, I think I notice two waves — one around 2013, another around 2020.

But, exactly none of the articles included any scientific links — much less links supporting the claim.

Glimmers of Hope?

When I switched from Google to Google Scholar, I did find this brief report.

It appears in Science magazine — a highly reputable source — and includes this sentence:

The trunks and branches of large trees became brittle and prone to catastrophic and dangerous collapse.

So, have I found the scientific backing that this analogy was missing?

Alas, this sentence is but one part of a long catalogue of problems at Biosphere 2, as noted in that report:

Vines grew “exceptionally aggressive[ly].”

19 of 25 vertebrate species went extinct.

“All pollinators went extinct.”

CO2 levels, oxygen levels, temperature, and light exposure all went haywire.

And, NONE of these problems has much of anything to do with wind.

In fact, the word “wind” doesn’t appear in this brief article.

Simply put: as far as I can tell, the whole “wind makes trees stronger” story sounds great, but has no research backing — certainly not at Biosphere 2.

Some Conclusions

First: does wind help strengthen trees?

Maybe.

I’ve been reading about something called — believe it or not — “reaction wood.” You can read about it here.

Second: does manageable stress benefit people in the long run?

Sure.

Check out “Yerkes-Dodson”: the classic inverted-U finding that moderate stress can improve performance, while too much impairs it.

Third: should we use uplifting-but-false analogies to communicate important scientific truths?

As long as Learning and the Brain is here, heck no.

Handwritten Notes or Laptop Notes: A Skeptic Converted?
Andrew Watson

Here’s a practical question: should our students take notes by hand, or on laptops?

If we were confident that one strategy or the other produced more learning – factual learning, conceptual learning, ENDURING learning – then we could give our students straightforwardly useful advice.

Sadly, the research in this field has – in my opinion – produced unhelpful advice because it rests on an obviously flawed assumption.

Happily, Dr. Paul Penn (Twitter handle @Dr_Paul_Penn) recently pointed me to a study with several pertinent benefits.

First, the researchers worked with 10-year-olds, not with adults. Research with college students can be useful, but it might not always help K-12 teachers.

Second, the research took place in the students’ regular classroom, not in a psychology lab. This more realistic setting gives us greater confidence in the research’s applicability.

Third, students took notes in both a science class and in a history class. The disciplinary breadth makes its guidance more useful.

Finally, this study – for reasons that I’ll explain – makes the “obviously flawed assumption” go away.

In this post,

I’ll start by explaining the new study.

Then I’ll explain the initial study (with the “obvious flaw”).

Then I’ll explain how the new study – by accident – makes that flaw go away.

I’ll wrap up with the big picture.

The Black Death, and Beyond

Researchers Simon Horbury and Caroline Edmonds had ten-year-olds watch videos in their history and science classes.

The history videos focused on the Black Death. The science video explored cells.

Students took laptop notes in one class, and handwritten notes in the other.

Immediately after the videos, and then again a week later, students took a multiple-choice quiz. Questions covered both factual recall (“Where did the Black Death originate?”) and conceptual understanding (“Why were the wealthy less likely to be afflicted by the plague?”).

To be thorough, researchers even counted the number of words students wrote in their notes. (Believe it or not, this detail will turn out to be important at the end of this post.)

So, did it matter how students took notes?

Yup.

The study measures several variables, but the headline is: in both science and history, taking notes by hand improved learning – especially a week later.

The study includes lots of specifics — conceptual vs. factual, immediate test vs. week-later test — but that summary gets the job done.

Yes, this is a very small study (26 people at its biggest), so we shouldn’t think it’s the final word on the matter. But it offers good reason to believe that handwritten notes help.

Back to the Beginning

Like all research in this field, Horbury & Edmonds’s work rests atop a well-known study by Mueller and Oppenheimer, cleverly entitled “The Pen Is Mightier than the Keyboard.”

I’ve written about this study several times before, so I’ll be brief here.

Mueller and Oppenheimer had one group of college students take notes by hand, and another group take notes on a laptop. They found that two variables mattered for learning:

Variable #1: the number of words students wrote. Crudely put: more words in notes resulted in more learning.

This finding isn’t terribly surprising. More writing suggests more thinking; more thinking suggests more learning.

Variable #2: the degree to which students reworded the lecture. Students who put the lecture’s ideas into their own words learned more than those who simply took notes verbatim.

Again, this finding makes sense. If I simply copy down the lecturer’s ideas, I’m not thinking much. If I put them in my own words, well, now I’m thinking more.

So far, so good. No obvious flaws.

Now the study gets tricky.

The students who took handwritten notes wrote FEWER words (that’s bad), so they had to REWORD the lecture (that’s good).

The students who took laptop notes could write MORE words (that’s good), so they ended up copying the lecture VERBATIM (that’s bad).

Which pairing of good+bad is better?

In Mueller and Oppenheimer’s conclusion, handwritten notes resulted in more learning.

It’s okay to write fewer words, as long as you’re rewording as you go. Remember: more rewording = more thinking.

Obvious Flaw

I promised several paragraphs ago to point out the obvious flaw in the study. Here goes:

Mueller and Oppenheimer saw an obvious possibility: if we TRAIN laptop note takers to reword, then they’ll get BOTH benefits.

That is, students who take laptop notes correctly get the advantages of more words and more rewording.

So much thinking! So much learning!

So, the researchers ran the study again. This time they included a third group: laptop note takers who got instructions not to copy verbatim, but to reword as they went.

What happened?

Nothing. Even though they got those instructions, laptop note takers continued to copy verbatim. They still remembered less than their handwriting peers.

The Mueller and Oppenheimer study draws this conclusion: since students can’t be trained to take laptop notes correctly – and they tried! – handwritten notes are best.

WAIT JUST A SECOND. [Please mentally insert the sound of a record scratch here.]

The researchers told students – ONCE – to change a long-held habit (verbatim copying of notes). When students failed to do so, they concluded that students can’t ever change.

In my own experience, telling my students to do something once practically NEVER has much of an effect.

Students need practice. LOTS of practice. Practice and FEEDBACK. Lots of feedback.

Obviously.

In other words, I think the Mueller and Oppenheimer study contains a conspicuous failure in logic. We shouldn’t conclude that handwritten notes are better. We SHOULD conclude that we should teach students to take laptop notes and reword as they do so.

If they can learn to do so (of course they can!), then laptop notes will be better — because they allow more words AND rewording.

Mueller and Oppenheimer’s own data make that the most plausible conclusion.

Conflicting Messages

To review:

The Horbury & Edmonds study suggests that handwritten notes are better.

The Mueller and Oppenheimer study suggests (to me, at least) that laptop notes will be better – as long as students are correctly trained to reword notes as they go.

Which advice should we follow?

My answer comes back to that obscure detail I noted in parentheses.

Horbury and Edmonds, you may remember, counted the number of words students wrote. Unlike college students, who can type faster than they can write, 10-year-olds can’t.

They wrote basically the same number of words by hand as they did on the laptop.

Here’s the key point: as long as students write as fast as they type, the hypothetical advantage that I predict for college laptop note-takers simply won’t apply to younger students.

After all, laptop notes provide additional benefit only if students write more words. These younger typists don’t write more words.

Since handwritten notes produce more learning, let’s go with those!

Final Thoughts

In this post, I’ve considered two studies about note taking and laptops.

In truth, several studies explore this field. And, unsurprisingly, the results are a bit of a hodge-podge.

If you want a broader review of research in this field, check out this video from Dr. Paul Penn, who first pointed me to the Horbury and Edmonds study:

https://www.youtube.com/watch?v=TXLHxf__poE

Given the research we have, I DON’T think we can make emphatic, confident claims.

But, based on this study with 10-year-olds, I’m much more open to the possibility that handwritten notes are — at least in younger grades — the way to go.


Horbury, S. R., & Edmonds, C. J. (2021). Taking class notes by hand compared to typing: Effects on children’s recall and understanding. Journal of Research in Childhood Education, 35(1), 55–67.

Mueller, P. A., & Oppenheimer, D. M. (2014). The pen is mightier than the keyboard: Advantages of longhand over laptop note taking. Psychological Science, 25(6), 1159–1168.

Too Good to Be True? “Even Short Nature Walks Improve Cognition”?
Andrew Watson

Good news makes me nervous.

More precisely: if I want to believe a research finding, I become very suspicious of it. After all: it’s easy to fool me when I want to be fooled.

Specifically: I’m an outdoors guy. I’ve worked at summer camps for ages, and love a good walk in the forests around Walden Pond.

So, when I read research showing that even a brief nature walk produces cognitive benefits, I’m both VERY EXCITED and EXTRA SKEPTICAL.

Let’s start with the assumption that it’s just not true.

Persuade Me

The research I’m speaking of is in fact a review article; it summarizes and compares the results of 14 studies. (The review article was flagged by Professor Dan Willingham, one of the leaders in translating science research for the classroom.)

These 14 studies shared important commonalities:

First: they looked at “one-time” exposure to nature. They didn’t look at — say — outdoor education programs. Instead, they looked at — say — a brisk walk in a park near the school.

Second: these “one-time exposures” were all relatively brief — somewhere between 10 and 90 minutes.

Third: these “brief, one-time exposures” did NOT deliberately focus the participants on nature. That is: students didn’t walk in the park to learn about trees and birds. They walked in the park to have the experience of walking in the park.

I might be skeptical about one study. I might be skeptical of two studies. But if 14 studies (or a substantial percentage of them) all reach the same conclusion … well, maybe I’ll be persuaded.

Equally interesting: these studies ran the K-16 gamut. We’re not looking at a narrow age-range here: more like two decades.

Conclusions (and Questions)

So, what did this potentially-persuasive bunch of studies show?

YES: in 12 of the 14 studies, brief, one-time, passive exposure to nature does benefit cognition.

More specifically, researchers found benefits in measures of directed attention and working memory.

They looked for, but did not find, benefits in measures of inhibition (another important executive function).

And, crucially, they did not measure academic performance. If a walk in nature enhances attention and working memory, we can reasonably predict that it will also improve learning. But: these studies did not measure that prediction.

Because this review covers so many studies, it’s easy to get lost in the details.

One point I do want to emphasize: the impressive variety of “exposures.”

Some students walked or played in a park, woods, or nature trail.

Some simply sat and read outdoors.

Amazingly, some walked on a treadmill watching a simulated nature trail on the monitor.

In fact, some simply sat in a classroom “with windows open on to green space.”

In other words: it doesn’t take much nature to get the benefits of nature.

Inevitable Caveats

First: in these studies, exposure to nature helped restore attention and working memory capacity that had been strained.

It did not somehow increase overall attention and WM capacity in an enduring way. Students recovered faster. But they didn’t end up with more of these capacities than they started with.

Second: most of these “exposures” included some modest physical activity.

How much (if any) of the benefit came from that physical exertion, instead of the greenery?

We don’t yet know.

A Skeptic Converted?

I have to say, I’m strongly swayed by this review.

In the past, I’ve seen studies that might contradict this set of conclusions.

But the number of studies, the variety of conditions, the variety of cognitive measures, and the range of ages all seem very encouraging.

Perhaps we can’t (yet) say that “research tells us” brief exposures to nature benefit students. But I feel much more comfortable speculating that this belief just might be true.

Working Memory: Make it Bigger, or Use it Better?
Andrew Watson

Cognitive science has LOTS of good news for teachers.

Can we help students remember ideas and skills better?

Yes, we can! (Check out retrieval practice and other desirable difficulties).

Can we promote students’ attention?

Yes, we can! (Posner and Rothbart’s “tripartite” theory gives us lots of guidance.)

Can we foster motivation?

Yes, we can! (As long as we’re modest about expectations and honest about the research, growth mindset can help.)

At the same time, we’ve occasionally got bad news as well.

Do cell phones distract students from their work?

Yes, they do! (Even when they’re turned off.)

Do students have “learning styles”?

Not in any meaningful way, no. (As Daniel Willingham says: when it comes to learning, people are more alike than different.)

The WORST News

I regularly talk with teachers and school leaders about working memory.

After a definition and some fun exercises, I emphasize three key points:

First: working memory is ESSENTIAL for learning. No academic information gets into long-term memory except through working memory. (Really.)

Second: it’s sadly LIMITED. (You probably can alphabetize 5 random words. You probably can’t alphabetize 10. You’ve run out of WM.)

Third: we know of no artificial way of making it bigger … except for letting children grow up. (WM capacity increases as we age, until our early twenties. No, you don’t want to know what happens next.)

This third point consistently creates genuine consternation.

Because: we REALLY want to make working memory bigger. After all: it’s essential, and it’s limited.

And because: almost every other cognitive function CAN get bigger.

If you want to learn more Spanish, practice Spanish. You’ll learn more.

If you want to get better at meditation, practice meditation; you’ll get better.

If you want to increase your working memory – and, trust me, you do – common sense suggests that practice should help.

That is: if you keep doing working memory exercises, your working memory should improve.

And yet, weirdly, it just doesn’t. People have tried and tried. Some companies make big claims.

Alas, we just don’t have consistent, robust research suggesting that any of these strategies work.

So, as I say, that’s really bad news.

Don’t Panic: There’s REALLY Good News

After all that bad news, it’s time for some good news. Let me start with an analogy.

I’m 5’10”.

I’m never the first pick for anyone’s basketball team. And: no matter how much I try, I’ll never get any taller.

However – and this is the key point – I can use the height I have more effectively. If I learn how to play basketball well (at my height), I can be a better player.

I’m not taller; my “height capacity” hasn’t changed. But my use of that height can improve.

So too, teachers can help students use the working memory they have more effectively.

In fact, we have LOTS of strategies for helping teachers do so. We have so many strategies that someone should write a book about them. (It’s possible I already did.)

For instance: “dual coding” doesn’t increase students’ WM capacity. It does, however, allow them to use more of the WM that they already have.

For that reason, dual coding – used correctly – can help students learn.

Don’t Stop Now

The good news keeps going.

Like dual coding, relevant knowledge in long-term memory reduces WM demands. The precise reasons get complicated, but the message is clear: students who know more can – on average – think more effectively.*

For that reason, a well-structured curriculum can help students learn. The knowledge they acquire along the way transforms WM-threatening tasks into WM-friendly tasks.

In many cases, simple common sense can manage WM load.

Once teachers understand why instructions take up WM space, we know how to dole out instructions more effectively.

Once we see why choices both motivate students’ interest and stress students’ WM, we can seek out the right number of choices.

So too, once we focus on “the curse of knowledge,” we start to recognize all the ways our own expertise can result in WM overload. This perspective powerfully reshapes lesson plans.

In other words: when teachers understand WM, we begin – naturally and intuitively – to adjust classroom demands to fit within cognitive limits.

That process takes time, with stumbles and muddles along the way. But the more we practice, the more skillful and successful we become.

And, notice this key point: none of these strategies make WM bigger. Instead, they help students use it better.

TL;DR

Although working memory is VITAL for learning, students (and adults) don’t have very much.

We therefore WANT to make it bigger.

The good news is: although we really can’t make it bigger, we really can help students use it more effectively.

When we shift our focus from making it bigger to using it better, we adopt teaching strategies that help students learn.


* For this reason, cognitive scientists get very antsy when they hear the claim that “students don’t need to know facts because they can look them up on the interwebs.” Because of working memory limits, students must have knowledge in long-term memory to use large amounts of it effectively.

Learning How to Learn: Do Video Games Help?
Andrew Watson

Long-time readers know: I like research that surprises me.

If a study confirms a belief I already have, I’m glad for that reinforcement. However, I have more to learn when a study challenges my beliefs.

As you’ll see below, I’m not always persuaded by challenging research. But: it’s always fun to explore.

Today’s Surprise

A study published last October grabbed my attention with its surprising title: Action video game play facilitates “learning to learn.”

That title includes several shockers.

First: it suggests that action video games might be good for people.

Second: it suggests that they might even be good for learning.

Third: it suggests that “learning how to learn” is a thing. (I’m more skeptical about this concept than most; that’s a topic for another blog.)

Teacher and parent conversations often focus on the potential harms of action video games — both for children’s characters, and for their learning. So, this strong claim to the contrary certainly invites curiosity — even skepticism.

In fact, this study comes from researchers who have been looking at the cognitive benefits of action video games for several years now. Their work prompts lots of controversy; in other words, it might help us learn more about learning!

This study starts out with lots of promise…

Sims vs. Call of Duty

When you read research for a living (as I do), you start to develop an informal mental checklist about research methodology.

This study checks lots of boxes:

Plausible, active control group? Check.

Pre-registration? Check.

Appropriate uncertainty/humility? Check.

Sometimes when I look at surprising findings, I quickly dismiss them because the research paradigm doesn’t withstand scrutiny.

In this case, it all holds together well. (I should emphasize: I’m NOT an expert in this field, and other researchers might spot flaws that I don’t.)

The overall idea is straightforward enough. Researchers worked with two groups of college students.

First, researchers tested students’ “attentional control” and “working memory.”

Next, students played 45 hours (!) of video games.

The control group played games like Sims 3: in other words, a strategy video game, but not an action video game.

The study group played Call of Duty: Black Ops, and other such games that involve movement and aiming and navigating (and shooting).

Finally, they retested students’ attention and working memory. Here’s the kicker:

Researchers used new tests of working memory and attention. And, they watched to see how quickly students improved at these new tests.

Researchers wanted to know, in a tidy shorthand, did playing action video games help students “learn how to learn” these new attention/memory tests?

Results, and Implications

Did playing action video games help students learn new attention and memory tasks? YES.

Unfortunately, the research method here makes it hard to quantify the size of the benefit. (Bayesian statistics, anyone?) But the headline is: students in the action-video-game group did better than the strategy-video-game group at learning new cognitive skills.

What, then, should we conclude from this surprising research?

First: We have LOTS of reasons to dislike action video games, like “first-person shooters.” Many include morally repellent plot lines and actions. For some folks, the whole idea of a “game about shooting” is yucky.

At the same time, this study offers us a compelling, tantalizing clue — one that might encourage us to notice these games.

Here’s what I mean…

Second: If you focus on research into cognitive science, you know a) that working memory is ESSENTIAL to learning, b) we don’t have very much, and c) we don’t know of artificial ways to create more.

In other words: working memory limitations create a terrible bottleneck that constricts the potential for learning.

Others have tried to find ways to increase working memory. Some claim to do so. Very consistently, these research claims do not replicate.

BUT…

This study claims to have found a way to help increase working memory.

I can hardly overstate the importance of that news.

So Many Ifs

IF playing action video games improves working memory (we’re not yet sure it does), and

IF those WM gains result in better learning (this research team didn’t test that question), and

IF we can figure out WHY and HOW such games work their working-memory magic, and

IF we can get those benefits with a game that doesn’t include shooting/killing (and all those moral qualms (IF you have those moral qualms)),

THEN we might be at the beginning of a very exciting process of discovery here.

I’m very interested in following this series of possibilities. Honestly: finding ways to enhance working memory would be a real game-changer for our profession…and potentially our species.

In brief: WATCH THIS SPACE.

(A Final Note)

This study doesn’t look at “learning how to learn” in the way that most people use that phrase.

Typically, “LHTL” involves teaching students about cognitive science and encouraging them to use that knowledge as they study.

This research, however, isn’t investigating that strategy.

 


Zhang, R.-Y., Chopin, A., Shibata, K., et al. (2021). Action video game play facilitates “learning to learn.” Communications Biology, 4, 1154. https://doi.org/10.1038/s42003-021-02652-7

Don’t Hate on Comic Sans; It Helps Dyslexic Readers (Asterisk)
Andrew Watson

People have surprising passions.

Some friends regularly announce that the Oxford comma is a hill they’re ready to die on. (I’m an English teacher, and yet I wonder: you’re willing to die over a punctuation mark?)

With equal energy and frequency, Twitter and Facebook resonate with mockery of the typeface Comic Sans. (Again, it’s a typeface. Why all the pique?)

Comic-sans mockery, however, often earns this earnest rebuttal:

“Comic sans helps dyslexic readers, who struggle with other fonts. Comic sans isn’t dreadful; it’s essential!”

I’ve read this statement so often that I simply assumed it’s true. Would Twitter lie?

Just Checking…

I have, in fact, seen the claim that “comic sans benefits dyslexic readers” twice this week.

However, I’ve started to notice a curious silence: no one cites specific research to back up that claim.

So, I thought I’d find it for myself.

Long-time readers know my routine. I surfed over to Google Scholar, and searched the terms “dyslexia” and “font.” And, to be on the safe side, I also searched “dyslexia” and “comic sans.”

From there, I used Scite.ai and Connectedpapers.com to follow up on my findings.

The results surprised me, so I thought I’d pass them along.

Does Comic Sans Benefit Dyslexic Readers?

I don’t know.

More precisely, I can’t find research that explores that question directly.

When I did the searches described above, I found several studies that seemed promising. And yet, when I looked at the specifics, I found that the researchers hadn’t explored exactly this question.

For instance:

Several studies cite the British Dyslexia Association style guide as their source for this recommendation.

That guide does recommend Comic Sans (and other sans serif fonts, including Arial). However, it doesn’t cite any research to support that claim.

Hmmmm.

This study, helpfully called “Good Fonts for Dyslexia,” does indeed ask 48 dyslexic readers to study passages in different fonts. It asks exactly the question we’re trying to answer.

However, this research team didn’t include Comic Sans among the fonts they studied.

They do recommend Helvetica, Courier, Arial, Verdana and CMU for dyslexic readers. But they have no recommendation one way or the other about Comic Sans.

Double hmmmmm.

Most of the studies I found focus less on font and more on web design. (And, the most common font-related conclusion I found is: fonts designed to benefit dyslexic readers don’t.)

At this point, I simply don’t have a research-based answer to this question.

To Be Clear…

This search genuinely surprised me. Given the frequency of the claim — just google it! — I assumed I’d find a robust research pool.

But, no.

Given the potential for controversy here, I want to answer some likely questions:

“Are you saying Comic Sans DOESN’T help dyslexic readers?”

No. I’m saying I can’t find a research-based answer either way.

“If you’re not an expert in dyslexia, how can you be so sure?”

Honestly, I’m not sure. I’m usually fairly skilled at finding the research basis behind educational claims. (Heck, I wrote a book about doing so.) But in this case, I simply couldn’t discover a convincing answer to the question.

“Look, this research right here shows that Comic Sans does help!”

AWESOME! Please share it with me so I can write a follow-up post.

“My student/child/colleague tells me that Comic Sans helps a lot.”

That kind of individual experience is useful and important. I hope that researchers explore this question, so we can know with greater confidence whether or not it helps most dyslexic readers.

“How long did you look?”

Maybe an hour, spread out over two days. I certainly could have missed something. I hope you’ll let me know if you’ve got a study that looks at this possibility.

TL;DR

You might have heard that Comic Sans helps dyslexic readers; you might have heard that “research says so.”

Those claims might be true, but I haven’t (yet) found research supporting them. If you know of that research, please send it my way!

Perspectives on Critical Thinking: Can We Teach It? How Do We Know?
Andrew Watson

Imagine the following scenario:

A school principal gathers wise cognitive scientists to ask a straightforward question…

“Because critical thinking is an essential 21st century skill, we know our students need to develop critical thinking skills. If we want to create a school program or a class or a curriculum to foster critical thinking, what guidance can you give us?”

Happily, we don’t have to imagine. At last week’s Learning and the Brain conference in New York, I asked a distinguished group of cognitive psychologists* exactly that question.

The resulting conversation offered practical suggestions, provocative assertions, and a surprising amount of humor.

I’ll try to summarize that half-hour conversation here.

On the One Hand…

Let’s start at one end of the spectrum, with the most optimistic ways to answer the question:

First: we know what critical thinking is.

Dr. Lindsay Portnoy, for instance, considers critical thinking the ability to support claims with evidence and reason.

If I claim that “the earth orbits the sun,” I should be able to cite evidence supporting that claim. And I should be able to explain the logical process I use to make conclusions based on that evidence.

Dr. Ben Motz agrees with that foundation, and adds an important step: critical thinkers recognize and avoid logical fallacies.

A comprehensive list of logical fallacies goes on for pages, but critical thinkers typically question their own beliefs aggressively enough to avoid the most common mistakes.

Second: we know how to foster critical thinking.

The specifics of an answer probably vary by age and discipline. However, we’ve got specific curricular strategies to help us foster critical thinking among students.

Dr. Laura Cabrera, with this truth in mind, offers a specific bit of advice: start early.

If we want students to grow as critical thinkers, we shouldn’t wait until their sophomore year in high school. Kindergarten would be a fine place to start.

On the Other Hand…

All these optimistic answers, however, quickly give way to grittier – perhaps more realistic – assessments of the situation.

First: because critical thinking is so complicated, no precise definition holds true in a broadly useful way. In other words – politely speaking – we can’t exactly define it.

In cognitive psychology terminology, as Dr. Derek Cabrera put it, “critical thinking has a construct validity problem.” In fact, the five psychologists on the panel – professors all – don’t agree on a definition.

Second: This definition problem has terrible implications.

If we can’t define critical thinking, broadly speaking, then we can’t determine a consistent way to measure it.

And if we can’t measure it, we have no (scientific) way of knowing if our “critical thinking program” helps students think critically.

Third: In fact, if we can’t measure students’ critical thinking skills right now, we might not realize that they’re already good at it.

Dr. Dan Willingham – author of the well-known Why Don’t Students Like School – made this point at the beginning of our conversation.

“Why,” he asked, “do you think your students have a critical thinking problem? What measurement are you using? What do you want them to do that they can’t do?”

In other words: it’s not obvious we should start a critical thinking program. Because we can’t measure students’ abilities, we just don’t know.

Dr. Derek Cabrera made this point quite starkly: “My advice about starting a critical thinking program is: don’t.”

Don’t Start Now

Even if we could measure critical thinking, as it first seemed we could, teachers might not want to give it disproportionate attention.

Fourth: some panelists doubt that critical thinking is any more important than many (many) other kinds of thinking – creative thinking, interdisciplinary thinking, systems thinking, fuzzy logic…the list goes on.

Dr. Portnoy, for instance, champions good old-fashioned curiosity. If students ask the right questions (critical or otherwise), they’re doing good thinking and learning.

Why, then, would it be bad if they aren’t doing critical thinking, narrowly defined?

The Cabreras, indeed, argue that students trained to think critically often get too critical. They stamp out potentially good ideas (that spring from imaginative thinking) with all their skills at critical thinking.

Fifth: opportunity cost.

Schools already have far too much to do well, as Dr. Willingham frankly pointed out.

If we plan to add something (a critical thinking program/curriculum), we should know what we plan to take out.

And, we should have a high degree of confidence that the new program will actually succeed in its mission.

If we remove a program that does accomplish one goal and replace it with one that doesn’t, our efforts to improve schools will – paradoxically – have deprived students of useful learning.

Making Sense of the Muddle

All these points might seem like bad news: we (perhaps) don’t know what critical thinking is, and (perhaps) shouldn’t teach it even if we did. Or could.

That summary, I think, overlooks some important opportunities that these panelists highlighted.

Dr. Motz offers specific ways to define critical thinking. His talk at the conference, in fact, focused on successful strategies to teach it.

Even better: he wants teachers to join in this work and try it out with their own students.

The question we face, after all, is not exactly “can I teach critical thinking — generally — to everyone?”

It is, instead: “can I teach critical thinking — defined and measured this way — to my students?”

If the answer to that question is “yes,” then perhaps I should make room for critical thinking in my students’ busy days.

Made wiser by these panelists’ advice, I know better how to define terms, to measure outcomes, to balance several thinking skills (including curiosity!).

When researchers’ perspectives on critical thinking help us think critically about our teaching goals, we and our students benefit.


* The panelists: Dr. Derek Cabrera, Dr. Laura Cabrera, Dr. Benjamin Motz, Dr. Lindsay Portnoy, Dr. Dan Willingham.