
About Andrew Watson

Andrew began his classroom life as a high-school English teacher in 1988, and has been working in or near schools ever since. In 2008, Andrew began exploring the practical application of psychology and neuroscience in his classroom. In 2011, he earned his M. Ed. from the “Mind, Brain, Education” program at Harvard University. As President of “Translate the Brain,” Andrew now works with teachers, students, administrators, and parents to make learning easier and teaching more effective. He has presented at schools and workshops across the country; he also serves as an adviser to several organizations, including “The People’s Science.” Andrew is the author of "Learning Begins: The Science of Working Memory and Attention for the Classroom Teacher."

How Teachers Can Use Neuroscience in Education
Andrew Watson

I recently saw two very different looks at neuroscience and learning, and I thought they made a useful pairing for this blog. Here goes…

 

Regular readers know that I’ve recently been exploring research into movement and learning. That is: does walking around – especially outside – help us think, learn, attend, and create?

An image of a brain in a human head, with EEG waves in the background

Because I really want the answer to be “yes,” I force myself to be extra skeptical when I look at the research. And even with all that extra skepticism, the answer is – for the most part – YES!

How do we know?

Well, various researchers have people walk around – or sit still – and then do various mental tasks. Often (although not always), they do better after walking than after sitting.

BOOM.

But wait! Wouldn’t it be great to have more evidence than walkers’ “performance on mental tasks”? Wouldn’t it be great to know what’s going on in their brains?

Beyond “Mental Tasks”

I recently read a Twitter post about this study:

Researchers at the University of Illinois at Urbana-Champaign had several 9- and 10-year-olds take various tests in reading comprehension, spelling, and math.

Researchers also had these students take tests on “attentional control” — which means, more or less, what it sounds like.

Students took these various tests once after sitting still for 20 minutes, and another time after walking at a moderate pace for 20 minutes.

Sure enough, these young students controlled their attention more effectively after walking than after sitting. And, they did better on the reading comprehension test after walking than after sitting.

Now: here’s the brain part.

Researchers also hooked students up to an electroencephalography (EEG) array while they took those tests.

EEG measures electrical activity on the outermost layer of the brain, so – VERY roughly – it shows how various parts of the brain's surface are behaving at particular moments in time.

Here’s where things get very technical. (Neuroscience is ALWAYS very technical.)

EEGs produce up-and-down squiggles; they look a bit like lie detector tests in the movies.

Research with adults has consistently shown that exercise produces a change at the third squiggle in various brain regions. Because that squiggle (sort of) goes up, it’s called the “third positivity,” or P3.

This P3 (third positive squiggle) correlates with better attentional control in adults. Researchers hypothesized that they would get the same result with these young children.
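If a concrete picture helps: an EEG "squiggle" is just voltage plotted over time, and P3 is a positive bump appearing a few hundred milliseconds after a stimulus. Here's a tiny Python sketch (entirely synthetic data with invented numbers, not anything from the actual study) that builds two toy squiggles and compares their P3 peaks:

```python
import numpy as np

def synthetic_erp(t_ms, p3_amplitude):
    """A toy event-related potential (in microvolts) with three bumps:
    a negative one near 100 ms, and positive ones near 200 ms and 350 ms."""
    gauss = lambda mu, sigma: np.exp(-0.5 * ((t_ms - mu) / sigma) ** 2)
    return -4.0 * gauss(100, 20) + 3.0 * gauss(200, 25) + p3_amplitude * gauss(350, 50)

t = np.arange(0, 800)  # 0-800 ms after the stimulus, in 1 ms steps
erp_sitting = synthetic_erp(t, p3_amplitude=4.0)
erp_walking = synthetic_erp(t, p3_amplitude=6.5)  # a bigger P3, as in the adult studies

# "P3 amplitude" here = the largest positive voltage in a late window (250-500 ms)
window = (t >= 250) & (t <= 500)
p3_sit = erp_sitting[window].max()
p3_walk = erp_walking[window].max()
print(f"P3 after sitting: {p3_sit:.1f} uV; after walking: {p3_walk:.1f} uV")
```

The comparison in the real research is far more sophisticated, of course; this just shows what "a bigger third positive squiggle" means as data.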

Results, Please

Here’s the big neuroscience news – researchers DID get the same results for children as addults

Changes in P3, induced by walking, took place when the students did better at attentional control.

So, why does this research finding matter?

If students’ minds behave differently after walking – they perform better at attentional control – we would expect that their brains behave differently.

Now we know: they do!

In the field, we call this pattern “converging evidence.” Two very different kinds of research — psychology AND neuroscience — support the same conclusion.

Now we can be even more confident that walking benefits cognition – even though, as you remember, I’m trying to be extra skeptical.

So, here we have the FIRST way that teachers can use neuroscience to support their teaching:

After psychology research suggests that a teaching suggestion might be beneficial, neuroscience can provide converging evidence to make this idea even more persuasive.

FANTASTIC. (By the way: I’ll come back to this study about walking and attentional control at the end of this blog post.)

The Matrix Could Be Real?

I said that I’d seen two articles about neuroscience worth sharing. The first – as you’ve seen – is very specific and researchy.

The second article – pointed out to me by my friend Rob McEntarffer — spends time speculating, musing, and wondering.

 

Crudely speaking, this article wonders if something Matrix-like could happen. Could Laurence Fishburne ever download kung fu into Keanu Reeves?

The article, in WIRED Magazine, opens with a fascinating scene. Doctors have implanted electrodes in a patient’s fusiform face area – the FFA. (Most neuroscientists think that the FFA helps the brain identify and recognize human faces.)

When the researchers stimulate the FFA, this patient – very briefly – sees human features on a blank box: an ear, a sideways smile, an eye.

In other words, electrical current applied to the brain surface created bits of a face. THE MATRIX EXISTS.

Wait. [Sound of record scratch.] Nope. No it doesn’t.

This article does a great job pointing out all the extraordinary complexities involved in going from this tiny baby step to actually “implanting learning in the brain.”

As in, we are nowhere near being able to do anything remotely like that.

Glitches in the Matrix

The idea itself seems plausible. As Adam Rogers writes:

The brain is salty glop that turns sensory information into mind; you ought to be able to harness that ability, to build an entire world in there.

However, all sorts of problems get in the way.

At a very basic level, there are just too many neurons for us to be able to control precisely — something like 50,000 to 100,000 in an area the size of a grain of rice.

To make anything like perception happen, we’d have to get thousands of those stimuli just right. (Imagine how complex LEARNING would be.)

The proto-matrix also faces a timing problem:

Perception and cognition are like a piano sonata: the notes must sound in a particular order for the harmonies to work.

Get that timing wrong and adjacent electrical pings don’t look like shapes — they look like one big smear, or like nothing at all.

Finally — and this point especially merits attention:

The signals you see when a brain is doing brain things aren’t actually thought; they’re the exhaust the brain emits while it’s thinking.

In other words: all those cool brain images can’t necessarily be reverse engineered. We can measure electrical activity when a brain does something — but artificially recreating such electrical activity won’t guarantee the same underlying thought process.

So, here’s the SECOND way to use neuroscience in teaching:

When teachers understand how fantastically complicated neuroscience — and the underlying neurobiology of thought and learning — truly are, we can see through hype and extravagant claims often made about this field.

Rogers’s article does a GREAT job highlighting that complexity.

An Example

I promised to return to that study about walking and attention, so here goes:

I do think that this study offers some converging neuroscientific evidence that movement prior to learning enhances attentional control.

However, the Twitter post citing this study implied that it reached a different conclusion: movement during learning is good for attention, creativity, etc.

That is: it claimed that teachers should design lessons that get students up and moving, and that this research requires this conclusion.

In particular, it highlighted this image to show changes in brain activity between walking and sitting.

Rogers’s article in WIRED encourages us to think about all the neural complexity underlying this blithe suggestion.

After all, that image is simply a representation of a few dozen P3 graphs:

Many graphs showing electroencephalography results at the 3rd positivity.

Unless we have a clear idea what those squiggles mean, we shouldn’t be too confident about that image showing “changes in brain activity.”

And, by the way, people are often much too confident in interpreting such images. As in: it happens EVERY DAY.

To be clear: I think some movement during class often makes sense — although, as always, the students’ age and the school’s culture will influence this decision.

And this neuroscience research does provide “converging evidence” that movement built into the school day is a good idea.

But it certainly doesn’t require teachers to have students walking from place to place during lessons; that’s not what any of these researchers measured, and it’s not what they claim.

TL;DR

Neuroscience research focusing on the brain can benefit teachers by supporting — or contradicting — psychology research focusing on the mind.

If both kinds of research point the same direction, we can be especially confident that a teaching suggestion makes sense.

And a deep understanding of the complexity of neuroscience (a la Rogers’s WIRED article) can help us resist overconfident advice that seems to have (but really does not have) neuroscientific backing.


Hillman, C. H., Pontifex, M. B., Raine, L. B., Castelli, D. M., Hall, E. E., & Kramer, A. (2009). The effect of acute treadmill walking on cognitive control and academic achievement in preadolescent children. Neuroscience, 159(3), 1044-1054.

Warning: Misguided Neuroscience Ahead
Andrew Watson

I recently ran across a version* of this chart:

An (inaccurate) chart listing neurotransmitters: their effects and activities that enhance them

As you can see, this chart lists several neurotransmitters and makes recommendations based on their purported roles.

If you want to feel love, you should increase oxytocin. To do so, play with your dog.

If you want to feel more stable, you should boost serotonin. To do so, meditate, or go for a run.

And so forth.

On the one hand, this chart seems harmless enough. It recommends that we do quite sensible things — who can argue against “self-care,” or “hugging your children”? — and so can hardly provoke much controversy.

I, however, see at least two reasons to warn against it.

Willingham’s Razor

Most everyone has read Dan Willingham’s Why Don’t Students Like School?  (If you haven’t: RUN, don’t walk, to do so.)

Professor Willingham has also written a less well known book called When Can You Trust the Experts?, which offers lots of wise advice on seeing through bad “expert” advice.

One strategy he recommends:

Reread the “brain-based” teaching advice, and mentally subtract all the brainy words. If the advice makes good sense without them, why were they there in the first place? **

In the lists above, do we really need the names of the neurotransmitters for that advice to make sense?

To feel a sense of accomplishment, accomplish something.

If you want to feel better, eat chocolate.

To calm down, walk (or run) outdoors.

Who could object to these suggestions? Do we need multi-syllable words to embrace them?
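In the spirit of that razor, here's a playful little Python sketch of "subtracting the brainy words." (The jargon list is my own invention; this is a thought experiment in code form, not a serious tool.)

```python
# Willingham's razor, as code: delete the neuro-terminology from a piece of
# advice and see whether the advice still makes sense without it.
NEURO_JARGON = {"dopamine", "serotonin", "oxytocin", "endorphins",
                "amygdala", "hippocampus", "neuroplasticity"}

def subtract_brainy_words(advice: str) -> str:
    """Return the advice with all jargon-list words removed."""
    kept = [w for w in advice.split() if w.strip(",.").lower() not in NEURO_JARGON]
    return " ".join(kept)

print(subtract_brainy_words("Hug your children to boost oxytocin"))
# prints: Hug your children to boost
```

The advice survives the subtraction just fine, which tells you the neuro-term was decoration, not substance.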

I worry, in fact, that such charts create bad mental habits for teachers. Those habits sound like this:

If someone knows complicated neuro-terminology, then their teaching advice must be accurate. When a blogger uses the phrases “corpus callosum” and “research says,” therefore, I have to take their teaching advice.

No, you really DON’T have to take their advice. LOTS of people use the language of neuroscience to make their suggestions sound more authoritative.

As I’ve written elsewhere, neuroscience rarely produces classroom-ready teaching advice.

PSYCHOLOGY gives teachers great ideas about memory and attention and learning and motivation.

A biological understanding of what’s happening during those mental functions (i.e., neuroscience) is fascinating, but doesn’t tell teachers what to do.

In brief: beware people who use neuro-lingo to advise you on practical, day-to-day stuff. Like, say, that chart about “happiness chemicals.”

When Simplification Leads to Oversimplification

My first concern: the chart misleadingly implies that neuroscientific terminology makes advice better.

My second concern: the chart wildly oversimplifies fantastically complicated brain realities.

For instance, this chart — like everything else on the interwebs — calls oxytocin “the love hormone.”

A smiley face with the word "oxytocin" as the smile

However, that moniker doesn’t remotely capture its complexity. As best I understand it (and my understanding is very tentative), oxytocin makes social interactions more intense — in both positive AND NEGATIVE directions.

So: when we add oxytocin, love burns brighter, hatred smoulders hotter, jealousy rages more greenly.

To call it the “love hormone” is like saying “the weather is good.” Well, the weather can be good — but there are SO MANY OTHER OPTIONS.

The statement isn’t exactly wrong. But its limited representation of the truth makes it a particular kind of wrong.

So too the idea that dopamine is a “reward chemical.” Like oxytocin’s function, dopamine’s function includes such intricate nuance as to be difficult to describe in paragraphs — much less a handy catchphrase. ***

By the way: the most comprehensive and useful description of neurotransmitters I know comes in Robert Sapolsky’s book Behave. As you’ll see, they’re REALLY complicated. (You can meet professor Sapolsky at our conference in February.)

TL;DR

Yes, walking outside and hugging children and exercising are all good ideas for mental health.

No, we don’t need the names of neurotransmitters to make that advice persuasive.

We might worry about taking advice from people who imply that neuro-lingo does make it more persuasive.

And we can be confident that neurotransmitters are much, MUCH more complicated than such simplistic advice implies.


* I’ve made my own modified version of this chart. The point of this blog post is not to criticize the individuals who created the original, but to warn against the kind of thinking that produced it. “Name and shame” isn’t how we roll.

** I’m paraphrasing from memory. I’m on vacation, and the book is snug at home.

*** [Update on 12/30/22] I’ve just come across this study, which explores some of the contradictions and nuances in the function of serotonin as well.

The Limitations of Retrieval Practice (Yes, You Read That Right)
Andrew Watson

Last week, I wrote that “upsides always have downsides.”

African American student wearing a bow tie, hand to forehead, looking frustrated and disappointed

That is: anything that teachers do to foster learning (in this way) might also hamper learning (in that way).

We should always be looking for side effects.

So, let me take a dose of my own medicine.

Are there teaching suggestions that I champion that have both upsides and conspicuous downsides?

Case in Point: Retrieval Practice

This blog has long advocated for retrieval practice.

We have lots (and LOTS) of research showing that students learn more when they study by “taking information out of their brains” than by “putting information back into their brains.” (This phrasing comes from Agarwal and Bain.)

So:

Students shouldn’t study vocabulary lists; they should make flash cards.

They shouldn’t review notes; instead, they should quiz one another on their notes.

Don’t reread the book; try to outline its key concepts from memory.

In each of these cases (and hundreds more), learners start by rummaging around in their memory banks to see if they can remember. All that extra mental work results in more learning.

SO MUCH UPSIDE.

But wait: are there any downsides?

Let the Buyer Beware: Retrieval-Induced Forgetting

Sure enough, some researchers have focused on “retrieval-induced forgetting.”

Yup. That means remembering can cause forgetting.

How on earth can that be? Here’s the story…

Step 1: Let’s say I learn the definitions of ten words.

Step 2: I use retrieval practice to study the definitions of five of them. So, I remember five words.

Step 3: Good news! Retrieval practice means I’ll remember the five words that I practiced better.

Step 4: Bad news! Retrieval-induced forgetting means I’ll remember the five words I didn’t practice worse. Yes: worse than if I hadn’t practiced those other five words.

In brief: when I remember part of a topic, I’m likelier to FORGET the part I didn’t practice. (Although, of course, I’m likelier to REMEMBER the part I did practice.)

So, retrieving induces forgetting. Now that’s what I call a downside.
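To make the pattern concrete, here's a toy numerical sketch in Python. Every number is invented purely for illustration; these are not effect sizes from any actual study.

```python
# Toy model of retrieval-induced forgetting. All numbers are invented
# for illustration; they are not effect sizes from any study.
baseline = 0.50        # chance of recalling a studied word with no extra practice
practice_boost = 0.30  # retrieval practice strengthens practiced items...
inhibition = 0.10      # ...and suppresses related, UNpracticed items

words = [f"word{i}" for i in range(1, 11)]  # ten studied vocabulary words
practiced = set(words[:5])                  # we quiz ourselves on five of them

def recall_probability(word: str) -> float:
    if word in practiced:
        return baseline + practice_boost  # practiced: better than baseline
    return baseline - inhibition          # unpracticed: WORSE than baseline

print(recall_probability("word1"))  # a practiced word: above baseline
print(recall_probability("word6"))  # an unpracticed word: below baseline
```

Notice the key asymmetry: the unpracticed words don't just miss out on the boost; in this pattern, they end up below where they would have been with no practice at all.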

Potential solution?

How do our students get the good stuff (memories enhanced by retrieval practice) without the bad stuff (other memories inhibited by retrieval practice)?

Here’s an obvious solution: tell our students about retrieval-induced forgetting.

Heck, let’s go one step further: tell them about it, and encourage them to resist its effects.

One research group — led by Dr. Jodi Price — tried just this strategy.

The research design here gets quite complicated, but the headline is:

They ran the same “retrieval-induced forgetting” study that others had run, and this time added a brief description of the problem.

In some cases, they added encouragement on how to overcome this effect.

So, what happened when they warned students?

Nothing. Students kept right on forgetting the un-practiced information (although they kept right on remembering the practiced information).

In brief: warnings about retrieval-induced forgetting just didn’t help. (Heck: in some cases, they seemed to promote even more forgetting.)

Alternative Solutions?

Much of the time, we benefit our students by telling them about research in cognitive science.

I routinely tell my high-school students about retrieval practice. I show them exactly the same studies and graphs that I show teachers in my consulting work.

In this case, however, it seems that sharing the research doesn’t help. Telling students about retrieval-induced forgetting didn’t stop retrieval-induced forgetting.

Conclusion: it’s up to teachers to manage this side effect.

How? We should require retrieval of all essential elements.

For example:

When I teach my students about comedy and tragedy, the definitions of those terms include lots of moving pieces.

I know that ALL THE PIECES are equally important. So I need to ensure that my retrieval practice exercises include ALL THE PARTS of those definitions.

Students don’t need to remember everything I say. But if I want them to remember, I need to ensure retrieval practice happens.

Each of us will devise different strategies to accomplish this goal. But to get the upside (from retrieval practice) we should mitigate the downside (from retrieval-induced forgetting).

TL;DR

Retrieval practice is great, but it might cause students to forget the parts they don’t retrieve.

Alas, we can’t solve this problem simply by warning our students.

So, we should structure our review sessions so that students do in fact retrieve EVERYTHING we want them to remember.

If we create such comprehensive retrieval, students can get the upsides without the downsides.

 

 


Price, J., Jones, L. W., & Mueller, M. L. (2015). The role of warnings in younger and older adults’ retrieval-induced forgetting. Aging, Neuropsychology, and Cognition, 22(1), 1-24.

Upsides Always Have Downsides: “Side Effects” in Education Research
Andrew Watson

Here at Learning and the Brain, we believe that research can improve education.

Young man wearing a tie, showing thumbs up in one image and thumbs down in the other

Specifically, research into psychology (“how the mind works”) and neuroscience (“how the brain works”) can help teachers and schools. After all, we spend all day working with students’ minds and brains!

Every now and then, we should stop and look for flaws in our assumptions.

Are there ways that research might not help learning? Might it actually limit learning?

A recent article by Yong Zhao explores this important — and troubling — question.

The Medical Model

Doctors have long relied on research to test their hypotheses.

Does this treatment work better than that treatment? Let’s run a “randomized control trial” to find out.

Notably, medical research always includes this important question: what side effects* does a treatment produce?

That is:

Any treatment might produce specific benefits.

And, it might also produce specific harms.

Medical research looks for and reports on BOTH.

Sadly — and this is Zhao’s key point — education research tends to skip the second question.

Researchers look for benefits:

Does mindfulness reduce stress?

Can retrieval practice enhance learning?

Should students exercise mid-class?

When they measure the potential upsides of those “treatments,” they don’t always look equally scrupulously for downsides.

And yet: almost everything has downsides.

What to Measure, and When?

Why do we overlook the downsides?

Zhao offers two hypotheses.

First, we all agree that education is good.

If doing X helps students learn, then X is good! Its obvious goodness makes potential badness invisible.

Second, downsides take time — and alternative methods — to discover.

An example. I hypothesize a particular method will help students sing better. So, I test my method in a randomized control trial.

Sure enough, students with the new method sang better! My method worked!

However, my new teaching method just might make students hate singing.

To discover this “side effect,” I have to measure different variables. That is:

I need to check how well they sing (one set of measurements),

AND how much they like singing (a different set of measurements).

It’s also possible that the downside takes longer to arise. The improvement (right now) results in less enjoyment of singing (later on). If I don’t keep measuring, I’ll miss this “side effect.”

New Habits

As Zhao argues, our habit of overlooking potential downsides creates real problems.

For instance, Zhao takes the example of Direct Instruction.

Its proponents can show lots of research suggesting its strengths. Its detractors likewise.

How can these contradictory realities exist?

Well, any complex teaching method will have benefits and detriments. If we focus only on one — if we measure only one — we’ll necessarily miss the other.

Instead, Zhao argues, we should develop the rigorous habit of looking for both: the benefits of any teaching strategy, and also its downsides.

This realistic, complex picture will allow us to make better decisions in classrooms and schools.

One More Step

Although Zhao doesn’t mention “opportunity costs,” I think they’re an important part of this conceptual re-think.

That is:

Every time I use one particular teaching strategy, I forgo another.

If I take time for this stress-reducing technique, I don’t have time for that stress-reducing technique.

Even if a strategy has good research behind it, even if it has relatively few “side effects,” I always want to know: have I given up a better strategy to make time for this merely good strategy?

For example, this point often comes up in discussion of Learning Styles Theory.

If you’ve spent any time in this field, you know: Learning Styles Theory simply doesn’t have good research support behind it.

Alas: it has LOTS of popular support, even among teachers.

When I show teachers the comprehensive research reviews contradicting the theory, they occasionally respond this way:

“Okay, but what harm is it doing? It might be true, so why not teach to my students’ learning style?”

For me, the clear answer is opportunity cost.

If we teachers ARE spending time on teaching methods that have no research support, we ARE NOT spending time on those that do.

If students ARE studying on the treadmill because they’re “kinesthetic learners,” they ARE NOT using study strategies with research support behind them.

Measuring opportunity cost requires subtle and humble calculations. We just might have to give up a long-prized approach to make time for an even better one.

If our students learn more, that sacrifice will have been worth it.

TL;DR

Like medical researchers, we should look both for benefits and for potential harms of any teaching suggestion.

This balanced perspective might take additional time, and might require consideration of opportunity costs.

It will, however, result in a more realistic and useful understanding of teaching and learning.


*  Many years ago, I read that the phrase “side effects” is misleading. It makes unwanted effects seem unlikely, even though they’re just as likely as the wanted effects.

For that reason, I’m putting the words “side effects” in quotations throughout this post.

I believe it was Oliver Sacks who made this point, but I can’t find the citation so I’m not sure.

If you know the correct source of this insight, please let me know!


Zhao, Y. (2017). What works may hurt: Side effects in education. Journal of Educational Change, 18(1), 1-19.

Getting the Details Just Right: Retrieval Practice
Andrew Watson

Can we ever research a topic too much? Can we reach a point where, well, there’s nothing really more to say about teaching better and learning more?

Perhaps, for instance, we’ve reached peak retrieval practice.

Blog readers – and conference attendees – know that actively recalling information results in greater learning than simple review.

For example: rather than reminding my students of yesterday’s discussion of the Harlem Renaissance, I can ask them to write down its key details from memory. When they make the mental effort to remember, they learn more.

This blog and many authors have written about this topic at length. What more is there to say?

I recently found a study that reminds us: there’s always more to say. If we want to combine teaching experience with researcher insight, we need to take time to get the details just right.

Here’s the story.

A Problem, a Solution, Another Problem

One problem with retrieval practice: it takes time.

I ask the question.

The students write their answers to the question.

I check their answers.

The minutes tick by.

Wouldn’t it be great if we could skip a few steps? How about this abbreviated version:

I ask the question.

The students think about their answers to the question.

I move on.

If my students truly think about the answers, then they’ll get the retrieval practice benefit in much less time.

This solution, however, creates its own problem.

If my students don’t write anything down, how can I know they actually think about the answers? Couldn’t they just nod and look earnest?

After all, what’s their motivation to do the thinking?

Let’s Check

A respected research team in this field has explored this set of problems, and their potential solutions.

In a recently published study, Megan Sumeracki and Johanny Castillo wanted to see if that first problem exists.

They had college students read a short passage. Some wrote answers to review questions; some were instructed to think about answers to those questions.

What happened a few days later?

Sure enough, the students who just thought about (but did not write down) answers were relatively confident that they’d remember information. (That is: they were more confident than those who wrote answers down.)

However, the thinkers actually remembered less than the writers.

Sure enough, as we predicted, students don’t always follow instructions to think about answers.

In other words: when I solve the first problem (retrieval practice takes time) by asking students simply to think, I create a second problem (students don’t really think).

How do we solve this conundrum? Can I solve BOTH problems?

Despair Not

Sumeracki and Castillo had an idea.

They repeated the “think about it” strategy, but this time with an additional ingredient: cold calling.

That is: they asked students to try retrieval practice by thinking about the answer. AND then they cold-called one student at random. (That is, they called on one student who hadn’t raised a hand.)

The researchers hoped to communicate this message: when told to think about the answer, students really should think about it – because they might actually have to answer the question.

What did they find?

Sure enough: students who thought about the answer now remembered as much as the students who wrote down their answers – presumably because they really did the thinking.

This two-part strategy – “retrieval practice by thinking” plus “cold calling” – takes less time AND produces the learning benefits of retrieving.

Problem solved!

One More Problem?

Some readers will have noticed that I raced past a potential controversy.

Truthfully, people do worry about cold calling.

Teachers worry that it creates a hostile, punitive environment. One grad school professor told me that cold calling ramps up stress, and stress destroys the hippocampus, so cold calling is malpractice.

Honestly, we don’t have lots and lots of research here.

One study I’ve found pushes back on the “ramps up stress” narrative. Others support that narrative.

And, as far as I know, we just don’t have good research in K-12 classrooms.

My own instincts say: yes, cold calling can be done badly. But, anything can be done badly. The key point is that cold calling can be done well.

If we create a classroom environment where making mistakes is an entirely normal part of the class routine – an environment that Doug Lemov calls a “culture of error” – then the potential stress of cold calling shouldn’t be a problem.

But, until we have actual research in many different classrooms, I can’t make that recommendation too emphatically.

The Sumeracki and Castillo strategy strikes me as a sensible solution to a real problem. More research on cold calling will make it more persuasive still.


Sumeracki, M. A., & Castillo, J. (2022). Covert and overt retrieval practice in the classroom. Translational Issues in Psychological Science.

Walking Promotes Creativity? A Skeptic Weighs In…
Andrew Watson

When teachers try to use psychology research in the classroom, we benefit from a balance of optimism and skepticism.

Family walking toward camera in autumn woods

I confess, I’m often the skeptic.

When I hear that – say – “retrieval practice helps students learn,” I hope that’s true, but I want to see lots of research first.

No matter the suggestion…

… working memory training!

… dual coding!

… mindfulness!

… exercise breaks!!!

… I’m going to check the research before I get too excited. (Heck, I even wrote a book about checking the research, in case you want to do so as well.)

Here’s one surprising example.

Because I really like the outdoors (summer camp, here I come!), I’d LOVE to believe that walking outside has cognitive benefits.

When I get all skeptical and check out the research…it turns out that walking outside DOES have cognitive benefits.

As I wrote back in May, we’ve got enough good research to persuade me, at least for now, that walking outdoors helps with cognition.

Could anything be better?

Yup, Even Better

Yes, reader, I’ve got even better news.

The research mentioned above suggests that walking restores depleted levels of both working memory and attention.

“Yes,” I hear you ask, “but we’ve got other important mental functions. What about creativity? What does the research show?”

I’ve recently found research that looks at that very question.

Alas, studying creativity creates important research difficulties.

How do you define “creativity”?

How do you measure it?

This research, done by Oppezzo and Schwartz, defines it thus: “the production of appropriate novelty…which may be subsequently refined.”

That is: if I can come up with something both new and useful, I’ve been creative – even if my new/useful thing isn’t yet perfect.

Researchers have long used a fun test for this kind of creativity: the “alternative uses” test.

That is: researchers name an everyday object, and ask the participants to come up with alternative uses for it.

For example, one participant in this study was given the prompt “button.” For alternative uses, s/he came up with…

“as a doorknob for a dollhouse, an eye for a doll, a tiny strainer, to drop behind you to keep your path.”

So much creativity!

Once these researchers had a definition and a way to measure, what did they find?

The research; the results

This research team started simple.

Participants – students in local colleges – sat for a while, then took a creativity test. Then they walked for a while, and took a second version of that test.

Sure enough, students scored higher on creativity after they walked than after they sat.

How much higher? I’m glad you asked: almost 60% higher! That’s a really big boost for such a simple change.

However, you might see a problem. Maybe students did better on the 2nd test (after the walking) because they had had a chance to practice (after the sitting)?

Oppezzo and Schwartz spotted this problem, and ran three more studies to confirm their results.

So, they had some students sit then walk, while others walked then sat.

Results? Walking still helps.

In another study, they had some students walk or sit indoors, and walk or sit outdoors.

Results: walking promotes creativity both indoors and out.

Basically, they tried to find evidence against the hypothesis that walking boosts creativity…and they just couldn’t do it. (That’s my favorite kind of study.)

Just One Study?

Long-time readers know what’s coming next.

We teachers should never change our practice based on just one study – even if that study includes 4 different experiments.

So, what happens when we look for more research on the topic?

I’ve checked out my go-to sources: scite.ai and connectedpapers.com. (If you like geeking out about research, give them a try – they’re great!)

Sure enough, scite.ai finds 13 studies that support this conclusion, and 3 that might contradict it. (In my experience, that’s a good ratio.)

Connectedpapers.com produces fewer on-point results. However, the most recent study seems like a very close replication, and arrived at similar findings.

In brief: although I’m usually a skeptic, I’m largely persuaded.

TL;DR

Walking outdoors helps restore working memory and attention; walking either indoors or outdoors enhances creativity (at least as measured by the “alternative uses” test).

I’d love to see some studies done in schools and classrooms. For the time being, I think we have a persuasive foundation for this possible conclusion.

Our strategies for putting this research to good use will, of course, be different for each of us. But it’s good to know: simply walking about can help students think more creatively.


Oppezzo, M., & Schwartz, D. L. (2014). Give your ideas some legs: The positive effect of walking on creative thinking. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(4), 1142.

The Most Important 5 Minutes in Class: The Primacy/Recency Effect
Andrew Watson

As we put our lesson plans together, we teachers want to know: are some minutes more valuable than others?

Student Holding Clock

That is:

Do students remember most at the 10-minute mark of the lesson, because they’re mentally revved up?

Or, perhaps they remember most from the final five minutes, because the whole class has led to this grand conclusion.

Or, perhaps some other time slot generates the most learning, because psychology reasons.

What does the research tell us?

Start Here

I occasionally see teaching advice that seeks to answer this question. That advice typically begins with a fascinating research pool.

Here’s the story.

Researchers present students with — say — a list of 15 words. After distraction, how many words do students remember? And, can we predict which ones?

Several studies suggest a consistent answer.

Students tend to remember words from the beginning of the list. Researchers call that the “primacy” effect.

And, they remember words from the end of the list. That result gets the moniker “recency effect.”

Going all the way back to 1962, this primacy/recency effect has a lot of research behind it. (For a more recent study, click here.)

Lab to Classroom

So, how should teachers plan our lessons based on this particular finding?

Let’s imagine that I give my students a list of 8 instructions. Because of the primacy/recency effect, I suspect they’ll remember the early and late instructions better than the ones in the middle. (Hint: maybe I should write down a long list of instructions…)

But: what does this effect tell us about the most valuable teaching time during a class period as a whole?

From time to time, scholars who translate psychology research for classroom teachers make this argument:

The primacy/recency effect suggests that the first several minutes of class, and the final several minutes of class, have the greatest effect on learning.

That is: For the same reason that students remember the first and last instructions from my list of 8, they’ll learn the most during the first and last minutes of class.

Voila: a research-based answer to the question.

I confess, however, that I myself have doubts.

The argument says, in effect:

Rules governing mental processes for 60-120 seconds also govern mental processes for 45-80 minutes.

Honestly, I’m just not sure that’s plausible. My doubts spring from two sources.

Doubts, and More Doubts

In the first place, I doubt this advice because it extrapolates so far beyond the initial research conditions.

If research tells me something about — say — college students, that conclusion might also apply to 1st graders. But it might not. 1st graders aren’t college students.

If research tells me something about adolescents in Iceland, that conclusion might apply to teens in Brazil. But it might not. Icelandic culture differs from Brazilian culture.

And, if research tells me about mental functions over one minute, that conclusion might apply to 20 minutes. (Or 45, or 80.) But IT MIGHT NOT. One minute isn’t twenty.

Long-time readers know I always focus on “boundary conditions.” From my perspective, this advice goes WAY beyond the boundaries of the initial research.

By the way: I’ve asked SEVERAL wise people if they know of primacy/recency research that goes beyond a minute or two. So far, the answer is “no.”

The second reason I doubt this advice: the specific mental functions involved.

As far as I can tell, researchers explain the primacy/recency effect by talking about short-term memory and working memory.

Both of these mental faculties describe very short-term mental functions. In my grad-school classes, our profs typically said that working memory holds information somewhere between 5 and 30 seconds.

If, in fact, the primacy/recency effect results from short-term and working memory functions, then those findings almost certainly won’t apply to mental processes that take 30+ minutes.

Like, say, our classes.

Just Answer the Question

If this advice doesn’t hold, what can research tell us about the “most important five minutes in class”?

I’ve got two answers.

Answer #1:

I’ve asked lots of people if they have a research-informed answer to this question. So far, no one has a strong “yes.” But if I hear of one, I’ll pass it along.

And, btw, a friend has answered “we really have to research that question!” So, I’ll let you know if/when his results come through.

Answer #2:

Long-time readers know my mantra: “don’t just do this thing; instead, think this way.”

In this case, I don’t think we can plausibly identify any one time slot that consistently generates the most learning.

Instead, we want to use core ideas from cognitive science to structure lesson plans effectively.

Use retrieval practice.

Beware working-memory overload.

Foster attention.

Activate prior knowledge.

And so forth.

If we follow this approach, every minute will ultimately build — more-or-less equally — toward students’ learning.


Castel, A. D. (2008). Metacognition and learning about primacy and recency effects in free recall: The utilization of intrinsic and extrinsic cues when making judgments of learning. Memory & Cognition, 36(2), 429-437.

Working Memory in Everyday Life
Andrew Watson

Imagine this scenario: you’re standing in the CVS toothpaste aisle, trying to decide.

You think you should be able to recognize something familiar, but honestly there are so many choices.

Which brand are you loyal to?

Do you want mint?

Fluoride? Foaming? Whitening?

A patented “sensitive teeth” formula? Bacon flavor?

I think I made up the bacon. But, given all those choices and all the combinations, you simply can’t decide.

The Roman Colosseum on a sunny day, with lots of people in view

If you’re like me, you feebly grab at something plausible and make a dash for the register.

If you’ve had a long day of grading, you might just give up entirely.

So: what on earth is going on in your head? Why is picking a box of toothpaste so exhausting?

Cognition Im/possible

When I meet with teachers, I regularly discuss the importance of working memory.

This vital cognitive capacity allows students to hold on to several bits of information, and to reorganize/combine them into new facts, processes, and mental models.

Oversimplifying a bit, you could say it’s where the learning starts happening in the mind.

This essential mental process, however, creates two important problems.

The first problem: our students just don’t have very much working memory.

If you see students forget the question they were about to ask, or give up on a shockingly simple task, or lose focus completely, you might just be looking at working-memory overload.

It happens all the time.

The second problem: most of the ideas that we want our students to learn already exist in our own long-term memory.

We really struggle to see the working-memory load their work imposes, because we already know how to do it.

Why can’t they do this simple math problem?

Why do they struggle to use new vocabulary words in a sentence?

And, why isn’t the answer to a history question perfectly obvious?

In every case, the correct answer is in our long-term memory, but students must wrestle with it in their working memory.

In other words, our own expertise obscures our students’ mental struggles from us.

But: when we go to the CVS toothpaste aisle, we know exactly what they’re going through. Too many mental variables – not enough headspace. Ugh.

When In Rome…

I’ve spent the last week in Rome for a conference, and – believe it or not – found myself thinking about all that toothpaste.

Why? Because: museums.

I visited several museums, and was repeatedly struck by my own working-memory overload.

For example, the room with all those coins:

What should I be learning from the hundreds (and hundreds!!) of doubloons and coppers and denarii?

Which are the most important examples?

Should I spot trends or cycles or dramatic shifts?

Of course, the museum folks know that I have those questions, so they provide answers:

Hundreds and hundreds of little cards with LOTS of information about the coins.

All that information includes specialized vocabulary.

And those vocabulary words get helpful definitions in parentheses.

All these answers – the information, the vocabulary, the definitions – benefit other experts in ancient coins.

But they leave me even more confused and overwhelmed.

In other words: like some teachers, museum experts did not recognize the cognitive overload experienced by many students/museum-goers.

I wanted to learn.

I wanted to understand.

Certainly I wanted to appreciate.

But I just didn’t know how to process SO MUCH STUFF. And, don’t get me started on the rooms with helmets or wine-jugs…

Inherent Expertise

Even as I noted my own working-memory overload, I found several museum collections that did NOT overwhelm my brain.

For instance, the first room (more or less) in the Vatican Museum includes several hundred busts: matrons, soldiers, children, priests, emperors, even an enslaved person.

To my surprise, I didn’t feel overwhelmed; instead, I felt curious and enticed. I wanted to look at the faces and speculate about their identities and stories and personalities.

Why the different reaction? Here’s my hypothesis:

I have no expertise in coins (or wine jars), and so all those samples felt overwhelming.

However, I have LOTS of expertise with faces. I spend most of my days interacting with them and the personalities behind them.

My inherent expertise with faces meant that 1000 busts felt fun and interesting, whereas 1000 helmets filled me with boredom and dread.

Classroom Implications

I said above that our teacherly expertise makes it difficult for us to spot our students’ working memory struggles.

For that reason, I think we should always look out for the working-memory overload that we all experience.

Driving to a new location in a rental car? Wondering where the rear defrost button is, and when to turn left? Could be working memory overload…

Navigating a new cafeteria, trying to find the silverware and the beverages and the gluten-free options? The salad dressing is where again? Yup: working memory overload…

Too many options when you’re trying to choose a hotel on that website? Perhaps you’re furious about all those helpful pop-ups? You know the feeling…

In brief: the better we get at recognizing working-memory problems in our own lives, the better we may become at spotting the problems our students are likely to have.

Empathy may be the pathway to understanding.

And, that empathy just might help us teach better.

Earworms and Sleep: What Will They Research Next?
Andrew Watson

Just last week, I spoke with middle- and upper-school students about learning.

Student lying in bed listening to music on earphones

We all know — and these students certainly know — that learning is hard. So, does cognitive science have any practical suggestions to help them study and learn?

Yes, reader, it does.

We know that retrieval practice helps learning much more than simple review.

We know that multitasking really isn’t a thing.

And, we know that exercise and sleep really help learning.

This last point — the importance of sleep — can be tricky.

After all, students say that they don’t have time to sleep — they have too much homework.*

Several students asked me: “I’m having trouble falling asleep. What do you suggest?”

In the moment, I suggested having a routine. Go to bed at the same time every night (as much as possible).

But, just a few days ago, a new study came across my desk…

Music and Sleep

I’ve often written about Dr. Michael Scullin’s research (for instance, here and here). He typically researches really practical questions. And, he studies and writes about them in unusually clear ways.

So, I’m a fan.

His most recent study looked at an unexpected topic: earworms.

You know: those infuriating tunes that get stuck in your head.

You just can’t get rid of them. (No, I’m not going to mention a song about very young scary fish that have huge teeth and eat seals and occasionally terrorize people. “Doo doo doo doo doo doo.”)

What effect do earworms have on sleep?

Questions and Answers

Research into sleep can get quite technical. We start talking about “spindle detection” and “polysomnography” and “frontal slow oscillation activity.”

Rather than go into the details, I’ll offer a quick summary of the conclusions:

First: survey results suggest that most people (87%!) think that listening to music will improve sleep (or, at least, not harm it).

However — a big however — people who reported listening to relatively more music also report relatively lower sleep quality.

Second: the same survey results suggest that “earworms” make up a big part of this problem.

That is: the more music I listen to, the more earworms I experience. And, the more earworms I experience, the worse I sleep.

YIKES.

Third: you might think that music with lyrics results in more earworms than music without lyrics. Scullin’s team, in fact, thinks that’s the “intuitive view.”

Well, as so often happens, our intuitions are wrong.

Believe it or not, people who listen to instrumental versions of popular songs have more earworms — and worse sleep — than those who listen to the songs themselves.

So, What To Do?

What advice should we be giving students about sleep — other than, “get at least 8 hours”?

Scullin’s team sums up their study this way:

There are few behaviors as prevalent in young adults as listening to music, and many regularly listen to music as part of their bedtime routine. Listening to music feels relaxing, but familiar and repetitive music can trigger involuntary musical imagery that worsens sleep quality and daytime functioning.

In other words: to reduce earworms and sleep better, don’t listen to music before going to sleep. And, instrumental versions of popular songs seem to be especially likely to generate earworms.

I can’t believe I’m typing this, but: Listener beware!


* When students say to me, “I can’t sleep, I have too much homework,” I say, “Let’s think about this:

‘Homework’ is anything that helps you learn more.

Sleep helps you learn more.

Therefore, sleep is homework.

Do your sleep homework, and you will learn more.”


Scullin, M. K., Gao, C., & Fillmore, P. (2021). Bedtime music, involuntary musical imagery, and sleep. Psychological Science, 32(7), 985-997.

“No Cameras Allowed:” Does Taking Pictures During Lectures Benefit Learning?
Andrew Watson

Should students use cameras to take pictures of boardwork?

My high school students know my fierce anti-cell-phone policy. Nonetheless, they do occasionally ask if they may take a quick picture. (I typically say yes, and then check to be sure the phone goes back in the bag.)

When I do PD work at schools, or present at conferences, teachers take pictures of my slides ALL THE TIME.

Of course, the fact that students and teachers want to take those pictures doesn’t automatically mean that it’s a good idea to do so.

In fact, we have several reasons to think it’s a bad idea.

First reason: those photos might serve as a subtle hint to our brain’s memory systems: “you don’t need to remember this, because you’ve got a photo.”

Second reason: the act of taking a photo might distract students (and teachers) from the content we’re discussing.

For example: If my students are thinking about framing the photo correctly (and using a cool filter), they’re NOT thinking about the ways that Fences combines both comic and tragic symbols.

Third reason: we’ve got research!

Check this out…

Prior Knowledge

Researchers have looked at this question for several years now.

In several studies, for instance, researchers asked participants to tour a museum and take pictures of various works of art.

Sure enough, later tests revealed that people remember more about the artwork they didn’t photograph than the artwork they did photograph.

As predicted above, something about taking a photograph made it harder – not easier – to remember the content.

For all these reasons, it seems, teachers might reasonably discourage students from taking photos.

At the same time, we should probably keep asking questions.

In particular, we should acknowledge that museum photography probably isn’t a good stand-in for classroom photography.

That is: my students (and teachers during PD) probably take photographs to help themselves remember important ideas, concepts, and examples. In museums, people might take pictures because that statue is so cool and beautiful!

The museum research offers a useful and interesting baseline, but we’d love some research into … say … actual classrooms.

Cheesemaking, and Beyond!

Well, I’ve got good news. A research team — led by Dr. Annie Ditta at the University of California, Riverside — has indeed started exploring exactly these questions.

In their studies, Team Ditta had students watch 3 short online video lectures about obscure topics. (Like, cheesemaking. No, I’m not joking.)

Participants took pictures of half of the slides.

Here’s the primary question: did students remember more information from the photographed slides, or the unphotographed slides?

SURPRISE! Taking pictures helped students remember the information on the slide.

For the reasons listed above, I did not expect that result. In fact, the researchers didn’t either.

But, those photos really helped.

In one study, students got 39% of the questions right for the slides they photographed, and 29% right for the ones they didn’t. (Stats folks: Cohen’s d was 0.41.)
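For stats-curious readers: Cohen’s d is simply the difference between two group means divided by their pooled standard deviation. The sketch below shows that calculation. The 39% and 29% means come from the study described above, but the standard deviations and group sizes are hypothetical, chosen only to illustrate how a d of roughly 0.41 could arise; the paper’s actual SDs and ns may differ.

```python
# Illustrative sketch of Cohen's d for two independent groups.
# Means (0.39 vs. 0.29) are from the study; the SDs and ns below
# are HYPOTHETICAL, picked only to demonstrate the formula.
from math import sqrt

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# With assumed SDs of 0.244 and 30 students per condition:
d = cohens_d(0.39, 0.244, 30, 0.29, 0.244, 30)
print(round(d, 2))  # ~0.41 under these assumed SDs
```

By convention, a d around 0.2 counts as “small,” 0.5 as “medium,” and 0.8 as “large” — so 0.41 is a respectable effect for such an easy intervention.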

Given how EASY this strategy is, we should really pay attention to this finding.

By the way, Dr. Ditta’s study explored some other questions as well.

First: students remembered info from photographed slides better both when they decided which slides to photograph and when they were told which ones to photograph.

So, if we tell students to “photograph this information,” we (probably) don’t disrupt the benefit.

Second: what about spoken information?

Common sense suggests that taking a picture won’t help students remember spoken ideas (if those ideas aren’t written on the slide). In fact, taking that picture might distract students from the spoken words.

Strangely, in this research, Team Ditta came up with mixed – and surprising – results. In one study, taking a picture made no difference in memory of spoken material. In the other, it benefitted memory of spoken material.

WOW.

So, What Should Teachers Do?

Before we rush to apply research in our classrooms, we always want to ask a few questions.

In this case, I think we should have LOTS of questions.

First: Dr. Ditta’s research looked at brief, online lectures for college students.

Do these conclusions apply to longer classes? To in-person classes? To K-12 students? To students who aren’t neurotypical?

We just don’t (yet) know.

Second: participants in these studies didn’t do anything with the photos. They simply took them.

Would we find the same pattern for students who reviewed their photos, compared to – say – reviewing their notes?

We don’t (yet) know.

Third: participants were tested on their knowledge 5 minutes after the videos were done.

We’ve got LOTS of research showing that short-term gains don’t necessarily result in long-term learning.

So, would these findings hold a week later? A month later?

We don’t (yet) know.

 

Given all the things we don’t know, how can this research benefit us?

For me, these studies open up new possibilities.

In the past, as I described above, I permitted students (and teachers) to take photos. But I tried to discourage them.

I would even – on occasion – explain all the reasons above why I thought taking photos would reduce learning.

Well, I’m no longer going to discourage.

Instead, I’ll explain the complex possibilities.

Perhaps taking photos helps memory because it signals that THIS INFORMATION DESERVES ATTENTION.

Or, perhaps taking photos helps only if students DON’T review before tests; taking notes might help more, especially for students who DO review.

And perhaps, just perhaps, this research team got flukey results because even well-done research sometimes produces flukey results. Future classroom research about taking photos of slides might ultimately suggest that (despite this thoughtful work), it really is a bad idea.

I wish the answer were simpler, but it just isn’t.

TL;DR

Surprising new research suggests that taking photos of lecture slides helps college students remember slide contents – even when students don’t review those photos.

Before we teachers rush to make dramatic changes, we should think carefully about how this research fits our classrooms and contexts.

And, we should weigh this memory strategy against lots of other strategies – like retrieval practice.

Finally: let’s all watch this space!


Ditta, A. S., Soares, J. S., & Storm, B. C. (2022). What happens to memory for lecture content when students take photos of the lecture slides? Journal of Applied Research in Memory and Cognition.

Soderstrom, N. C., & Bjork, R. A. (2015). Learning versus performance: An integrative review. Perspectives on Psychological Science, 10(2), 176-199.