The First Three Steps
Andrew Watson

Early in January, The Times (of London) quoted author Kate Silverton (on Twitter: @KateSilverton) saying:

It’s the schools that have the strictest discipline that have the highest mental health problems.

Helpfully, they include a video recording of her saying it.

In context, Silverton is saying — in effect — that schools’ strict disciplinary policies damage students’ mental health.

If she’s right, school leaders should know that!

If we run schools with strict disciplinary policies, we should at least consider changing them. Obviously, we don’t want to cause mental health problems.

But … is she right?

This specific question leads to a broader question:

When someone says “research says you should change the way you run your school!,” what should we do next?

Accept the advice? Reject it? Flip a coin?

Let me suggest three simple steps.

Step 1: Ask for Sources

This advice seems too obvious to say out loud.

OF COURSE someone reading this blog would ask for sources.

However, in my experience, we’re very hesitant to do so. It seems — I don’t know — rude, or pushy, or presumptuous.

Especially when the research comes from psychology or neuroscience, we just don’t want to seem stubborn.

But, trust me, it’s always appropriate to ask for the research.

In this case, happily, lots (and lots) of people did ask Silverton for research.

This small niche of edutwitter lit up with people asking — quite simply — “what research suggests that strict school discipline damages mental health?” (To be clear, it also lit up with people praising Silverton for speaking out.)

Even more happily, she responded by citing 11 research studies.

Her transparency allows us to ask a second question:

Step 2: Does the Research, in fact, Support the Claim?

Here again, the question seems too obvious to raise. Who would cite research that doesn’t support the claim they make?

I’m here to tell you: it happens all the time. (I wrote about a recent example here.)

In this case, teacher/researcher/blogger Greg Ashman looked at those sources. (You can read the article he wrote here, although you might have to subscribe to his substack to do so.)

So, does the research support the claim?

Amazingly, most of the cited studies don’t focus on students’ mental health.

That’s right. To support the claim that “strict discipline harms mental health,” Silverton cites very little research about mental health. (Ashman has the details.)

Yes, we might make some guesses based on these studies. But, guesses aren’t research.

As Ashman writes:

it’s easy to accept that suspension and [expulsion] are associated with higher rates of depression without assuming suspension and [expulsion] are the cause.

So, DOES strict school discipline cause mental health problems? I don’t (yet) know of direct research on the subject.

This specific example about school discipline, I hope, emphasizes the broader point:

Simply by a) asking for research and b) giving it a quick skim, we can make better decisions about accepting or rejecting “research-based” teaching advice.

Step 3: Actively Seek Out Contradictory Information

Because humans are so complicated, psychology and neuroscience research ALWAYS produces a range of findings.

Even with something as well-supported as retrieval practice, we can find a few studies suggesting limitations — even (very rare) negative effects.

I thought of this truth when I saw a New York Times headline: Cash Aid to Poor Mothers Increases Brain Activity in Babies, Study Finds.

This blog is about brain research, not politics. At the same time, this brain research might be cited to support a policy proposal.

So: what should we do when we see brain research used this way?

Step 1: “Ask for sources.” Good news! The sources are quoted in the article.

Step 2: “Does the research, in fact, support the claim?”

Sure enough, the researchers conclude

“we provide evidence that giving monthly unconditional cash transfers to mothers experiencing poverty in the first year of their children’s lives may change infant brain activity.”

Step 3: “Actively seek out contradictory information.”

Because this claim made the front page of the Times, I kept an eye out for responses, both pro and con.

Just a few days later, I found this tweet thread. In it, Dr. Stuart Ritchie points out some real concerns with the study.

For instance: the authors “pre-registered” their study. That is, they said “we’re going to measure variables X, Y, and Z to see if we find significant results.”

As it turns out, they found (small-ish) significant results in P, Q, and R, but not X, Y, and Z.

As Ritchie notes, P, Q, and R are certainly interesting. But:

This is a clear example of hype; taking results that were mainly null and making them into a huge, policy-relevant story. [The research] is a lot more uncertain than this [Times article implies].

To be very clear: I’m not arguing for or against a policy proposal.

I am arguing that when someone says “brain science shows!,” we should ask questions before making big changes.

TL;DR

When people cite brain research to encourage you to teach differently…

… ask for sources,

… confirm they support the recommendation,

… seek out contradictory points of view.

Our students benefit when we follow those three simple steps.

A “Noisy” Problem: What If Research Contradicts Students’ Beliefs?
Andrew Watson

The invaluable Peps Mccrea recently wrote about a vexing problem in education: the “noisy relationship between teaching and learning.”

In other words: I can’t really discern EXACTLY what parts of my teaching helped my students learn.

Was it my content knowledge?

The quality of my rapport with them?

The retrieval practice I require?

The fact that they slept and ate well in the days before class?

Some combination of all these variables?

Because I don’t know EXACTLY which teaching variable helped (or hurt) learning, I struggle to focus on the good stuff and eliminate the bad stuff.

I thought about Mccrea’s wisdom when I read a recent study about interleaving.

Here’s the story…

Interleaving 101

Frequent blog readers know all about interleaving, a way of organizing students’ practice.

Let’s say I teach my students about parts of speech.

Once they have a basic understanding of each one, I could have them practice each part of speech on its own.

That is: they identify nouns on Monday, adverbs on Tuesday, prepositions on Wednesday, and so forth.

Researchers call that structure “blocking” — as in “blocks of homework focusing on individual topics.”

Or, I could have my students jumble several topics together every night.

That is: Monday night they practice nouns, adverbs, and prepositions. Tuesday they practice verbs, prepositions, and conjunctions. Wednesday: nouns, verbs, and adjectives.

The total number of practice problems would remain the same, but they’d practice several parts of speech all together.

Researchers call this system “interleaving” — as in “weaving together several different topics.”
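The two schedules can be sketched in a few lines of code. This is a minimal illustration with hypothetical topics and item counts (nothing from the study itself): both schedules contain exactly the same practice items, and only the ordering differs.

```python
import random

TOPICS = ["nouns", "verbs", "adverbs", "prepositions"]
ITEMS_PER_TOPIC = 3  # hypothetical count, for illustration only

def blocked_schedule(topics, n):
    # "Blocking": practice one topic at a time, in uninterrupted blocks.
    return [topic for topic in topics for _ in range(n)]

def interleaved_schedule(topics, n):
    # "Interleaving": the same pool of practice items, jumbled together.
    pool = blocked_schedule(topics, n)
    random.shuffle(pool)
    return pool

blocked = blocked_schedule(TOPICS, ITEMS_PER_TOPIC)
mixed = interleaved_schedule(TOPICS, ITEMS_PER_TOPIC)

# Same total practice, different order.
assert sorted(blocked) == sorted(mixed)
```

The point the sketch makes concrete: interleaving changes nothing about *what* students practice, only *when* each item shows up.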

Measuring Success

Of course, teachers want to know: does interleaving work? Do students who interleave their practice learn more than students who block?

Let’s imagine two ways of answering that question:

Strategy #1: ask the students.

Obviously.

Who knows more about the students’ learning than the students themselves?

Strategy #2: measure their learning.

Obviously.

If students who block consistently remember more than students who interleave (or vice versa), then we have a winner.

So, what’s the answer?

Answers, and Vexing Questions

According to Samani and Pan’s 2021 study, strategy #1 yields a clear answer: students say that interleaving is harder and results in LESS learning.

Of course, that means they think that blocking is easier and results in MORE learning.

Alas, strategy #2 arrives at a contradictory result.

When we measure students’ actual learning, they remember more after interleaving than blocking.

Samani and Pan’s study gets this result. And, LOTS AND LOTS of research gets the same result. (See Agarwal and Bain’s book for a great review of the research.)

In other words, this study points to an especially “noisy” part of the relationship between teaching and learning.

Students genuinely think and believe that interleaving interferes with learning.

However, interleaving in fact promotes learning.

How do we handle this quandary?

Tentative Solutions

In my high-school classroom, we do A LOT of retrieval practice.

Almost every day, I fire off questions and ask students to attempt an answer.

Sometimes I call on raised hands; or cold call; or have students write answers in their notebooks (I circle the room to check their responses). They might write on the board; they might consult in pairs.

I’m entirely comfortable using retrieval practice — and so are my students — because on the second day of class I showed them research about retrieval practice.

I told them:

This might feel hard at first.

But, trust me. It feels hard because your brain is working harder. And that means you’re learning more.

It’s like going to the gym. You don’t gain muscle by picking up paper clips. You gain muscle by picking up heavy things. Hard work leads to better fitness.

The same rule applies here. Retrieval practice is harder, so you’ll learn more.

Since that day, I stop every now and then at the end of an RP session and say: “Do you feel how much you’ve learned? Do you see how much retrieval practice is helping?”

In fact (I swear I am not making this up), one of my sophomores once said: “Thank you, Mr. Watson, for making us do retrieval practice every day.”

I tell this story because it applies to interleaving as well.

I’ve been interleaving all year, but I haven’t (yet) explained it to my students. I plan to do so this upcoming week (or next).

My hope is: they’ll see why we’ve been bouncing back and forth from topic to topic in ways that might seem random or disorganized.

We’ve been interleaving all along.

I offer this solution as “tentative” because my context might not match yours.

For instance, if you teach younger or older students, they might not respond as mine do.

If you teach students with diagnosed learning differences, interleaving might not benefit them as much.

And so forth.

As always: consider the research findings, consider my experience, and then use your own best judgment to fit them into your classroom practice.

TL;DR

If students’ beliefs contradict research, I myself tell them about the research — graphs and all. And then I ask them to trust me.

Retrieval practice and interleaving really do work. My students know about this research pool. So far, they’re on board.

If you try this strategy, or another one, I hope you’ll let me know about your own experience.


Samani, J., & Pan, S. C. (2021). Interleaved practice enhances memory and problem-solving ability in undergraduate physics. NPJ Science of Learning, 6(1), 1–11.

The Goldilocks Map by Andrew Watson
Erik Jahner, PhD

The Goldilocks Map: A Classroom Teacher’s Quest to Evaluate ‘Brain-Based’ Teaching Advice is an entertaining and eye-opening conversation that seeks to help the reader develop a way of thinking that is sorely missing in today’s discourse around teaching and the brain. It is often stated that we need to be critical consumers of brain-based research as we apply it to the classroom; this book gives a roadmap showing us how. Andrew Watson takes us on this “quest” that reflects his 16 years of teaching experience and subsequent “Mind, Brain, Education” degree. The coaching in this book is an essential introduction for the developing teacher, the experienced teacher seeking to develop their understanding, as well as the experienced researcher who could always use a course in effective translation and writing. The experience Watson offers is delightful for all.

Andrew Watson embeds this search for understanding of the neuroscience and psychology of education in a playful and humorous narrative. For some readers, embedding neuroscience in the quests of Aladdin, Goldilocks, and Middle Earth may be off-putting. But seriously, you need to relax a bit and enjoy. In fact, accepting this narrative style is an essential element in disarming our pretentious mindsets and allows one to approach this field with an authentic search for understanding and intellectual transparency while still embracing the simple joys of good storytelling.

The book is not an encyclopedic rehashing of implications of neuroscience for education, but it fills an important gap.  Through a series of deep dives into themes such as environmental enrichment, spaced learning, and music in education, the reader is coached on how to locate, evaluate, and communicate research around these topics and more. As someone who regularly translates between neuroscience and education, I found the book refreshing and very useful.

One of the book’s greatest strengths is its attention to language use in research and translation. Watson highlights the word use and phrasing used by advocates for neuroeducation and calls our attention to some of the ridiculousness in original publications as well as our subsequent attempts to explain this research to colleagues. However, he does not diminish the research but elevates it by revealing the intention behind published words, making the research more accessible. Without careful attention, we may catch ourselves and our peers exercising some common missteps by using language to obfuscate our lack of understanding or to add gravitas to otherwise empty phrases. I guarantee that you will humbly find your own words reflected in these pages and gain strategies to communicate more effectively.

Watson also takes us on an active quest of discovery rather than seeking our passive acceptance of research and application. Each chapter empowers the reader, as a member of the mind, brain, education community, to engage that community with a sense of exploration. Teachers are not simply consumers of research; the translation they enact brings their expertise to bear in acts of community involvement that make this research living. In my opinion, researchers are too often placed on pedestals, and some researchers hide in their ivory towers of academia. Here we have the tools to pull this community together and flatten the illusion of a hierarchy.

There are also plenty of unanticipated “gems” in this book that will inspire you to take a moment to go on your own exploratory journey to accompany the pages. I found myself on many occasions pulling up a suggested web resource and learning something new or exploring an article I previously read out of pure curiosity inspired by these pages. I frequently jotted down particularly important turns of phrase and thought experiments that I could put to immediate use in my own scientific practice to not only make my work easier to understand for others but also to help make my own goals transparent to me.

This intellectual, entertaining, and often humorous engagement with the field is just what we all needed – useful as an introduction and useful to get us back on track.

Teaching with Images: Worth the Effort?
Andrew Watson

According to Richard Mayer’s “multimedia principle,”

People learn better from words and pictures than from words alone.

If that’s true, then we should — obviously — be sure to include pictures in our teaching.

However…

Whenever we see a broad principle like that, we should always look for specific limitations.

That is…

… does this principle apply to kindergarteners as well as 5th graders and adult learners?

… does it apply for students with an ADHD diagnosis?

… is it true when teaching Civil War history, theorems about similar triangles, and bunting strategies?

And so forth.

Researchers call such limits “boundary conditions,” and we should ALWAYS look for boundary conditions.

So, let’s look at that broad principle (“pictures + words” > “words”) and ask this boundary question:

Does the content of the picture matter?

Possibilities and Perils

Happily, one of the people asking that question is…Richard Mayer himself.

In his career, he’s come up with a whole suite of useful principles. And, he spends lots of time looking for boundary conditions.

Specifically, in a usefully straightforward study, he and Eunmo Sung study several different kinds of images:

Instructive images: “directly relevant to the instructional goal.”

I’m teaching Macbeth right now, and focusing on the play’s tension between order and chaos. So, I might show students a picture of Scotland’s craggy wildernesses (chaos) and one of a highly structured royal ceremony (order).

Seductive images: “highly interesting but not directly relevant to the instructional goal.”

A movie version of Macbeth — starring Denzel Washington and Frances McDormand — just came out. I could show my students a picture of these two movie stars on the Red Carpet at an Oscar ceremony.

Decorative images: “neutral but not directly relevant to the instructional goal.”

Macbeth can be a grim play: so much beheading, so much unseaming. So: I could include pictures of waterfalls and sunrises on my handouts to raise my students’ spirits a bit.

Once we start exploring these potential boundary conditions — perhaps not all images benefit learning equally — we might get even more useful guidance about combining words and images.

Predictions and Results

Sung and Mayer measured the effects of such images on students’ learning AND on their enjoyment of a lesson.

Take a moment to make some predictions on your own.

Which, if any, of those graphics will help students learn more?

Which, if any, will help students enjoy the lesson more?

[I’ll pause while you think about those questions.]


Perhaps you, like Sung and Mayer, predicted that ALL the images would increase students’ enjoyment.

And perhaps you predicted that the INSTRUCTIVE images would help students learn, but not the others.

Sure enough, you and they were right. Students LIKE images, but LEARN FROM images that focus their attention on the learning goal. (If you’re interested in the specific numbers, look at the 6th page of the study.)

We should, I think, focus on this key finding: students do not always learn more when they enjoy a lesson more.

We shouldn’t deliberately make our lessons dull.

But: we shouldn’t assume that an enjoyable lesson necessarily results in more learning. In this case, those photos of Macbeth movie stars piqued my students’ curiosity and interest, but didn’t help them learn anything about the play.

Three Final Points

First: the benefits of dual coding have gotten lots of attention in recent years.

To get those benefits, we should remember these boundary conditions. Dual coding helps if — and only if — the images highlight the learning goal.

Second: a recent meta-analysis about “seductive details” nicely complements this study.

Third: Like many teachers, I see the good and the vile in Twitter.

Yes (YES!!), it can be a sink of repulsive yuckiness.

And (surprise!!), it can also be supportive and helpful.

I bring up this point because: a wise soul on Twitter mentioned this Sung & Mayer study recently, and reminded me of its importance.

I can’t remember who brought it up (I would credit that tweep if I did), but I’m grateful for the nudge.

Such useful research! Such helpful guidance!


Sung, E., & Mayer, R. E. (2012). When graphics improve liking but not learning from online lessons. Computers in Human Behavior, 28(5), 1618–1625.

Let’s Get Practical: How Fast Should Videos Be?
Andrew Watson

Research often operates at a highly abstract level.

Psychologists and neuroscientists study cognitive “tasks” that stand in for school work. If we’re being honest, however, we often struggle to see the connection between the research task and actual classroom learning.

HOWEVER…

Every now and then, a study comes along that asks a very practical question, and offers some very practical answers.

Even better: it explores the limits of its own answers.

I’ve recently found a study looking at this (incredibly practical) question:

Because students can easily play videos at different speeds, we need to know: which video speed benefits learning the most?

So: what advice should we give our students about learning from videos?

Exploring The Question

Let’s start with a specific example:

If a student watches a video at double speed, she (obviously) spends only half as much time mentally interacting with its information.

Does that reduction in time lead to an equal reduction in learning? Will she learn half as much as if she had watched it at regular speed?

Dr. Dillon Murphy starts with that question, and then quickly gets interested in crucial related questions:

What about other video speeds? That is: what about watching the video at 1.5x speed? What about 3x speed?

Does the topic of the video matter?

And, here’s a biggie: what should students do with the time they save?

Even before we look at the results of this study, I think we can admire its design.

Murphy’s team ran multiple versions of this study looking at all these different variables (and several others).

They did not, in other words, test one hypothesis and then — based on that one test — tell teachers what to do. (“Best practices require…”)

Instead, they invited us into a complex set of questions and possibilities.

Maybe 1.5x is the most efficient speed for learning.

Maybe 3x is the best speed if students use the time they saved to rewatch the video.

Maybe regular speed is best after all.

Because Murphy’s team explores so many possibilities with such open-minded curiosity, we have a MUCH better chance of figuring out which results apply to us. *

The Envelope Please

Rather than walk you through each of the studies, I’ll start with the study’s overall conclusions.

First: watching videos at higher speeds does reduce learning, but not as much as you might think.

That is: spending half as much time with the video (because a student watched it at double speed) does NOT result in half as much learning.

To be specific: students watched ~14-minute videos (about real-estate appraisals, or about Roman history).

A week later, those who watched them at regular speed scored a 59% on a quiz. Those who watched at 2x speed scored a 53%.

59% is higher than 53%, but it’s not twice as high. **

Second: students can use that “saved” time productively.

What should a student do with the 7 minutes she saved? She’s got two helpful choices.

Choice 1: rewatch the video right away.

Students who used their “saved” time to rewatch the video right away recaptured those “lost” points. That is: they had the same score as students who watched the video once at regular speed.

Choice 2: bank the time and rewatch the video later.

In another version of the study, students who watched the 1x video once scored a 55% on a quiz one week later.

Other students watched the 2x video once, and then once again a week later. They scored a 63% on that quiz. (For stats types, the d value is 0.55 — a number that gets my attention.)

In other words: rewatching at double speed a week later leads to MORE LEARNING in THE SAME AMOUNT OF TIME (14 minutes).
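For readers curious about that d value: Cohen’s d expresses the difference between two group means in units of their pooled standard deviation. Here is a minimal sketch; the standard deviations and group sizes below are hypothetical placeholders (Murphy’s paper reports the resulting d, not these inputs), while the means echo the 63% and 55% quiz scores above.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    # Pooled standard deviation across the two groups,
    # then the mean difference expressed in those units.
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / math.sqrt(pooled_var)

# Hypothetical inputs: rewatch-at-2x group vs. watch-once-at-1x group.
d = cohens_d(mean1=63, sd1=15, n1=100, mean2=55, sd2=14, n2=100)
print(round(d, 2))  # → 0.55, conventionally a "medium" effect size
```

By Cohen’s rough conventions, 0.2 is a small effect, 0.5 medium, and 0.8 large — which is why a d of 0.55 “gets my attention.”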

Practical + Practical

Murphy takes great care to look at specific combinations.

His example encourages us to take care as well. For instance:

His team worked with college students. Will this result hold for 8th graders, or 2nd graders?

You can look to your teacherly experience and judgment to answer that question.

Will this effect hold for longer videos: 30 minutes, or one hour?

We don’t know yet.

These videos included a talking head and slides with words — but not closed captions. Will some other combination (no talking head? closed captions on?) lead to different results?

We don’t know yet.

In other words: Murphy’s study gives us practical guidance. We should use our judgment and experience to apply it to our specific teaching circumstances.


* I should note: This study is unusually easy to read. If the topic interests you, you might look it over yourself.

** Important note: I’ve seen news reports about this study saying that watching once at double speed results in the same amount of learning as watching once at regular speed. That claim is untrue. And: Murphy’s study does not make that claim.

Murphy, D. H., Hoover, K. M., Agadzhanyan, K., Kuehn, J. C., & Castel, A. D. (2021). Learning in double time: The effect of lecture video speed on immediate and delayed comprehension. Applied Cognitive Psychology.

The Benefits of Direct Instruction: Balancing Theory with Practice
Andrew Watson

When teachers hear that “research shows we should do X,” we have at least two broad questions:

First Question: what’s the research?

Second Question: what EXACTLY does X look like in the classroom?

People who have the expertise to answer the first question (researchers) might not have the K-12 classroom experience to answer the second question.

And, of course, people who can make it work in the classroom (teachers) might not know or understand the research.

Wouldn’t it be great if we could find one book that answers both sets of questions?

In fact, it would be especially great if that book focused on a controversial topic. In that case, we could see a complete argument – both the why and the how – before we make a judgment about the controversy.

Does that sound tempting? I have good news…

Embracing Controversy

A feisty battle has raged in edu-circles for many years now: “direct instruction” vs. “constructivist pedagogy.” *

In one corner, “constructivists” argue that problems or projects or independent inquiries help students discover and build enduring understanding. And, such exploration fosters authentic motivation as well.

In the other corner, “direct instruction” advocates argue that working memory limitations sharply constrain students’ cognitive workspace. For that reason, teachers must explicitly shape learning experiences with small steps and carefully-designed practice.

Both approaches can be – and frequently are – parodied, misunderstood, and badly practiced. So, a book explaining the WHY (research) and the HOW (classroom practice) would be greatly helpful.

Sage on the Page

Adam Boxer teaches chemistry at a school in London, and has been blogging about his work for some time now. (If you follow our twitter account, @LearningandtheB, you’ve seen links to his work before.)

In his book Explicit & Direct Instruction: An Evidence-Informed Guide for Teachers, Boxer gathers eleven essays that explain the research background and then get SUPER specific with classroom suggestions.

In the first chapter, Kris Boulton tells the history of “Project Follow Through,” a multi-decade program to discover the best way of teaching children.

Researchers tracked more than 200,000 children in 13 different programs over several years, and compared their learning across three dimensions: basic skills, cognitive skills, and affective skills.

Which approach proved most effective?

Direct Instruction, created by Siegfried Engelmann.** It was, in fact, the only program of the 13 that benefitted students in all three dimensions.

When advocates of Direct Instruction (and direct instruction) insist that research shows its effectiveness, they reasonably enough point to Project Follow Through. (Can others critique this study? Of course…)

Both Boulton and Greg Ashman (in the second chapter) then emphasize the alignment of direct instruction with psychology models: cognitive load theory, schema theory, and so forth.

In brief: we’ve got LOTS of research explaining why direct instruction should work, and showing that it does work.

Let’s Get Practical

After Boulton and Ashman explain the why, the next several chapters deliver on the classroom how.

For me, the book’s great success lies in the number, variety, and specificity of these chapters.

What does direct instruction look like for teaching math?

How about science?

How about writing?

What’s the best number of examples to use?

And so forth.

I especially enjoyed Sarah Cullen’s chapter on fading. Cullen begins with an important question/critique:

How, then, can a teaching method that so depends on instruction – on teachers leading learning and controlling the content to which pupils are exposed – foster autonomy?

Her answer focuses on having scaffolds and removing scaffolds – aka, “fading.”

In particular, Cullen wisely conceptualizes fading over many different time spans: fading across grades (which requires planning across years), fading within a term’s curriculum (requiring planning across months), and fading within a lesson (requiring skill, insight, and practice).

Like the book’s other chapters, Cullen’s offers many specific examples for each of her categories. In other words, she grounds theoretical understanding in highly specific classroom realities.

In Brief

If you already think direct instruction sounds right, you’ll be glad to have a how-to guide.

If you think it sounds suspect (or even oppressive), you’ll be glad to read a straightforward explanation of the research behind the approach. (You might not be persuaded, but you’ll understand both sides of the argument more clearly.)

And, if you want realistic classroom examples explained with loving detail, this book will launch 2022 just right.


* I’ve put those labels in quotation marks because both are familiar, but neither one really works.

** Direct Instruction (with capital letters) is the name of Engelmann’s specific program. On the other hand, direct instruction (without capital letters) is a broader approach to thinking about teaching and learning.