
About Andrew Watson

Andrew began his classroom life as a high-school English teacher in 1988, and has been working in or near schools ever since. In 2008, Andrew began exploring the practical application of psychology and neuroscience in his classroom. In 2011, he earned his M.Ed. from the “Mind, Brain, Education” program at Harvard University. As President of “Translate the Brain,” Andrew now works with teachers, students, administrators, and parents to make learning easier and teaching more effective. He has presented at schools and workshops across the country; he also serves as an adviser to several organizations, including “The People’s Science.” Andrew is the author of “Learning Begins: The Science of Working Memory and Attention for the Classroom Teacher.”

Don’t Hate on Comic Sans; It Helps Dyslexic Readers (Asterisk)
Andrew Watson

People have surprising passions.

Some friends regularly announce that the Oxford comma is a hill they’re ready to die on. (I’m an English teacher, and yet I wonder: you’re willing to die over a punctuation mark?)

With equal energy and frequency, Twitter and Facebook resonate with mockery of the typeface Comic Sans. (Again, it’s a typeface. Why all the pique?)

Comic Sans mockery, however, often earns this earnest rebuttal:

“Comic Sans helps dyslexic readers, who struggle with other fonts. Comic Sans isn’t dreadful; it’s essential!”

I’ve read this statement so often that I simply assumed it was true. Would Twitter lie?

Just Checking…

I have, in fact, seen the claim that “Comic Sans benefits dyslexic readers” twice this week.

However, I’ve started to notice a curious silence: no one cites specific research to back up that claim.

So, I thought I’d find it for myself.

Long-time readers know my routine. I surfed over to Google Scholar, and searched the terms “dyslexia” and “font.” And, to be on the safe side, I also searched “dyslexia” and “comic sans.”

From there, I used Scite.ai and Connectedpapers.com to follow up on my findings.

The results surprised me, so I thought I’d pass them along.

Does Comic Sans Benefit Dyslexic Readers?

I don’t know.

More precisely, I can’t find research that explores that question directly.

When I did the searches described above, I found several studies that seemed promising. And yet, when I looked at the specifics, I found that the researchers hadn’t explored exactly this question.

For instance:

Several studies cite the British Dyslexia Association style guide as their source for this recommendation.

That guide does recommend Comic Sans (and other sans serif fonts, including Arial). However, it doesn’t cite any research to support that claim.

Hmmmm.

This study, helpfully called “Good Fonts for Dyslexia,” does indeed ask 48 dyslexic readers to study passages in different fonts. It asks exactly the question we’re trying to answer.

However, this research team didn’t include Comic Sans among the fonts they studied.

They do recommend Helvetica, Courier, Arial, Verdana, and CMU for dyslexic readers. But they have no recommendation one way or the other about Comic Sans.

Double hmmmmm.

Most of the studies I found focus less on font and more on web design. (And, the most common font-related conclusion I found is: fonts designed to benefit dyslexic readers don’t.)

At this point, I simply don’t have a research-based answer to this question.

To Be Clear…

This search genuinely surprised me. Given the frequency of the claim — just google it! — I assumed I’d find a robust research pool.

But, no.

Given the potential for controversy here, I want to answer some likely questions:

“Are you saying Comic Sans DOESN’T help dyslexic readers?”

No. I’m saying I can’t find a research-based answer either way.

“If you’re not an expert in dyslexia, how can you be so sure?”

Honestly, I’m not sure. I’m usually fairly skilled at finding the research basis behind educational claims. (Heck, I wrote a book about doing so.) But in this case, I simply couldn’t discover a convincing answer to the question.

“Look, this research right here shows that Comic Sans does help!”

AWESOME! Please share it with me so I can write a follow-up post.

“My student/child/colleague tells me that Comic Sans helps a lot.”

That kind of individual experience is useful and important. I hope that researchers explore this question, so we can know with greater confidence whether or not it helps most dyslexic readers.

“How long did you look?”

Maybe an hour, spread out over two days. I certainly could have missed something. I hope you’ll let me know if you’ve got a study that looks at this possibility.

TL;DR

You might have heard that Comic Sans helps dyslexic readers; you might have heard that “research says so.”

Those claims might be true, but I haven’t (yet) found research supporting them. If you know of that research, please send it my way!

Perspectives on Critical Thinking: Can We Teach It? How Do We Know?
Andrew Watson

Imagine the following scenario:

A school principal gathers wise cognitive scientists to ask a straightforward question…

“Because critical thinking is an essential 21st century skill, we know our students need to develop critical thinking skills. If we want to create a school program or a class or a curriculum to foster critical thinking, what guidance can you give us?”

Happily, we don’t have to imagine. At last week’s Learning and the Brain conference in New York, I asked a distinguished group of cognitive psychologists* exactly that question.

The resulting conversation offered practical suggestions, provocative assertions, and a surprising amount of humor.

I’ll try to summarize that half-hour conversation here.

On the One Hand…

Let’s start at one end of the spectrum, with the most optimistic ways to answer the question:

First: we know what critical thinking is.

Dr. Lindsay Portnoy, for instance, considers critical thinking the ability to support claims with evidence and reason.

If I claim that “the earth orbits the sun,” I should be able to cite evidence supporting that claim. And I should be able to explain the logical process I use to make conclusions based on that evidence.

Dr. Ben Motz agrees with that foundation, and adds an important step: critical thinkers recognize and avoid logical fallacies.

A comprehensive list of logical fallacies goes on for pages, but critical thinkers typically question their own beliefs aggressively enough to avoid the most common mistakes.

Second: we know how to foster critical thinking.

The specifics of an answer probably vary by age and discipline. However, we’ve got specific curricular strategies to help us foster critical thinking among students.

Dr. Laura Cabrera, with this truth in mind, offers a specific bit of advice: start early.

If we want students to grow as critical thinkers, we shouldn’t wait until their sophomore year in high school. Kindergarten would be a fine place to start.

On the Other Hand…

All these optimistic answers, however, quickly give way to grittier – perhaps more realistic – assessments of the situation.

First: because critical thinking is so complicated, no precise definition holds true in a broadly useful way. In other words – politely speaking – we can’t exactly define it.

In cognitive psychology terminology, as Dr. Derek Cabrera put it, “critical thinking has a construct validity problem.” In fact, the five psychologists on the panel – professors all – don’t agree on a definition.

Second: This definition problem has terrible implications.

If we can’t define critical thinking, broadly speaking, then we can’t determine a consistent way to measure it.

And if we can’t measure it, we have no (scientific) way of knowing if our “critical thinking program” helps students think critically.

Third: In fact, if we can’t measure students’ critical thinking skills right now, we might not realize that they’re already good at it.

Dr. Dan Willingham – author of the well-known Why Don’t Students Like School? – made this point at the beginning of our conversation.

“Why,” he asked, “do you think your students have a critical thinking problem? What measurement are you using? What do you want them to do that they can’t do?”

In other words: it’s not obvious we should start a critical thinking program. Because we can’t measure students’ abilities, we just don’t know.

Dr. Derek Cabrera made this point quite starkly: “My advice about starting a critical thinking program is: don’t.”

Don’t Start Now

Even if we could measure critical thinking, as it first seemed we could, teachers might not want to give it disproportionate attention.

Fourth: some panelists doubt that critical thinking is any more important than many (many) other kinds of thinking – creative thinking, interdisciplinary thinking, systems thinking, fuzzy logic…the list goes on.

Dr. Portnoy, for instance, champions good old-fashioned curiosity. If students ask the right questions (critical or otherwise), they’re doing good thinking and learning.

Why, then, would it be bad if they aren’t doing critical thinking, narrowly defined?

The Cabreras, indeed, argue that students trained to think critically often get too critical. They stamp out potentially good ideas (that spring from imaginative thinking) with all their skills at critical thinking.

Fifth: opportunity cost.

Schools already have far too much to do well, as Dr. Willingham frankly pointed out.

If we plan to add something (a critical thinking program/curriculum), we should know what we plan to take out.

And, we should have a high degree of confidence that the new program will actually succeed in its mission.

If we remove a program that does accomplish one goal and replace it with one that doesn’t, our efforts to improve schools will – paradoxically – have deprived students of useful learning.

Making Sense of the Muddle

All these points might seem like bad news: we (perhaps) don’t know what critical thinking is, and (perhaps) shouldn’t teach it even if we did. Or could.

That summary, I think, overlooks some important opportunities that these panelists highlighted.

Dr. Motz offers specific ways to define critical thinking. His talk at the conference, in fact, focused on successful strategies to teach it.

Even better: he wants teachers to join in this work and try it out with their own students.

The question we face, after all, is not exactly “can I teach critical thinking — generally — to everyone?”

It is, instead: “can I teach critical thinking — defined and measured this way — to my students?”

If the answer to that question is “yes,” then perhaps I should make room for critical thinking in my students’ busy days.

Made wiser by these panelists’ advice, I know better how to define terms, measure outcomes, and balance several thinking skills (including curiosity!).

When researchers’ perspectives on critical thinking help us think critically about our teaching goals, we and our students benefit.


* The panelists: Dr. Derek Cabrera, Dr. Laura Cabrera, Dr. Benjamin Motz, Dr. Lindsay Portnoy, Dr. Dan Willingham.

Do Classroom Decorations Distract Students? A Story in 4 Parts…
Andrew Watson

Teacher training programs often encourage us to brighten our classrooms with lively, colorful, personal, and uplifting stuff:

Inspirational posters.

Students’ art work.

Anchor charts.

Word walls.

You know the look.

We certainly hope that these decorations invite our students in and invigorate their learning. (We might even have heard that “enriched environments promote learning.”)

At the same time, we might worry that all those decorations could distract our students from important cognitive work.

So, which is it? Do decorations distract or inspire? Do they promote learning or inhibit learning? If only we had research on this question…

Part I: Early Research

But wait: we DO have research on this question.

Back in 2014, a team led by Dr. Anna Fisher asked if classroom decorations might be “Too Much of a Good Thing.”

They worked with kindergarten students, and found that — sure enough — students who learned in highly decorated rooms paid less attention and learned less than others in “sparsely” decorated classrooms.

Since then, other researchers have measured students’ performance on specific mental tasks in busy environments, or in plain environments.

The results: the same. A busy visual field reduced working memory and attention scores, compared to plain visual environments.

It seems that we have a “brain-based” answer to our question:

Classroom decorations can indeed be “too much of a good thing.”

Taken too far, they distract students from learning.

Part II: Important Doubts

But wait just one minute…

When I present this research in schools, I find that teachers have a very plausible question.

Sure: those decorations might distract students at first. But, surely the students get used to them.

Decorations might make learning a bit harder at first. But ultimately students WON’T be so distracted, and they WILL feel welcomed, delighted, and inspired.

In this theory, a small short-term problem might well turn into a substantial long-term benefit.

And I have to be honest: that’s a plausible hypothesis.

Given Fisher’s research (and that of other scholars), I think the burden of proof is on people who say that decorations are not distracting. But I don’t have specific research to contradict those objections.

Part III: The Researchers Return

So now maybe you’re thinking: “Why don’t researchers study this specific question?”

I’ve got good news: they just did.

In a recently published study, another research team (led by Dr. Karrie Godwin, a co-author of the 2014 study, and including Fisher) wondered if students would get used to the highly decorated classrooms.

Research isn’t research if we don’t use fancy terminology, so they studied “habituation.” As in: did students habituate to the highly decorated classrooms?

In the first half of their study, researchers again worked with kindergarteners. Students spent five classes studying science topics in plainly decorated classrooms. (The visual material focused only on the topic being presented.)

Then they spent ten classes studying science topics in highly decorated classrooms. (These decorations resembled typical classroom decorations: posters, charts, artwork, etc.)

Unsurprisingly (based on the 2014 study), students were more distractible in the decorated classroom.

But: did they get used to the decorations? Did they become less distractible over time? Did they habituate?

The answer: a little bit.

In other words: students were less distractible than they initially were in the decorated classroom. But they were still more distractible than in the sparsely decorated room.

Even after ten classes, students hadn’t fully habituated.

Part IV: Going Big

This 2-week study with kindergarteners, I think, gives us valuable information.

We might have hoped that students would get used to decorations, and so benefit from their welcoming uplift (but not be harmed by their cognitive cost). So far, this study deflates that hope.

However, we might still hold out a possibility:

If students partially habituate over two weeks, won’t they fully habituate eventually? Won’t the habituation trend continue?

Team Godwin wanted to answer that question too. They ran yet another study in primary school classrooms.

This study had somewhat different parameters (the research nitty-gritty gets quite detailed). But the headline is: this study lasted 15 weeks.

Depending on the school system you’re in, that’s between one-third and one-half of a school year.

How much did the students habituate to the visual distractions?

The answer: not at all.

The distraction rate was the same after fifteen weeks as it was at the beginning of the year.

To my mind, that’s an AMAZING research finding.

Putting It Together

At this point, I think we have a compelling research story.

Despite our training — and, perhaps, despite our love of decoration — we have a substantial body of research suggesting that over-decorated classrooms interfere with learning.

The precise definition of “over-decorated” might take some time to sort out. And, the practical problems of putting up/taking down relevant learning supports deserve thought and sympathetic exploration.

However: we shouldn’t simply hope away the concern that young students can be distracted by the environment.

And we shouldn’t trust that they’ll get used to the busy environment.

Instead, we should deliberately create environments that welcome students, inspire students, and help students concentrate and learn.


Fisher, A. V., Godwin, K. E., & Seltman, H. (2014). Visual environment, attention allocation, and learning in young children: When too much of a good thing may be bad. Psychological Science, 25(7), 1362–1370.

Godwin, K. E., Leroux, A. J., Seltman, H., Scupelli, P., & Fisher, A. V. (2022). Effect of repeated exposure to the visual environment on young children’s attention. Cognitive Science, 46(2), e13093.

A Little Help, Please…
Andrew Watson

I’ve got a problem, and I’m hoping you can help me.

Here’s the situation…

I work as a high school English teacher. And I’m also a consultant – presenting psychology and neuroscience research for teachers and students and parents.

In that consulting work, I often face this problem: teachers/parents/students believe – quite confidently – in some brain myth or another.

For instance:

When I talk with teachers about managing working memory load, I regularly get this question:

“Can we reduce working memory overload by aligning instruction with students’ learning style?”

When I talk about research into attention and distraction, I often hear this rejoinder:

“Yes, but: all the research shows that an enriched environment enhances learning.”

A discussion about student motivation often defaults to this baseline:

“Around here we remind students to have a growth mindset. That will get the job done.”

A comment about note-taking strategies prompts this response:

“Of course, we know from research that handwritten notes result in more learning than laptop notes.”

In these moments, how should I – the “outside expert” – respond?

We’ve Got Two Hands

On the one hand, I should – obviously – let them know they’re wrong.

First, because they are wrong (as far as research currently shows).

No: learning styles theories have not held up over time. We just don’t have good evidence to support them.

No: ‘enriched environment’ research doesn’t apply to schools. (It was initially done with rats; lots of research suggests that busy classrooms distract from learning. I tell this story in a recent book.)

No: mindset theory is not a slam dunk. This topic churns up lots of controversy, but my own view is…

…we’ve seen enough positive results to think something is going on there,

…and enough negative results to know we don’t have a good handle on the specifics yet.

And

No: the handwriting vs. laptop debate is nowhere near settled.

The second reason to highlight these errors: we don’t want their colleagues to believe these myths.

If I don’t contradict these false beliefs right away, they can easily propagate.

These two truths, however, face an ugly “on the other hand.”

On the Other Hand

When I speak up to contradict these myths, I’m pursuing two goals:

Change the mind of the person who made the comment, and

Encourage other listeners to adopt correct beliefs.

Here’s my awkward question:

does contradicting brain myths directly actually accomplish those goals?

Imagine I say:

“I’m so glad you’ve brought up learning styles. It turns out that the research just hasn’t supported this theory.”

Will the teachers who made those comments in fact change their minds?

Will others around them believe me?

Honestly, I’m not so sure…

A Compelling Backstory

Let’s ask this surprising question: why do people believe in learning styles?

Why do they believe that elaborate classroom decoration enhances learning, or that handwritten notes rule? Why do laptop notes receive so much confident hatred?

Almost certainly, teachers believe in these myths because some other consultant told them that “research says so.”

Or, they heard these myths at a conference touting “brain science!”

That is: teachers don’t believe these myths because they reject research. Teachers believe them because they embrace research.

In many cases, I suspect, they first heard that information at a PD day organized by their principal or district. In other words: they were once professionally expected to believe this myth.

Teachers are not, for the most part, stubborn flat-earth Luddites. Instead, they have used these (seemingly) research-based strategies for years. Those strategies might even seem to help.

Why, then, should they change those beliefs? Just because some new guy (me) shows up and says “today’s research shows…”?

The Big Question

So, here’s my problem.

I really must correct brain myths.

And, I’m really unsure that “correcting brain myths” directly will work.

For the last few years, I’ve adopted a 3-step strategy in this situation:

First: I don’t contradict in public. Embarrassing people rarely inspires them to change their opinions.

Instead, I offer strong, research-based alternatives. (“Rather than focus on learning styles to reduce working memory load, I would …”)

Second: I ask that teacher curious questions in a one-on-one conversation:

“Where did you first hear about learning styles? Which version have you tried? What research have you explored? Have you looked at recent studies?”

Once rapport develops, I’ll mention that more current research hasn’t supported the learning styles hypothesis. I might even offer to send links and share resources.

Third: I include school leadership. Most principals and leadership teams I’ve worked with know common neuromyths, and want to root them out.

In-school leaders know better than I the best places to intervene: perhaps a departmental conversation, or a future faculty meeting. That is: they know how to spread the word widely without singling out and embarrassing any one teacher.

I wish I were sure these methods always work. But I simply don’t know.

And so, here are my questions to you:

What approach would be most effective with your colleagues?

What approach would be most effective with you?

If, for instance, you feel entirely certain that handwritten notes work better than laptop notes, what could I say to influence your thinking?

Would it, in fact, help to contradict you at that moment, in front of your peers? (“Let me explain why that study is so obviously flawed…”)

Did the research-based link above open new avenues for your thinking?

Would you rather have a one-on-one conversation about that research?

Honestly, I’m open to suggestions!

TL;DR

We really must correct brain myths in education. And, I’m really unsure about the best way to do so.

I’m hoping that you’ve got helpful suggestions…

Does Higher Engagement Promote Learning?
Andrew Watson

Long-time readers know: I thoroughly enjoy research that challenges my beliefs.

After all, I (probably) have lots to learn when a study makes me think anew.

In this case — even better! — I’ve found a study that (I suspect) challenges almost everybody’s beliefs.

Here’s the story…

The “Active vs. Passive” Debate

Education scholars often fiercely advocate for “active learning.”

This phrase serves as a catchy shorthand for several educational beliefs and practices.

People who champion a “constructivist” approach to schools, or embrace project pedagogies, or advocate student “voice and choice” often describe their approach this way.

And, they often point out one crucial benefit to active learning: student “engagement.” Students who shape their own learning feel invested in and energized by their efforts.

Other scholars zealously dispute this account.

Whereas their approach has been dismissed as merely “passive learning,” they often prefer phrases such as “direct instruction” to explain their views.

In this view of learning, limitations on working memory prevent novices from tackling overly complex problems.

Students benefit from highly structured pedagogy, in which expert teachers help students build mental models (“schema”) and thereby achieve their own expertise.

For champions of direct instruction, “engagement” might look good (“the students are all so active!”), but doesn’t necessarily result in actual learning. (Why? Because students might well experience working memory overload….)

If you attended our conference in San Francisco at the beginning of February, you heard speakers embrace both sides of this debate.

This Does Not Compute

A study published in 2019 splendidly complicates this tidy summary.

A research team led by Dr. Louis Deslauriers ran a straightforward experiment.

Researchers worked with two groups of students enrolled in an introductory physics class at Harvard.

The first group studied topic A in an “active learning” paradigm, and topic B with a “passive lecture.”

The second group switched that order: topic A was “passive lecture,” and topic B was “active learning.”

The research team found a surprising set of results.

Students learned more from the “active learning” classes, but enjoyed (and thought they learned more from) the “passive lecture.”

Paradoxically, passive learning enhanced engagement but reduced understanding. Active learning enhanced learning but reduced engagement.

Almost everyone will find that combination of results surprising, even disappointing.

Puzzle #1 (with a potential explanation)

Members of Team Active Learning, I suspect, predicted that the students would learn more when their professors followed that approach. Voilà: they did.

And (almost certainly) teachers on that team predicted that active learning would result in higher engagement. Yet — as measured in this study — it didn’t.

Students clearly preferred the “passive lecture.”

For instance, survey results show that students wanted other physics courses to be taught with passive lecture/direct instruction.


The researchers have a hypothesis explaining this puzzling result. They wonder if the additional cognitive challenge created by active learning resulted in “desirable difficulty.”

That is: the students had to think harder — a challenge they didn’t really enjoy.

And this extra thought resulted in more learning. (You can watch a short video here to learn more about this hypothesis.)

Puzzle #2 (with another potential explanation)

Members of Team Direct Instruction, no doubt, are delighted that students preferred the (misnamed) “passive lecture.” According to the survey results, students felt they learned more from it than from the “active learning.”

And yet, Direct Instruction advocates no doubt feel genuine puzzlement that their preferred approach resulted in less learning. How could that be?


I myself have a hypothesis explaining this puzzling finding.

Contrary to many stereotypes, direct instruction advocates do NOT champion uninterrupted lecture.

Instead, they suggest that teachers start with straightforward explanation of core concepts.

Once those have been presented clearly, then students should do substantial independent mental work with those ideas.

In other words, advocates of direct instruction heatedly reject the label “passive learning.” Students do plenty of active cognitive work after they get the benefit of initial priming from instructors.

And yet, in this study, students in the passive learning group had to, in the researchers’ words, “adjust to a complete elimination of any active engagement” — such as “demonstrations, … interactive quizzes, or conceptual questions.”

NO educational thinker feels surprise that students learn less in the total absence of active engagement.

That’s not “direct instruction.” That’s … well … that’s a very bad idea. (To be clear: a very bad idea that happens all too frequently.)

A (Potential) Resolution

Because the “passive learning” condition subjected the students to pure lecture, this study seems much less surprising (to me).

With “passive learning,”

Students learned LESS from uninterrupted lecture. (Why? They didn’t do any independent mental work with the material.)

Because the professor’s explanation made sense, on the other hand, they FELT they understood the material better.

With “active learning,”

Students learned MORE, because they interacted with the concepts and problems individually.

Alas, they FELT they understood less because they experienced the “difficult” half of “desirable difficulties.”

In other words: the study results seem confusing because the labels don’t mean what we thought they meant.

Until we know EXACTLY what happened in both “passive” and “active” learning, we can’t really judge how well those phrases align with our preconceptions — and with our own teaching practices.

One more thought

If a particular diet benefits, say, professional athletes, will it benefit me?

I’m going to be honest: I’m not a professional athlete.

A diet that benefits their level of physical fitness, metabolism, professional goals, etc., might not be healthy for me. (In his swimming prime, Michael Phelps ate 8,000–10,000 calories a day. I suspect my doctor would discourage me from doing so.)

If Harvard even remotely lives up to its reputation, then students in Harvard physics classes understand an impressive amount of science. They have a great deal of motivation to learn more about science. They’ve been impressively successful in academic pursuits.

If a teaching method works with Harvard physics students, will it work with my 10th grade English students? Will it work with your 2nd graders? Maybe … but also, maybe not.

In general: I’m hesitant to apply research done at Harvard (or Stanford, or Oxford, or the US Naval Academy…) to most K-12 learning.

It’s entirely possible that the method “works” not because of the method, but because of the extraordinary background of the students who participate in it.

TL;DR

Before we embrace research on “active learning” or “direct instruction,” we should know…

… EXACTLY what those labels mean in the research, and

… the GOODNESS OF FIT between those research participants and our students.

Dan Willingham has wisely written: “one study is just one study, folks.”

The Downsides of Desirable Difficulties
Andrew Watson

For several years now, we’ve been talking about the benefits of “desirable difficulties.”

For instance, we know that spreading practice out over time helps students learn more than does doing all the practice at once.

Why? Because that schedule creates greater mental challenges. Our students must think harder.

In other words: “spacing” creates “desirable difficulty.”

Likewise, we know that jumbling many topics together during practice helps students learn more than does practicing only one thing at a time.

Why? Students face greater cognitive challenges as they try to figure out which strategy to use or topic to notice.

Desirable difficulty.

And: requiring students to use retrieval practice helps them lots more than simple review.

Yup, you guessed it: d_______ d__________.

A theory that is simultaneously counter-intuitive and common-sense. What’s not to love?

Not So Desirable

I’ll tell you what’s not to love: the big silence.

The phrase “desirable difficulty” implies, obviously, that our students might face UNdesirable difficulties.

And yet, very few people ever discuss — much less research — this topic.

So, what exactly would an undesirable difficulty be? How can I predict or spot one?

I discuss this question with teachers quite often, and I have two sets of suggestions.

The First Strategy

At a Learning and the Brain conference a few years ago, Dr. Robert Bjork (who coined the phrase “desirable difficulty” with his wife, Dr. Elizabeth Ligon Bjork) explained that DDs have two core features.

First: they require students to think harder about the material.

Second: despite the difficulties, students ultimately succeed.

By implication, difficulties that don’t meet those criteria aren’t desirable.

For instance, I’ve just assigned a final project on Macbeth to my sophomores: they must think about the play, create a new something (a set design, a costume plot, a new scene, etc.), and then explain their thinking.

I’ve warned my students quite strictly: they may use technology, but they should NOT get carried away with all the cool tech possibilities at hand.

If they know how to edit videos and want to shoot a scene, that’s fine. But they should not simply throw in 1001 cool editing effects. Those edits would make them think harder, perhaps, but not think harder about the play.

The work would be difficult, but not desirably difficult.

So, too, I might ask my students to write a sentence that uses noun clauses both as the subject of the verb and as an appositive, and also uses an introductory subordinate clause as an adverb.

In this case, my students would think harder (that’s Bjork’s first criterion), but they almost certainly wouldn’t succeed (Bjork’s second criterion).

Again: a difficulty, but not a desirable one.

In other words, we want to ramp up the difficulty — but not too far — without letting the focus subtly shift to another topic.

The Second Strategy

So, difficulties aren’t desirable if they don’t meet both of Bjork’s criteria.

Another way to recognize UNdesirable difficulties: MOST difficulties are undesirable.

So, I can make attention more challenging by — say — playing loud music while students read.

More difficult? Yes. Desirable? No.

I can vex my students’ working memory by giving them ten verbal instructions to remember and follow.

More difficult? Still yes. Desirable? Still no.

I could fiendishly reduce my students’ motivation by inculcating a fixed mindset.

You know the answer. That additional difficulty would in no way be desirable.

In other words, a few specific difficulties (spacing, interleaving, retrieval practice) can be desirable. Most others, however, simply are not.

TL;DR

Desirable difficulties — which require students to think harder before they succeed at their work — can foster deeper learning.

However, most classroom difficulties don’t meet that definition, and therefore aren’t desirable.

Whenever we champion desirable difficulties, we should be sure to mention and guard against the undesirable ones that imperil students’ learning.

Too Good to be True: When Research and Values Collide
Andrew Watson

Let’s start with some quick opinions:

Flipped classrooms…

… can transform education and foster students’ independence, or

… are often a waste of time, and at best just rename stuff we already do.

A growth mindset…

… allows students to learn and become anything, or

… is just an over-hyped fad with little research support.

Multiple-choice questions…

… help me see what my students already know (and help them learn), or

… reduce knowledge to trivia, and enforce an authoritarian view of learning.

It seems strange that our profession can contain such starkly contrasting beliefs about core practices.

But if your experience is like mine, you know that debates among teachers can quickly arrive at these extremes. (If you hang out on Twitter, you probably see these clashes at their fiercest.)

Resolving Conflict with Research (?)

When we come across such vehement debates, we might propose an obvious way to settle them: research.

If the science shows X, well then, we teachers should believe X. And, we should run our classes and our schools the X way.

Obviously.

Alas, this solution might not work as well as we would hope. A recent essay by Brendan Schuetze (Twitter handle @BA_Schuetze) helps explain why.

As Schuetze outlines, Mindset Theory lives in a strange place in the world of education.

On the one hand: research suggests that specific growth-mindset strategies offer some students modest benefits under particular circumstances. (Better said: they sometimes or probably do.)

On the other hand: lots of teachers and school systems think that…well…a growth mindset means that “anyone who tries can succeed at anything.”

How can it be that researchers (often) have one view of an educational theory, and teachers (often) have such a dramatically different understanding of that same theory?

The Values We Hold Influence the Beliefs We Adopt

To answer this question, Schuetze focuses on “values-alignment.” That is: we (teachers specifically, people generally) are quick to endorse research that aligns with values we already hold.

If (and this is my example, not Schuetze’s) we value innovation and the transformative power of technology, we’re likelier to think that flipped classrooms will radically improve education.

We might even come across research supporting this value-aligned position.

If we value tradition and the transformative power of face-to-face conversation, we’re likelier to think that this flipped-classroom nonsense will fail quickly and spectacularly, and we’ll go back to the methods that have always worked.

We can easily discover research supporting this position as well.

In his essay, Schuetze takes the example of growth mindset.

In a well-sourced recap, Schuetze explains:

Teacher education programs tend to endorse transformative constructivist pedagogy (as opposed to more traditional pedagogy), where social justice and the socio-emotional needs of students are increasingly seen as legitimate educational concerns…

In line with this affective turn, teachers are encouraged to be concerned not only with intellectual development, but also with molding, inspiring, and caring for their students – or what might be summarized in one word as the “growth” of students.

Because teacher training programs encourage us to value students’ “growth” quite broadly, our profession tends to believe any research that holds up growth as an outcome.

And we might not ask hard questions before we embrace that belief.

More Concerns, Possible Solutions

In fact (I’m inferring this from Schuetze’s essay), we’re likelier to over-interpret the plausibility and effectiveness of that theory.

Imagine a modest, research-based suggestion aligns with our values:

Researchers say, “X might help these students a bit under these circumstances.”

We teachers hear, “X transforms students — it’s almost magic!”

In my experience — and here I’m going WAY beyond Schuetze’s essay — our hopeful beliefs then call up the very “evidence” we need to persuade ourselves:

Well-meaning teachers write hopeful books that extrapolate substantially from the research they cite.

Blog posts — in an effort to make complex research clear — gloss over the nuance and uncertainty that researchers typically highlight.

Edu-Tweeps with thousands of followers simplify complex ideas into 280 characters.

Suddenly, it seems “everybody believes” that “research shows” what we already value.

To face this problem, I think we need to combine several steps.

Too Good

In the first place, I think it helps to focus on Schuetze’s troubling insight. We might find, someday, that a teaching practice BOTH helps our students learn AND contradicts our values.

Perhaps flipped classrooms really do help students (for the most part), even though we value tradition and face-to-face pedagogy.

Or, to reverse the case,

Perhaps growth mindset strategies don’t really help, even though we value students’ overall growth above their narrow academic achievement.

In these cases, we should honestly accept the tension between research and values. If we act as if they align when they don’t, we won’t make decisions as effectively or thoughtfully as we should.

That is: we can quite appropriately say:

This intervention might not help students learn more. But it aligns with a core value in our community, so we’ll do it anyway.

In the second place, I think we should hone an odd kind of skepticism:

If a research-based teaching suggestion sounds deeply good — that is, if it aligns with our values — then we have an extra responsibility to assume it’s too good to be true.

Does “authenticity” sound good to you? You should BEWARE a pedagogical strategy called “authentic exploration.”

Does “mindfulness” sound uplifting? You should BEWARE mindfulness initiatives.

Have you (like me) always enjoyed being outdoors with the trees? You (like me) should BEWARE any teaching initiative with the words “woods” or “nature” or “wilderness” in the title.

Of course, when you warily undertake a review of the research literature, you just might find that it does in fact support this core value. (Quick: let’s all study in a forest!)

But we owe it to our profession and our students to admit: the values we hold dear might lead us into overly credulous acceptance of the next new thing.

I (hope I) value my students’ development too much to let that happen.

New Research: Unrestricted Movement Promotes (Some Kinds of) Creativity
Andrew Watson

Teachers like creativity.

We want our students to learn what has come before, certainly. And, we want them to do and think and imagine new things with that prior knowledge.

We want them, in ways big and small, to create. How can we best foster such creativity?

Over the years, I’ve often heard that walking outside promotes creativity.

Because I work at a summer camp, I’m in favor of almost anything that promotes being outside. Alas, it turns out, this data pool didn’t hold up very well.

Since that time, lots of folks have focused on the walking part of “walking outside.” Several studies do suggest that simply walking around ramps up creative output. (For instance, here.)

Can we be more specific than “walking around”? Do some kinds of walking boost creativity more than others?

Defining Creativity

Ironically, the study of creativity begins with mundane, even tedious, tasks: defining and measuring it.

Researchers often focus on two kinds of creativity.

First, my students might come up with something new and useful.

Researchers measure this flavor of creativity (“divergent”) in a fun way:

Think about, say, a brick. Now, list all the things you might do with a brick.

The answer “build a wall” doesn’t score very high, because almost everyone says “build a wall.”

The answer “raise the water level in my pool by throwing lots of bricks in” does score high, because — well — because nobody ever says that. This answer is new and (assuming you care about the water level in your pool) useful.

Second, my students might see hidden connections.

Researchers measure this flavor of creativity (“convergent”) in another fun way:

Think about these three words: cottage, swiss, and cake.

Can you think of a word that pairs with each of those to make a meaningful phrase? (Answer: “cheese.” As in, cottage cheese, etc.)
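For readers who like to see a scoring rule spelled out, here is a minimal sketch of one rarity-based way to score that brick task. To be clear: the data, the answers, and the exact formula below are invented for illustration; actual studies use more elaborate scoring rubrics.

```python
from collections import Counter

# Hypothetical answers to "What could you do with a brick?" from five people.
# (Invented data, purely for illustration.)
answers = [
    ["build a wall", "use it as a doorstop"],
    ["build a wall", "use it as a paperweight"],
    ["build a wall", "prop open a window"],
    ["build a wall", "use it as a doorstop"],
    ["build a wall", "raise the water level in my pool"],
]

# Count how many people gave each answer.
counts = Counter(idea for person in answers for idea in person)
total_people = len(answers)

# A simple rarity-based originality score: ideas that almost everyone
# offers score near 0; ideas almost no one offers score near 1.
for idea, n in counts.most_common():
    originality = 1 - n / total_people
    print(f"{idea!r}: given by {n}/{total_people}, originality = {originality:.2f}")
```

On this toy data, “build a wall” scores 0.0 and the pool answer scores 0.8, which matches the intuition above: common answers earn little; rare-but-useful answers earn a lot.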

Researchers in Germany wanted to know what kind of walking might increase DIVERGENT creativity.

Here’s what they found…

It’s All About the Freedom

Researchers Supriya Murali and Barbara Händel asked participants to walk or to sit.

And, they asked them to do so in restricted or unrestricted ways.

Unrestricted walkers, for instance, could walk around a large room however they pleased. Restricted walkers had to walk back and forth down the middle of the room. (Notice: all this walking was inside.)

Unrestricted sitters sat in a solid chair (no wheels, no reclining features) with a view of the full room. Restricted sitters sat in the same chair, but with a computer screen in front of them. The “fixation cross” on the screen implied (if I understand this correctly) that the participants should remain focused on the screen.

What happened afterwards, when they took a test on divergent thinking?

Headlines:

Walkers scored higher on tests of divergent creativity than sitters.

Participants without restrictions (both walking and sitting) scored higher than their restricted peers.

For some interesting reason, unrestricted physical movement loosens restrictions in subsequent mental activity.

Classroom Implications

As I think about this research, it implies some happy, practical suggestions.

If we want our students to launch an explicitly creative assignment — start composing a poem, imagine an approach to studying a historical question, plan an environmentally-friendly city — we can give them an extra boost of physical freedom.

Walking outside might be good.

But if they can’t walk outside (that’s just not possible in many schools), then walking inside could be good.

Heck, if 25 students walking around in the classroom sounds like too much chaos, maybe they can choose a new place to sit for a while.

In other words: this research suggests that the actual movement (walking/sitting) matters, and that the relative degree of restriction also matters.

Even if students sit in solid chairs, their freedom to choose seats or move seats or sit cross-legged (or whatever) might jostle some creative energy in useful ways.

TL;DR

As long as we don’t make our claims too strong or grand, this research allows a sensible claim: “by reducing physical limitations for a while, we might help students expand their mental activity and creativity.” *


* I should note that the sample sizes in these three studies are quite small: 20, 17, and 23. Were these studies repeated with larger sample sizes (and/or in more classroom-like conditions), I’d be more confident and emphatic in drawing these conclusions.


Kuo, C. Y., & Yeh, Y. Y. (2016). Sensorimotor-conceptual integration in free walking enhances divergent thinking for young and older adults. Frontiers in Psychology, 7, 1580.

Murali, S., & Händel, B. (2022). Motor restrictions impair divergent thinking during walking and during sitting. Psychological Research, 1–14.

The First Three Steps
Andrew Watson

Early in January, The Times (of London) quoted author Kate Silverton (on Twitter: @KateSilverton) saying:

It’s the schools that have the strictest discipline that have the highest mental health problems.

Helpfully, they include a video recording of her saying it.

In context, Silverton is saying — in effect — that schools’ strict disciplinary policies damage students’ mental health.

If she’s right, school leaders should know that!

If we run schools with strict disciplinary policies, we should at least consider changing them. Obviously, we don’t want to cause mental health problems.

But … is she right?

This specific question leads to a broader question:

When someone says “research says you should change the way you run your school!,” what should we do next?

Accept the advice? Reject it? Flip a coin?

Let me suggest three simple steps.

Step 1: Ask for Sources

This advice seems too obvious to say out loud.

OF COURSE someone reading this blog would ask for sources.

However, in my experience, we’re very hesitant to do so. It seems — I don’t know — rude, or pushy, or presumptuous.

Especially when the research comes from psychology or neuroscience, we just don’t want to seem stubborn.

But, trust me, it’s always appropriate to ask for the research.

In this case, happily, lots (and lots) of people did ask Silverton for research.

This small niche of edutwitter lit up with people asking — quite simply — “what research suggests that strict school discipline damages mental health?” (To be clear, it also lit up with people praising Silverton for speaking out.)

Even more happily, she responded by citing 11 research studies.

Her transparency allows us to ask a second question:

Step 2: Does the Research, in fact, Support the Claim?

Here again, the question seems too obvious to raise. Who would cite research that doesn’t support the claim they make?

I’m here to tell you: it happens all the time. (I wrote about a recent example here.)

In this case, teacher/researcher/blogger Greg Ashman looked at those sources. (You can read the article he wrote here, although you might have to subscribe to his substack to do so.)

So, does the research support the claim?

Amazingly, most of the cited studies don’t focus on students’ mental health.

That’s right. To support the claim that “strict discipline harms mental health,” Silverton cites very little research about mental health. (Ashman has the details.)

Yes, we might make some guesses based on these studies. But, guesses aren’t research.

As Ashman writes:

it’s easy to accept that suspension and [expulsion] are associated with higher rates of depression without assuming suspension and [expulsion] are the cause.

So, DOES strict school discipline cause mental health problems? I don’t (yet) know of direct research on the subject.

This specific example about school discipline, I hope, emphasizes the broader point:

Simply by a) asking for research and b) giving it a quick skim, we can make better decisions about accepting or rejecting “research-based” teaching advice.

Step 3: Actively Seek Out Contradictory Information

Because humans are so complicated, psychology and neuroscience research ALWAYS produces a range of findings.

Even with something as well-supported as retrieval practice, we can find a few studies suggesting limitations — even (very rare) negative effects.

I thought of this truth when I saw a New York Times headline: Cash Aid to Poor Mothers Increases Brain Activity in Babies, Study Finds.

This blog is about brain research, not politics. At the same time, this brain research might be cited to support a policy proposal.

So: what should we do when we see brain research used this way?

Step 1: “Ask for sources.” Good news! The sources are quoted in the article.

Step 2: “Does the research, in fact, support the claim?”

Sure enough, the researchers conclude:

“we provide evidence that giving monthly unconditional cash transfers to mothers experiencing poverty in the first year of their children’s lives may change infant brain activity.”

Step 3: “Actively seek out contradictory information.”

Because this claim made the front page of the Times, I kept an eye out for responses, both pro and con.

Just a few days later, I found this tweet thread. In it, Dr. Stuart Ritchie points out some real concerns with the study.

For instance: the authors “pre-registered” their study. That is, they said “we’re going to measure variables X, Y, and Z to see if we find significant results.”

As it turns out, they found (small-ish) significant results in P, Q, and R, but not X, Y, and Z.

As Ritchie notes, P, Q, and R are certainly interesting. But:

This is a clear example of hype; taking results that were mainly null and making them into a huge, policy-relevant story. [The research] is a lot more uncertain than this [Times article implies].

To be very clear: I’m not arguing for or against a policy proposal.

I am arguing that when someone says “brain science shows!,” we should ask questions before we make big changes.

TL;DR

When people cite brain research to encourage you to teach differently…

… ask for sources,

… confirm they support the recommendation,

… seek out contradictory points of view.

Our students benefit when we follow those three simple steps.

A “Noisy” Problem: What If Research Contradicts Students’ Beliefs?
Andrew Watson

The invaluable Peps Mccrea recently wrote about a vexing problem in education: the “noisy relationship between teaching and learning.”

In other words: I can’t really discern EXACTLY what parts of my teaching helped my students learn.

Was it my content knowledge?

The quality of my rapport with them?

The retrieval practice I require?

The fact that they slept and ate well in the days before class?

Some combination of all these variables?

Because I don’t know EXACTLY which teaching variable helped (or hurt) learning, I struggle to focus on the good stuff and eliminate the bad stuff.

I thought about Mccrea’s wisdom when I read a recent study about interleaving.

Here’s the story…

Interleaving 101

Frequent blog readers know all about interleaving, a way of organizing students’ practice.

Let’s say I teach my students about parts of speech.

Once they have a basic understanding of each one, I could have them practice each part of speech on its own.

That is: they identify nouns on Monday, adverbs on Tuesday, prepositions on Wednesday, and so forth.

Researchers call that structure “blocking” — as in “blocks of homework focusing on individual topics.”

Or, I could have my students jumble several topics together every night.

That is: Monday night they practice nouns, adverbs, and prepositions. Tuesday they practice verbs, prepositions, and conjunctions. Wednesday: nouns, verbs, and adjectives.

The total number of practice problems would remain the same, but they’d practice several parts of speech all together.

Researchers call this system “interleaving” — as in “weaving together several different topics.”
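If it helps to see the contrast spelled out mechanically, here is a minimal sketch of the two schedules. (The topics, counts, and names below are invented for illustration.) The total practice is identical in both; only the arrangement differs.

```python
import random

# Invented example: six parts of speech, six practice problems each,
# spread across six nights of homework.
topics = ["nouns", "verbs", "adjectives", "adverbs", "prepositions", "conjunctions"]
problems_per_topic = 6
nights = len(topics)

# Blocked: each night's practice focuses on a single topic.
blocked = {night: [topics[night]] * problems_per_topic for night in range(nights)}

# Interleaved: the same total practice, shuffled so that each night
# mixes several topics together.
pool = [topic for topic in topics for _ in range(problems_per_topic)]
random.shuffle(pool)
interleaved = {
    night: pool[night * problems_per_topic:(night + 1) * problems_per_topic]
    for night in range(nights)
}

print(blocked[0])      # ['nouns', 'nouns', 'nouns', 'nouns', 'nouns', 'nouns']
print(interleaved[0])  # e.g. ['verbs', 'nouns', 'conjunctions', ...]
```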

Measuring Success

Of course, teachers want to know: does interleaving work? Do students who interleave their practice learn more than students who block?

Let’s imagine two ways of answering that question:

Strategy #1: ask the students.

Obviously.

Who knows more about the students’ learning than the students themselves?

Strategy #2: measure their learning.

Obviously.

If students who block consistently remember more than students who interleave (or vice versa), then we have a winner.

So, what’s the answer?

Answers, and Vexing Questions

According to Samani and Pan’s 2021 study, strategy #1 yields a clear answer: students say that interleaving is harder and results in LESS learning.

Of course, that means they think that blocking is easier and results in MORE learning.

Alas, strategy #2 arrives at a contradictory result.

When we measure students’ actual learning, they remember more after interleaving than blocking.

Samani and Pan’s study gets this result. And, LOTS AND LOTS of research reaches the same result. (See Agarwal and Bain’s book for a great review of the research.)

In other words, this study points to an especially “noisy” part of the relationship between teaching and learning.

Students genuinely think and believe that interleaving interferes with learning.

However, interleaving in fact promotes learning.

How do we handle this quandary?

Tentative Solutions

In my high-school classroom, we do A LOT of retrieval practice.

Almost every day, I fire off questions and ask students to attempt an answer.

Sometimes I call on raised hands; sometimes I cold call; sometimes I have students write answers in their notebooks (I circle the room to check their responses). They might write on the board; they might consult in pairs.

I’m entirely comfortable using retrieval practice — and so are my students — because on the second day of class I showed them research about retrieval practice.

I told them:

This might feel hard at first.

But, trust me. It feels hard because your brain is working harder. And that means you’re learning more.

It’s like going to the gym. You don’t gain muscle by picking up paper clips. You gain muscle by picking up heavy things. Hard work leads to better fitness.

The same rule applies here. Retrieval practice is harder, so you’ll learn more.

Since that day, I stop every now and then at the end of a retrieval practice session and say: “Do you feel how much you’ve learned? Do you see how much retrieval practice is helping?”

In fact (I swear I am not making this up), one of my sophomores once said: “Thank you, Mr. Watson, for making us do retrieval practice every day.”

I tell this story because it applies to interleaving as well.

I’ve been interleaving all year, but I haven’t (yet) explained it to my students. I plan to do so this upcoming week (or next).

My hope is: they’ll see why we’ve been bouncing back and forth from topic to topic in ways that might seem random or disorganized.

We’ve been interleaving all along.

I offer this solution as “tentative” because my context might not match yours.

For instance, if you teach younger or older students, they might not respond as mine do.

If you teach students with diagnosed learning differences, interleaving might not benefit them as much.

And so forth.

As always: consider the research findings, consider my experience, and then use your own best judgment to fit them into your classroom practice.

TL;DR

If students’ beliefs contradict research, I myself tell them about the research — graphs and all. And then I ask them to trust me.

Retrieval practice and interleaving really do work. My students know about this research pool. So far, they’re on board.

If you try this strategy, or another one, I hope you’ll let me know about your own experience.


Samani, J., & Pan, S. C. (2021). Interleaved practice enhances memory and problem-solving ability in undergraduate physics. npj Science of Learning, 6(1), 1–11.