
Too Good to Be True? “Even Short Nature Walks Improve Cognition”?
Andrew Watson

Good news makes me nervous.

More precisely: if I want to believe a research finding, I become very suspicious of it. After all: it’s easy to fool me when I want to be fooled.

Specifically: I’m an outdoors guy. I’ve worked at summer camps for ages, and love a good walk in the forests around Walden Pond.

So, when I read research showing that even a brief nature walk produces cognitive benefits, I’m both VERY EXCITED and EXTRA SKEPTICAL.

Let’s start with the assumption that it’s just not true.

Persuade Me

The research I’m speaking of is in fact a review article; it summarizes and compares the results of 14 studies. (The review article was flagged by Professor Dan Willingham, one of the leaders in translating science research for the classroom.)

These 14 studies shared important commonalities:

First: they looked at “one-time” exposure to nature. They didn’t look at — say — outdoor education programs. Instead, they looked at — say — a brisk walk in a park near the school.

Second: these “one-time exposures” were all relatively brief — somewhere between 10 and 90 minutes.

Third: these “brief, one-time exposures” did NOT deliberately focus the participants on nature. That is: students didn’t walk in the park to learn about trees and birds. They walked in the park to have the experience of walking in the park.

I might be skeptical about one study. I might be skeptical of two studies. But if 14 studies (or a substantial percentage of them) all reach the same conclusion … well, maybe I’ll be persuaded.

Equally interesting: these studies ran the K-16 gamut. We’re not looking at a narrow age-range here: more like two decades.

Conclusions (and Questions)

So, what did this potentially-persuasive bunch of studies show?

YES: in 12 of the 14 studies, brief, one-time, passive exposure to nature does benefit cognition.
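
Just how persuasive is a 12-of-14 split? Here’s a quick back-of-the-envelope sign test (my own illustration, not an analysis from the review; it assumes the studies are independent and ignores publication bias):

```python
from math import comb

# If each study were a coin flip -- equally likely to find a benefit
# or not -- how often would 12 or more of 14 land on "benefit"?
n = 14
p = sum(comb(n, k) for k in range(12, n + 1)) / 2 ** n
print(f"P(12+ of 14 positive by chance) = {p:.4f}")  # roughly 0.0065
```

Under those (admittedly generous) assumptions, a 12-of-14 split would happen by chance well under 1% of the time.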

More specifically, researchers found benefits in measures of directed attention and working memory.

They looked for, but did not find, benefits in measures of inhibition (another important executive function).

And, crucially, they did not measure academic performance. If a walk in nature enhances attention and working memory, we can reasonably predict that it will also improve learning. But: these studies did not measure that prediction.

Because this review covers so many studies, it’s easy to get lost in the details.

One point I do want to emphasize: the impressive variety of “exposures.”

Some students walked or played in a park, woods, or nature trail.

Some simply sat and read outdoors.

Amazingly, some walked on a treadmill watching a simulated nature trail on the monitor.

In fact, some simply sat in a classroom “with windows open on to green space.”

In other words: it doesn’t take much nature to get the benefits of nature.

Inevitable Caveats

First: in these studies, exposure to nature helped restore attention and working memory capacity that had been strained.

It did not somehow increase overall attention and WM capacity in an enduring way. Students recovered faster. But they didn’t end up with more of these capacities than they started with.

Second: most of these “exposures” included some modest physical activity.

How much (if any) of the benefit came from that physical exertion, instead of the greenery?

We don’t yet know.

A Skeptic Converted?

I have to say, I’m strongly swayed by this review.

In the past, I’ve seen studies that might contradict this set of conclusions.

But the number of studies, the variety of conditions, the variety of cognitive measures, and the range of ages all seem very encouraging.

Perhaps we can’t (yet) say that “research tells us” brief exposures to nature benefit students. But I feel much more comfortable speculating that this belief just might be true.

Working Memory: Make it Bigger, or Use it Better?
Andrew Watson

Cognitive science has LOTS of good news for teachers.

Can we help students remember ideas and skills better?

Yes, we can! (Check out retrieval practice and other desirable difficulties).

Can we promote students’ attention?

Yes, we can! (Posner and Rothbart’s “tripartite” theory gives us lots of guidance.)

Can we foster motivation?

Yes, we can! (As long as we’re modest about expectations and honest about the research, growth mindset can help.)

At the same time, we’ve occasionally got bad news as well.

Do cell phones distract students from their work?

Yes, they do! (Even when they’re turned off.)

Do students have “learning styles”?

Not in any meaningful way, no. (As Daniel Willingham says: when it comes to learning, people are more alike than different.)

The WORST News

I regularly talk with teachers and school leaders about working memory.

After a definition and some fun exercises, I emphasize three key points:

First: working memory is ESSENTIAL for learning. No academic information gets into long-term memory except through working memory. (Really.)

Second: it’s sadly LIMITED. (You probably can alphabetize 5 random words. You probably can’t alphabetize 10. You’ve run out of WM. See the sketch below for a way to try this.)

Third: we know of no artificial way of making it bigger … except for letting children grow up. (WM capacity increases as we age, until our early twenties. No, you don’t want to know what happens next.)
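
(If you’d like to try that alphabetizing exercise from the second point, here’s a minimal sketch in Python; the word pool is just an illustrative assumption.)

```python
import random

# An everyday word pool for the demonstration; any words will do.
WORDS = ["lantern", "pencil", "marble", "otter", "violin", "basket",
         "glacier", "hammock", "radish", "trumpet", "saddle", "compass"]

def alphabetize_demo(n):
    """Show n random words; the listener alphabetizes them mentally."""
    sample = random.sample(WORDS, n)
    print(f"Alphabetize these {n} words in your head:", ", ".join(sample))

alphabetize_demo(5)   # most adults manage this
alphabetize_demo(10)  # most adults can't -- working memory runs out
```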

This third point consistently creates genuine consternation.

Because: we REALLY want to make working memory bigger. After all: it’s essential, and it’s limited.

And because: almost every other cognitive function CAN get bigger.

If you want to learn more Spanish, practice Spanish. You’ll learn more.

If you want to get better at meditation, practice meditation; you’ll get better.

If you want to increase your working memory – and, trust me, you do – common sense suggests that practice should help.

That is: if you keep doing working memory exercises, your working memory should improve.

And yet, weirdly, it just doesn’t. People have tried and tried. Some companies make big claims.

Alas, we just don’t have consistent, robust research suggesting that any of these strategies work.

So, as I say, that’s really bad news.

Don’t Panic: There’s REALLY Good News

After all that bad news, it’s time for some good news. Let me start with an analogy.

I’m 5’10”.

I’m never the first pick for anyone’s basketball team. And: no matter how much I try, I’ll never get any taller.

However – and this is the key point – I can use the height I have more effectively. If I learn how to play basketball well (at my height), I can be a better player.

I’m not taller; my “height capacity” hasn’t changed. But my use of that height can improve.

So too, teachers can help students use the working memory they have more effectively.

In fact, we have LOTS of strategies for helping teachers do so. We have so many strategies that someone should write a book about them. (It’s possible I already did.)

For instance: “dual coding” doesn’t increase students’ WM capacity. It does, however, allow them to use more of the WM that they already have.

For that reason, dual coding – used correctly – can help students learn.

Don’t Stop Now

The good news keeps going.

Like dual coding, relevant knowledge in long-term memory reduces WM demands. The precise reasons get complicated, but the message is clear: students who know more can – on average – think more effectively.*

For that reason, a well-structured curriculum can help students learn. The knowledge they acquire along the way transforms WM-threatening tasks into WM-friendly tasks.

In many cases, simple common sense can manage WM load.

Once teachers understand why instructions take up WM space, we know how to dole out instructions more effectively.

Once we see why choices both motivate students’ interest and stress students’ WM, we can seek out the right number of choices.

So too, once we focus on “the curse of knowledge,” we start to recognize all the ways our own expertise can result in WM overload. This perspective powerfully reshapes lesson plans.

In other words: when teachers understand WM, we begin – naturally and intuitively – to adjust classroom demands to fit within cognitive limits.

That process takes time, with stumbles and muddles along the way. But the more we practice, the more skillful and successful we become.

And, notice this key point: none of these strategies make WM bigger. Instead, they help students use it better.

TL;DR

Although working memory is VITAL for learning, students (and adults) don’t have very much.

We therefore WANT to make it bigger.

The good news is: although we really can’t make it bigger, we really can help students use it more effectively.

When we shift our focus from making it bigger to using it better, we adopt teaching strategies that help students learn.


* For this reason, cognitive scientists get very antsy when they hear the claim that “students don’t need to know facts because they can look them up on the interwebs.” Because of working memory limits, students must have knowledge in long-term memory to use large amounts of it effectively.

Learning How to Learn: Do Video Games Help?
Andrew Watson

Long-time readers know: I like research that surprises me.

If a study confirms a belief I already have, I’m glad for that reinforcement. However, I have more to learn when a study challenges my beliefs.

As you’ll see below, I’m not always persuaded by challenging research. But: it’s always fun to explore.

Today’s Surprise

A study published last October grabbed my attention with its surprising title: Action video game play facilitates “learning how to learn.”

That title includes several shockers.

First: it suggests that action video games might be good for people.

Second: it suggests that they might even be good for learning.

Third: it suggests that “learning how to learn” is a thing. (I’m more skeptical about this concept than most; that’s a topic for another blog.)

Teacher and parent conversations often focus on the potential harms of action video games — both for children’s characters, and for their learning. So, this strong claim to the contrary certainly invites curiosity — even skepticism.

In fact, this study comes from researchers who have been looking at the cognitive benefits of action video games for several years now. Their work prompts lots of controversy; in other words, it might help us learn more about learning!

This study starts out with lots of promise…

Sims vs. Call of Duty

When you read research for a living (as I do), you start to develop an informal mental checklist about research methodology.

This study checks lots of boxes:

Plausible, active control group? Check.

Pre-registration? Check.

Appropriate uncertainty/humility? Check.

Sometimes when I look at surprising findings, I quickly dismiss them because the research paradigm doesn’t withstand scrutiny.

In this case, it all holds together well. (I should emphasize: I’m NOT an expert in this field, and other researchers might spot flaws that I don’t.)

The overall idea is straightforward enough. Researchers worked with two groups of college students.

First, researchers tested students’ “attentional control” and “working memory.”

Next, students played 45 hours (!) of video games.

The control group played games like Sims 3: in other words, a strategy video game, but not an action video game.

The study group played Call of Duty: Black Ops, and other such games that involve movement and aiming and navigating (and shooting).

Finally, they retested students’ attention and working memory. Here’s the kicker:

Researchers used new tests of working memory and attention. And, they watched to see how quickly students improved at these new tests.

Researchers wanted to know, in a tidy shorthand: did playing action video games help students “learn how to learn” these new attention/memory tests?
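
To make that design concrete, here’s a toy simulation of the comparison the researchers cared about (every number below is invented for illustration; this is not the study’s data, and not its statistical method):

```python
import random
import statistics

random.seed(1)

def improvement_per_session(boost):
    """Toy model of how fast one student improves at a BRAND-NEW
    attention/working-memory test; `boost` stands in for any
    'learning to learn' advantage."""
    return random.gauss(0.50 + boost, 0.15)

action_group   = [improvement_per_session(0.10) for _ in range(25)]
strategy_group = [improvement_per_session(0.00) for _ in range(25)]

print("Mean improvement per session on the NEW tests:")
print(f"  action games   : {statistics.mean(action_group):.2f}")
print(f"  strategy games : {statistics.mean(strategy_group):.2f}")
```

The key point the sketch captures: the outcome isn’t the score on the original tests, but how quickly each group improves at tests they’ve never seen.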

Results, and Implications

Did playing action video games help students learn new attention and memory tasks? YES.

Unfortunately, the research method here makes it hard to quantify the size of the benefit. (Bayesian statistics, anyone?) But the headline is: students in the action-video-game group did better than the strategy-video-game group at learning new cognitive skills.

What, then, should we conclude from this surprising research?

First: We have LOTS of reasons to dislike action video games, like “first-person shooters.” Many include morally repellent plot lines and actions. For some folks, the whole idea of a “game about shooting” is yucky.

At the same time, this study offers us a compelling, tantalizing clue — one that might encourage us to notice these games.

Here’s what I mean…

Second: If you focus on research into cognitive science, you know a) that working memory is ESSENTIAL to learning, b) we don’t have very much, and c) we don’t know of artificial ways to create more.

In other words: working memory limitations create a terrible bottleneck that constricts the potential for learning.

Others have tried to find ways to increase working memory. Some claim to do so. Very consistently, these research claims do not replicate.

BUT…

This study claims to have found a way to help increase working memory.

I can hardly overstate the importance of that news.

So Many Ifs

IF playing action video games improves working memory (we’re not yet sure it does), and

IF those WM gains result in better learning (this research team didn’t test that question), and

IF we can figure out WHY and HOW such games work their working-memory magic, and

IF we can get those benefits with a game that doesn’t include shooting/killing (and all those moral qualms (IF you have those moral qualms)),

THEN we might be at the beginning of a very exciting process of discovery here.

I’m very interested in following this series of possibilities. Honestly: finding ways to enhance working memory would be a real game-changer for our profession…and potentially our species.

In brief: WATCH THIS SPACE.

(A Final Note)

This study doesn’t look at “learning how to learn” in the way that most people use that phrase.

Typically, “LHTL” involves teaching students about cognitive science and encouraging them to use that knowledge as they study.

This research, however, isn’t investigating that strategy.



Zhang, R. Y., Chopin, A., Shibata, K., et al. (2021). Action video game play facilitates “learning to learn.” Communications Biology, 4, 1154. https://doi.org/10.1038/s42003-021-02652-7

Don’t Hate on Comic Sans; It Helps Dyslexic Readers (Asterisk)
Andrew Watson

People have surprising passions.

Some friends regularly announce that the Oxford comma is a hill they’re ready to die on. (I’m an English teacher, and yet I wonder: you’re willing to die over a punctuation mark?)

With equal energy and frequency, Twitter and Facebook resonate with mockery of the typeface Comic Sans. (Again, it’s a typeface. Why all the pique?)

Comic-sans mockery, however, often earns this earnest rebuttal:

“Comic Sans helps dyslexic readers, who struggle with other fonts. Comic Sans isn’t dreadful; it’s essential!”

I’ve read this statement so often that I simply assumed it’s true. Would Twitter lie?

Just Checking…

I have, in fact, seen the claim that “comic sans benefits dyslexic readers” twice this week.

However, I’ve started to notice a curious silence: no one cites specific research to back up that claim.

So, I thought I’d find it for myself.

Long-time readers know my routine. I surfed over to Google Scholar, and searched the terms “dyslexia” and “font.” And, to be on the safe side, I also searched “dyslexia” and “comic sans.”

From there, I used Scite.ai and Connectedpapers.com to follow up on my findings.

The results surprised me, so I thought I’d pass them along.

Does Comic Sans Benefit Dyslexic Readers?

I don’t know.

More precisely, I can’t find research that explores that question directly.

When I did the searches described above, I found several studies that seemed promising. And yet, when I looked at the specifics, I found that the researchers hadn’t explored exactly this question.

For instance:

Several studies cite the British Dyslexia Association style guide as their source for this recommendation.

That guide does recommend Comic Sans (and other sans serif fonts, including Arial). However, it doesn’t cite any research to support that claim.

Hmmmm.

This study, helpfully called “Good Fonts for Dyslexia,” does indeed ask 48 dyslexic readers to study passages in different fonts. It asks exactly the question we’re trying to answer.

However, this research team didn’t include Comic Sans among the fonts they studied.

They do recommend Helvetica, Courier, Arial, Verdana and CMU for dyslexic readers. But they have no recommendation one way or the other about Comic Sans.

Double hmmmmm.

Most of the studies I found focus less on font and more on web design. (And, the most common font-related conclusion I found is: fonts designed to benefit dyslexic readers don’t.)

At this point, I simply don’t have a research-based answer to this question.

To Be Clear…

This search genuinely surprised me. Given the frequency of the claim — just google it! — I assumed I’d find a robust research pool.

But, no.

Given the potential for controversy here, I want to answer some likely questions:

“Are you saying Comic Sans DOESN’T help dyslexic readers?”

No. I’m saying I can’t find a research-based answer either way.

“If you’re not an expert in dyslexia, how can you be so sure?”

Honestly, I’m not sure. I’m usually fairly skilled at finding the research basis behind educational claims. (Heck, I wrote a book about doing so.) But in this case, I simply couldn’t discover a convincing answer to the question.

“Look, this research right here shows that Comic Sans does help!”

AWESOME! Please share it with me so I can write a follow-up post.

“My student/child/colleague tells me that Comic Sans helps a lot.”

That kind of individual experience is useful and important. I hope that researchers explore this question, so we can know with greater confidence whether or not it helps most dyslexic readers.

“How long did you look?”

Maybe an hour, spread out over two days. I certainly could have missed something. I hope you’ll let me know if you’ve got a study that looks at this possibility.

TL;DR

You might have heard that Comic Sans helps dyslexic readers; you might have heard that “research says so.”

Those claims might be true, but I haven’t (yet) found research supporting them. If you know of that research, please send it my way!

Perspectives on Critical Thinking: Can We Teach It? How Do We Know?
Andrew Watson

Imagine the following scenario:

A school principal gathers wise cognitive scientists to ask a straightforward question…

“Because critical thinking is an essential 21st century skill, we know our students need to develop critical thinking skills. If we want to create a school program or a class or a curriculum to foster critical thinking, what guidance can you give us?”

Happily, we don’t have to imagine. At last week’s Learning and the Brain conference in New York, I asked a distinguished group of cognitive psychologists* exactly that question.

The resulting conversation offered practical suggestions, provocative assertions, and a surprising amount of humor.

I’ll try to summarize that half-hour conversation here.

On the One Hand…

Let’s start at one end of the spectrum, with the most optimistic ways to answer the question:

First: we know what critical thinking is.

Dr. Lindsay Portnoy, for instance, considers critical thinking the ability to support claims with evidence and reason.

If I claim that “the earth orbits the sun,” I should be able to cite evidence supporting that claim. And I should be able to explain the logical process I use to make conclusions based on that evidence.

Dr. Ben Motz agrees with that foundation, and adds an important step: critical thinkers recognize and avoid logical fallacies.

A comprehensive list of logical fallacies goes on for pages, but critical thinkers typically question their own beliefs aggressively enough to avoid the most common mistakes.

Second: we know how to foster critical thinking.

The specifics of an answer probably vary by age and discipline. However, we’ve got specific curricular strategies to help us foster critical thinking among students.

Dr. Laura Cabrera, with this truth in mind, offers a specific bit of advice: start early.

If we want students to grow as critical thinkers, we shouldn’t wait until their sophomore year in high school. Kindergarten would be a fine place to start.

On the Other Hand…

All these optimistic answers, however, quickly give way to grittier – perhaps more realistic – assessments of the situation.

First: because critical thinking is so complicated, no precise definition holds true in a broadly useful way. In other words – politely speaking – we can’t exactly define it.

In cognitive psychology terminology, as Dr. Derek Cabrera put it, “critical thinking has a construct validity problem.” In fact, the five psychologists on the panel – professors all – don’t agree on a definition.

Second: This definition problem has terrible implications.

If we can’t define critical thinking, broadly speaking, then we can’t determine a consistent way to measure it.

And if we can’t measure it, we have no (scientific) way of knowing if our “critical thinking program” helps students think critically.

Third: In fact, if we can’t measure students’ critical thinking skills right now, we might not realize that they’re already good at it.

Dr. Dan Willingham – author of the well-known Why Don’t Students Like School – made this point at the beginning of our conversation.

“Why,” he asked, “do you think your students have a critical thinking problem? What measurement are you using? What do you want them to do that they can’t do?”

In other words: it’s not obvious we should start a critical thinking program. Because we can’t measure students’ abilities, we just don’t know.

Dr. Derek Cabrera made this point quite starkly: “My advice about starting a critical thinking program is: don’t.”

Don’t Start Now

Even if we could measure critical thinking, as it first seemed we could, teachers might not want to give it disproportionate attention.

Fourth: some panelists doubt that critical thinking is any more important than many (many) other kinds of thinking – creative thinking, interdisciplinary thinking, systems thinking, fuzzy logic…the list goes on.

Dr. Portnoy, for instance, champions good old-fashioned curiosity. If students ask the right questions (critical or otherwise), they’re doing good thinking and learning.

Why, then, would it be bad if they aren’t doing critical thinking, narrowly defined?

The Cabreras, indeed, argue that students trained to think critically often get too critical. They stamp out potentially good ideas (that spring from imaginative thinking) with all their skills at critical thinking.

Fifth: opportunity cost.

Schools already have far too much to do well, as Dr. Willingham frankly pointed out.

If we plan to add something (a critical thinking program/curriculum), we should know what we plan to take out.

And, we should have a high degree of confidence that the new program will actually succeed in its mission.

If we remove a program that does accomplish one goal and replace it with one that doesn’t, our efforts to improve schools will – paradoxically – have deprived students of useful learning.

Making Sense of the Muddle

All these points might seem like bad news: we (perhaps) don’t know what critical thinking is, and (perhaps) shouldn’t teach it even if we did. Or could.

That summary, I think, overlooks some important opportunities that these panelists highlighted.

Dr. Motz offers specific ways to define critical thinking. His talk at the conference, in fact, focused on successful strategies to teach it.

Even better: he wants teachers to join in this work and try it out with their own students.

The question we face, after all, is not exactly “can I teach critical thinking — generally — to everyone?”

It is, instead: “can I teach critical thinking — defined and measured this way — to my students?”

If the answer to that question is “yes,” then perhaps I should make room for critical thinking in my students’ busy days.

Made wiser by these panelists’ advice, I know better how to define terms, to measure outcomes, to balance several thinking skills (including curiosity!).

When researchers’ perspectives on critical thinking help us think critically about our teaching goals, we and our students benefit.


* The panelists: Dr. Derek Cabrera, Dr. Laura Cabrera, Dr. Benjamin Motz, Dr. Lindsay Portnoy, Dr. Dan Willingham.

Do Classroom Decorations Distract Students? A Story in 4 Parts…
Andrew Watson

Teacher training programs often encourage us to brighten our classrooms with lively, colorful, personal, and uplifting stuff:

Inspirational posters.

Students’ art work.

Anchor charts.

Word walls.

You know the look.

We certainly hope that these decorations invite our students in and invigorate their learning. (We might even have heard that “enriched environments promote learning.”)

At the same time, we might worry that all those decorations could distract our students from important cognitive work.

So, which is it? Do decorations distract or inspire? Do they promote learning or inhibit learning? If only we had research on this question…

Part I: Early Research

But wait: we DO have research on this question.

Back in 2014, a team led by Dr. Anna Fisher asked if classroom decorations might be “Too Much of a Good Thing.”

They worked with Kindergarten students, and found that — sure enough — students who learned in highly-decorated rooms paid less attention and learned less than others in “sparsely” decorated classrooms.

Since then, other researchers have measured students’ performance on specific mental tasks in busy environments, or in plain environments.

The results: the same. A busy visual field reduced working memory and attention scores, compared to plain visual environments.

It seems that we have a “brain-based” answer to our question:

Classroom decorations can indeed be “too much of a good thing.”

Taken too far, they distract students from learning.

Part II: Important Doubts

But wait just one minute…

When I present this research in schools, I find that teachers have a very plausible question.

Sure: those decorations might distract students at first. But, surely the students get used to them.

Decorations might make learning a bit harder at first. But ultimately students WON’T be so distracted, and they WILL feel welcomed, delighted, and inspired.

In this theory, a small short-term problem might well turn into a substantial long-term benefit.

And I have to be honest: that’s a plausible hypothesis.

Given Fisher’s research (and that of other scholars), I think the burden of proof is on people who say that decorations are not distracting. But I don’t have specific research to contradict those objections.

Part III: The Researchers Return

So now maybe you’re thinking: “Why don’t researchers study this specific question?”

I’ve got good news: they just did.

In a recently-published study, another research team (including Fisher, and led by Dr. Karrie Godwin, who helped in the 2014 study) wondered if students would get used to the highly decorated classrooms.

Research isn’t research if we don’t use fancy terminology, so they studied “habituation.” As in: did students habituate to the highly decorated classrooms?

In the first half of their study, researchers again worked with Kindergarteners. Students spent five classes studying science topics in plainly decorated classrooms. (The visual material focused only on the topic being presented.)

Then they spent ten classes studying science topics in highly decorated classrooms. (These decorations resembled typical classroom decorations: posters, charts, artwork, etc.)

Unsurprisingly (based on the 2014 study), students were more distractible in the decorated classroom.

But: did they get used to the decorations? Did they become less distractable over time? Did they habituate?

The answer: a little bit.

In other words: students were less distractible than they initially were in the decorated classroom. But they were still more distractible than in the sparsely decorated room.

Even after ten classes, students hadn’t fully habituated.

Part IV: Going Big

This 2-week study with kindergarteners, I think, gives us valuable information.

We might have hoped that students would get used to decorations, and so benefit from their welcoming uplift (but not be harmed by their cognitive cost). So far, this study deflates that hope.

However, we might still hold out a possibility:

If students partially habituate over two weeks, won’t they fully habituate eventually? Won’t the habituation trend continue?

Team Godwin wanted to answer that question too. They ran yet another study in primary school classrooms.

This study had somewhat different parameters (the research nitty-gritty gets quite detailed). But the headline is: this study lasted 15 weeks.

Depending on the school system you’re in, that’s between one-third and one-half of a school year.

How much did the students habituate to the visual distractions?

The answer: not at all.

The distraction rate was the same after fifteen weeks as it was at the beginning of the year.

To my mind, that’s an AMAZING research finding.

Putting It Together

At this point, I think we have a compelling research story.

Despite our training — and, perhaps, despite our love of decoration — we have a substantial body of research suggesting that over-decorated classrooms interfere with learning.

The precise definition of “over-decorated” might take some time to sort out. And, the practical problems of putting up/taking down relevant learning supports deserve thought and sympathetic exploration.

However: we shouldn’t simply hope away the concern that young students can be distracted by the environment.

And we shouldn’t trust that they’ll get used to the busy environment.

Instead, we should deliberately create environments that welcome students, inspire students, and help students concentrate and learn.


Fisher, A. V., Godwin, K. E., & Seltman, H. (2014). Visual environment, attention allocation, and learning in young children: When too much of a good thing may be bad. Psychological Science, 25(7), 1362-1370.

Godwin, K. E., Leroux, A. J., Seltman, H., Scupelli, P., & Fisher, A. V. (2022). Effect of repeated exposure to the visual environment on young children’s attention. Cognitive Science, 46(2), e13093.

A Little Help, Please…
Andrew Watson

I’ve got a problem, and I’m hoping you can help me.

Here’s the situation…

I work as a high school English teacher. And I’m also a consultant – presenting psychology and neuroscience research for teachers and students and parents.

In that consulting work, I often face this problem: teachers/parents/students believe – quite confidently – in some brain myth or another.

For instance:

When I talk with teachers about managing working memory load, I regularly get this question:

“Can we reduce working memory overload by aligning instruction with students’ learning style?”

When I talk about research into attention and distraction, I often hear this rejoinder:

“Yes, but: all the research shows that an enriched environment enhances learning.”

A discussion about student motivation often defaults to this baseline:

“Around here we remind students to have a growth mindset. That will get the job done.”

A comment about note-taking strategies prompts this response:

“Of course, we know from research that handwritten notes result in more learning than laptop notes.”

In these moments, how should I – the “outside expert” – respond?

We’ve Got Two Hands

On the one hand, I should – obviously – let them know they’re wrong.

First, because they are wrong (as far as research currently shows).

No: learning styles theories have not held up over time. We just don’t have good evidence to support them.

No: ‘enriched environment’ research doesn’t apply to schools. (It was initially done with rats; lots of research suggests that busy classrooms distract from learning. I tell this story in a recent book.)

No: mindset theory is not a slam dunk. This topic churns up lots of controversy, but my own view is…

…we’ve seen enough positive results to think something is going on there,

…and enough negative results to know we don’t have a good handle on the specifics yet.

And

No: the handwriting vs. laptop debate is nowhere near settled.

The second reason to highlight these errors: we don’t want their colleagues to believe these myths.

If I don’t contradict these false beliefs right away, they can easily propagate.

These two truths, however, face an ugly “on the other hand.”

On the Other Hand

When I speak up to contradict these myths, I’m pursuing two goals:

Change the mind of the person who made the comment, and

Encourage other listeners to adopt correct beliefs.

Here’s my awkward question:

does contradicting brain myths directly actually accomplish those goals?

Imagine I say:

“I’m so glad you’ve brought up learning styles. It turns out that the research just hasn’t supported this theory.”

Will the teachers who made those comments in fact change their minds?

Will others around them believe me?

Honestly, I’m not so sure…

A Compelling Backstory

Let’s ask this surprising question: why do people believe in learning styles?

Why do they believe that elaborate classroom decoration enhances learning, or that handwritten notes rule? Why do laptop notes receive so much confident hatred?

Almost certainly, teachers believe in these myths because some other consultant told them that “research says so.”

Or, they heard these myths at a conference touting “brain science!”

That is: teachers don’t believe these myths because they reject research. Teachers believe them because they embrace research.

In many cases, I suspect, they first heard that information at a PD day organized by their principal or district. In other words: they were once professionally expected to believe this myth.

Teachers are not, for the most part, stubborn flat-earth luddites. Instead, they have used these (seemingly) research-based strategies for years. Those strategies might even seem to help.

Why, then, should they change those beliefs? Just because some new guy (me) shows up and says “today’s research shows…”?

The Big Question

So, here’s my problem.

I really must correct brain myths.

And, I’m really unsure that “correcting brain myths” directly will work.

For the last few years, I’ve adopted a 3-step strategy in this situation:

First: I don’t contradict in public. Embarrassing people rarely inspires them to change their opinions.

Instead, I offer strong, research-based alternatives. (“Rather than focus on learning styles to reduce working memory load, I would …”)

Second: I ask that teacher curious questions in a one-on-one conversation:

“Where did you first hear about learning styles? Which version have you tried? What research have you explored? Have you looked at recent studies?”

Once rapport develops, I’ll mention that more current research hasn’t supported the learning styles hypothesis. I might even offer to send links and share resources.

Third: I include school leadership. Most principals and leadership teams I’ve worked with know common neuromyths, and want to root them out.

In-school leaders know better than I the best places to intervene: perhaps a departmental conversation, or a future faculty meeting. That is: they know how to spread the word widely without singling out and embarrassing any one teacher.

I wish I were sure these methods always work. But I simply don’t know.

And so, here are my questions to you:

What approach would be most effective with your colleagues?

What approach would be most effective with you?

If, for instance, you feel entirely certain that handwritten notes work better than laptop notes, what could I say to influence your thinking?

Would it, in fact, help to contradict you at that moment, in front of your peers? (“Let me explain why that study is so obviously flawed…”)

Did the research-based link above open new avenues for your thinking?

Would you rather have a one-on-one conversation about that research?

Honestly, I’m open for suggestions!

TL;DR

We really must correct brain myths in education. And, I’m really unsure about the best way to do so.

I’m hoping that you’ve got helpful suggestions…

Does Higher Engagement Promote Learning?
Andrew Watson

Long-time readers know: I thoroughly enjoy research that challenges my beliefs.

After all, I (probably) have lots to learn when a study makes me think anew.

In this case — even better! — I’ve found a study that (I suspect) challenges almost everybody’s beliefs.

Here’s the story…

The “Active vs. Passive” Debate

Education scholars often fiercely advocate for “active learning.”

This phrase serves as a catchy shorthand for several educational beliefs and practices.

People who champion a “constructivist” approach to schools, or embrace project pedagogies, or advocate student “voice and choice” often describe their approach this way.

And, they often point out one crucial benefit to active learning: student “engagement.” Students who shape their own learning feel invested in and energized by their efforts.

Other scholars zealously dispute this account.

Although their approach is sometimes dismissed as merely “passive learning,” these scholars often prefer phrases such as “direct instruction” to explain their views.

In this view of learning, limitations on working memory prevent novices from tackling overly complex problems.

Students benefit from highly structured pedagogy, in which expert teachers help students build mental models (“schema”) and thereby achieve their own expertise.

For champions of direct instruction, “engagement” might look good (“the students are all so active!”), but doesn’t necessarily result in actual learning. (Why? Because students might well experience working memory overload….)

If you attended our conference in San Francisco at the beginning of February, you heard speakers embrace both sides of this debate.

This Does Not Compute

A study published in 2019 splendidly complicates this tidy summary.

A research team led by Dr. Louis Deslauriers ran a straightforward experiment.

Researchers worked with two groups of students enrolled in an introductory physics class at Harvard.

The first group studied topic A in an “active learning” paradigm, and topic B with a “passive lecture.”

The second group switched that order: topic A was “passive lecture,” and topic B was “active learning.”
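
The counterbalancing matters, so here’s a minimal sketch of the assignment logic (the group and topic names are placeholders, not the study’s labels):

```python
# Counterbalanced ("crossover") design: every topic is taught both ways,
# and every student experiences both methods. A difference between the
# methods therefore can't be explained by one topic being easier, or by
# one group of students being stronger.
schedule = {
    "group_1": {"topic_A": "active learning", "topic_B": "passive lecture"},
    "group_2": {"topic_A": "passive lecture", "topic_B": "active learning"},
}

for group, plan in schedule.items():
    for topic, method in plan.items():
        print(f"{group} studies {topic} via {method}")
```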

The research team found a surprising set of results.

Students learned more from the “active learning” classes, but enjoyed (and thought they learned more from) the “passive lecture.”

Paradoxically, passive learning enhanced engagement but reduced understanding. Active learning enhanced learning but reduced engagement.

Almost everyone will find that combination of results surprising, even disappointing.

Puzzle #1 (with a potential explanation)

Members of Team Active Learning, I suspect, predicted that the students would learn more when their professors followed that approach. Voila: they did.

And (almost certainly) teachers on that team predicted that active learning would result in higher engagement. Yet — as measured in this study — it didn’t.

Students clearly preferred the “passive lecture.”

For instance, survey results show that students wanted other physics courses to be taught with passive lecture/direct instruction.


The researchers have a hypothesis explaining this puzzling result. They wonder if the additional cognitive challenge created by active learning resulted in “desirable difficulty.”

That is: the students had to think harder — a challenge they didn’t really enjoy.

And this extra thought resulted in more learning. (You can watch a short video here to learn more about this hypothesis.)

Puzzle #2 (with another potential explanation)

Members of Team Direct Instruction, no doubt, are delighted that students preferred the (misnamed) “passive lecture.” According to the survey results, students felt they learned more from it than from the “active learning.”

And yet, Direct Instruction advocates no doubt feel genuine puzzlement that their preferred approach resulted in less learning. How could that be?


I myself have a hypothesis explaining this puzzling finding.

Contrary to many stereotypes, direct instruction advocates do NOT champion uninterrupted lecture.

Instead, they suggest that teachers start with straightforward explanation of core concepts.

Once those have been presented clearly, then students should do substantial independent mental work with those ideas.

In other words, advocates of direct instruction heatedly reject the label “passive learning.” Students do plenty of active cognitive work after they get the benefit of initial priming from instructors.

And yet, in this study, students in the passive learning group had to, in the researchers’ words, “adjust to a complete elimination of any active engagement” — such as “demonstrations, … interactive quizzes, or conceptual questions.”

NO educational thinker feels surprise that students learn less in the total absence of active engagement.

That’s not “direct instruction.” That’s … well … that’s a very bad idea. (To be clear: a very bad idea that happens all too frequently.)

A (Potential) Resolution

Because the “passive learning” condition subjected the students to pure lecture, this study seems much less surprising (to me).

With “passive learning,”

Students learned LESS from uninterrupted lecture. (Why? They didn’t do any independent mental work with the material.)

Because the professor’s explanation made sense, on the other hand, they FELT they understood the material better.

With “active learning,”

Students learned MORE, because they interacted with the concepts and problems individually.

Alas, they FELT they understood less because they experienced the “difficult” half of “desirable difficulties.”

In other words: the study results seem confusing because the labels don’t mean what we thought they meant.

Until we know EXACTLY what happened in both “passive” and “active” learning, we can’t really judge how well those phrases align with our preconceptions — and with our own teaching practices.

One more thought

If a particular diet benefits, say, professional athletes, will it benefit me?

I’m going to be honest: I’m not a professional athlete.

A diet that benefits their level of physical fitness, metabolism, professional goals, etc., might not be healthy for me. (In his swimming prime, Michael Phelps ate 8,000-10,000 calories a day. I suspect my doctor would discourage me from doing so.)

If Harvard even remotely lives up to its reputation, then students in Harvard physics classes understand an impressive amount of science. They have a great deal of motivation to learn more about science. They’ve been impressively successful in academic pursuits.

If a teaching method works with Harvard physics students, will it work with my 10th grade English students? Will it work with your 2nd graders? Maybe … but also, maybe not.

In general: I’m hesitant to apply research done at Harvard (or Stanford, or Oxford, or the US Naval Academy…) to most K-12 learning.

It’s entirely possible that the method “works” not because of the method, but because of the extraordinary background of the students who participate in it.

TL;DR

Before we embrace research on “active learning” or “direct instruction,” we should know…

… EXACTLY what those labels mean in the research, and

… the GOODNESS OF FIT between those research participants and our students.

Dan Willingham has wisely written: “one study is just one study, folks.”

The Downsides of Desirable Difficulties
Andrew Watson

For several years now, we’ve been talking about the benefits of “desirable difficulties.”

For instance, we know that spreading practice out over time helps students learn more than does doing all the practice at once.

Why? Because that schedule creates greater mental challenges. Our students must think harder.

In other words: “spacing” creates “desirable difficulty.”

Likewise, we know that jumbling many topics together during practice helps students learn more than does practicing only one thing at a time.

Why? Students face greater cognitive challenges as they try to figure out which strategy to use or topic to notice.

Desirable difficulty.

And: requiring students to use retrieval practice helps them lots more than simple review.

Yup, you guessed it: d_______ d__________.

A theory that is simultaneously counter-intuitive and common-sense. What’s not to love?

Not So Desirable

I’ll tell you what’s not to love: the big silence.

The phrase “desirable difficulty” implies, obviously, that our students might face UNdesirable difficulties.

And yet, very few people ever discuss — much less research — this topic.

So, what exactly would an undesirable difficulty be? How can I predict or spot one?

I discuss this question with teachers quite often, and I have two sets of suggestions.

The First Strategy

At a Learning and the Brain conference a few years ago, Dr. Robert Bjork (who coined the phrase “desirable difficulty” with his wife, Dr. Elizabeth Ligon Bjork) explained that DDs have two core features.

First: they require students to think harder about the material.

Second: despite the difficulties, students ultimately succeed.

By implication, difficulties that don’t meet those criteria aren’t desirable.

For instance, I’ve just assigned a final project on Macbeth to my sophomores: they must think about the play, create a new something (a set design, a costume plot, a new scene, etc.), and then explain their thinking.

I’ve warned my students quite strictly: they may use technology, but they should NOT get carried away with all the cool tech possibilities at hand.

If they know how to edit videos and want to shoot a scene, that’s fine. But they should not simply throw in 1001 cool editing effects. Those edits would make them think harder, perhaps, but not think harder about the play.

The work would be difficult, but not desirably difficult.

So, too, I might ask my students to write a sentence that uses noun clauses as both the subject of the verb and as an appositive, and also uses an introductory subordinate clause as an adverb.

In this case, my students would think harder (that’s Bjork’s first criterion), but they almost certainly wouldn’t succeed (Bjork’s second criterion).

Again: a difficulty, but not a desirable one.

In other words, we want to ramp up the difficulty — but not too far — without letting the focus subtly shift to another topic.

The Second Strategy

So, difficulties aren’t desirable if they don’t meet both of Bjork’s criteria.

Another way to recognize UNdesirable difficulties: MOST difficulties are undesirable.

So, I can make attention more challenging by — say — playing loud music while students read.

More difficult? Yes. Desirable? No.

I can vex my students’ working memory by giving them ten verbal instructions to remember and follow.

More difficult? Still yes. Desirable? Still no.

I could fiendishly reduce my students’ motivation by inculcating a fixed mindset.

You know the answer. That additional difficulty would in no way be desirable.

In other words, a few specific difficulties (spacing, interleaving, retrieval practice) can be desirable. Most others, however, simply are not.
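
For teachers who want to build those few desirable difficulties into a practice plan, here’s a minimal sketch of a spaced, interleaved schedule (the topics and session counts are just illustrative assumptions):

```python
from itertools import cycle, islice

def interleaved_schedule(topics, sessions, problems_per_session):
    """Spread practice across several sessions (spacing) and rotate
    through topics within each session (interleaving), instead of
    massing one topic per sitting."""
    rotation = cycle(topics)
    return [list(islice(rotation, problems_per_session))
            for _ in range(sessions)]

# Three short sessions spread across the week, each mixing all three
# topics -- rather than one massed session per topic.
topics = ["fractions", "decimals", "percents"]
week = interleaved_schedule(topics, sessions=3, problems_per_session=6)
for day, plan in zip(["Mon", "Wed", "Fri"], week):
    print(day, plan)
```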

TL;DR

Desirable difficulties — which require students to think harder before they succeed at their work — can foster deeper learning.

However, most classroom difficulties don’t meet that definition, and therefore aren’t desirable.

Whenever we champion desirable difficulties, we should be sure to mention and guard against the undesirable ones that imperil students’ learning.

Too Good to be True: When Research and Values Collide
Andrew Watson

Let’s start with some quick opinions:

Flipped classrooms…

… can transform education and foster students’ independence, or

… are often a waste of time, and at best just rename stuff we already do.

A growth mindset…

… allows students to learn and become anything, or

… is just an over-hyped fad with little research support.

Multiple-choice questions…

… help me see what my students already know (and help them learn), or

… reduce knowledge to trivia, and enforce an authoritarian view of learning.

It seems strange that our profession can contain such starkly contrasting beliefs about core practices.

But if your experience is like mine, you know that debates among teachers can quickly arrive at these extremes. (If you hang out on Twitter, you probably see these clashes at their fiercest.)

Resolving Conflict with Research (?)

When we come across such vehement debates, we might propose an obvious way to settle them: research.

If the science shows X, well then, we teachers should believe X. And, we should run our classes and our schools the X way.

Obviously.

Alas, this solution might not work as well as we would hope. A recent essay by Brendan Schuetze (Twitter handle, @BA_Schuetze) helps explain why.

As Schuetze outlines, Mindset Theory lives in a strange place in the world of education.

On the one hand: research suggests that specific growth-mindset strategies offer some students modest benefits under particular circumstances. (Better said: they sometimes or probably do.)

On the other hand: lots of teachers and school systems think that…well…a growth mindset means that “anyone who tries can succeed at anything.”

How can it be that researchers (often) have one view of an educational theory, and teachers (often) have such a dramatically different understanding of that same theory?

The Values We Hold Influence the Beliefs We Adopt

To answer this question, Schuetze focuses on “values-alignment.” That is: we (teachers specifically, people generally) are quick to endorse research that aligns with values we already hold.

If (and this is my example, not Schuetze’s) we value innovation and the transformative power of technology, we’re likelier to think that flipped classrooms will radically improve education.

We might even come across research supporting this value-aligned position.

If we value tradition and the transformative power of face-to-face conversation, we’re likelier to think that this flipped-classroom nonsense will fail quickly and spectacularly, and we’ll go back to the methods that have always worked.

We can easily discover research supporting this position as well.

In his essay, Schuetze takes the example of growth mindset.

In a well-sourced recap, Schuetze explains:

Teacher education programs tend to endorse transformative constructivist pedagogy (as opposed to more traditional pedagogy), where social justice and the socio-emotional needs of students are increasingly seen as legitimate educational concerns…

In line with this affective turn, teachers are encouraged to be concerned not only with intellectual development, but also with molding, inspiring, and caring for their students–or what might be summarized in one word as the “growth” of students.

Because teacher training programs encourage us to value students’ “growth” quite broadly, our profession tends to believe any research that holds up growth as an outcome.

And we might not ask hard questions before we embrace that belief.

More Concerns, Possible Solutions

In fact (I’m inferring this from Schuetze’s essay), we’re likelier to over-interpret the plausibility and effectiveness of that theory.

Imagine a modest, research-based suggestion aligns with our values:

Researchers say, “X might help these students a bit under these circumstances.”

We teachers hear, “X transforms students — it’s almost magic!”

In my experience — and here I’m going WAY beyond Schuetze’s essay — our hopeful beliefs then call up the very “evidence” we need to persuade ourselves:

Well-meaning teachers write hopeful books that extrapolate substantially from the research they cite.

Blog posts — in an effort to make complex research clear — gloss over the nuance and uncertainty that researchers typically highlight.

Edu-Tweeps with thousands of followers simplify complex ideas into 280 characters.

Suddenly, it seems “everybody believes” that “research shows” what we already value.

To face this problem, I think we need to combine several steps.

Too Good

In the first place, I think it helps to focus on Schuetze’s troubling insight. We might find, someday, that a teaching practice BOTH helps our students learn AND contradicts our values.

Perhaps flipped classrooms really do help students (for the most part), even though we value tradition and face-to-face pedagogy.

Or, to reverse the case,

Perhaps growth mindset strategies don’t really help, even though we value students’ overall growth above their narrow academic achievement.

In these cases, we should honestly accept the tension between research and values. If we act as if they align when they don’t, we won’t make decisions as effectively or thoughtfully as we should.

That is: we can quite appropriately say:

This intervention might not help students learn more. But it aligns with a core value in our community, so we’ll do it anyway.

In the second place, I think we should hone an odd kind of skepticism:

If a research-based teaching suggestion sounds deeply good — that is, if it aligns with our values — then we have an extra responsibility to assume it’s too good to be true.

Does “authenticity” sound good to you? You should BEWARE a pedagogical strategy called “authentic exploration.”

Does “mindfulness” sound uplifting? You should BEWARE mindfulness initiatives.

Have you (like me) always enjoyed being outdoors with the trees? You (like me) should BEWARE any teaching initiative with the words “woods” or “nature” or “wilderness” in the title.

Of course, when you warily undertake a review of the research literature, you just might find that it does in fact support this core value. (Quick: let’s all study in a forest!)

But we owe it to our profession and our students to admit: the values we hold dear might lead us into overly credulous acceptance of the next new thing.

I (hope I) value my students’ development too much to let that happen.