“Before You Change Your Teaching, Change Your Thinking”
Andrew Watson

When I attended my first Learning and the Brain conference, more than a decade ago, I had a simple plan:

Step 1: Listen to the researcher’s advice.

Step 2: Do what the researcher told me to do.

Step 3: Watch my students learn more.

Step 4: Quietly glow in the satisfaction that my teaching is research-based.

In fact, I tried to follow that plan for several years. Only gradually did I discover that it simply couldn’t work.

Why?

Because researchers’ advice almost always applies to a very specific, narrow set of circumstances.

The teaching technique they use to help — say — college students learn calculus might not help my 10th graders write better Macbeth essays.

Or: their teaching strategy relies on a technology that my Montessori school forbids.

Or: research on American adolescents might not yield results that help teens raised in other cultures.

In other words: psychology and neuroscience research don’t provide me with a handy checklist. I don’t just need to change what I do; I need to change how I think. I really wish someone had said to me:

“Before you change your teaching, change your thinking.”

Example the First

I thought of this advice when I saw a recent Twitter post by Otto Warman (@MrOWarman), a math teacher in Britain.

Warman has gone WAY beyond following a researcher’s checklist. Instead, he has synthesized an impressive amount of research, and reorganized it all into a lesson-planning system that works for him.

As you can see, his lesson plan form (which he has generously shared) prompts him to begin class with retrieval practice, then to introduce new information, then to check for understanding, and so forth.

Each circle and slice of the diagram includes helpful reminders about the key concepts that he’s putting into action.

That is: he’s not simply enacting someone else’s program in a routinized way. He has, instead, RETHOUGHT his approach to lesson planning in order to use research-supported strategies most appropriately and effectively.

To be clear: I DO NOT think you should print up this sheet and start using it yourself. That would be a way to change what you do, not necessarily a way to change what you think. The strategies that he has adopted might not apply to your students or your subject.

Instead, I DO THINK you should find inspiration in Warman’s example.

What new lesson plan form would you devise?

Are there cognitive-science concepts you should prioritize in your teaching?

Will your students benefit especially from XYZ, but not so much from P, Q, or R?

The more you reorganize ideas to fit your particular circumstances, the more they will help your teaching and your students.

Example the Second

Over on his blog (which you should be reading), Adam Boxer worries that we might be making a mess of retrieval practice.

Done correctly, retrieval practice yields all sorts of important benefits. Done badly, however, it provides few benefits. And takes up time.

For that reason, he explains quite specifically how his school has put retrieval practice to work. As you’ll see when you review his post, this system probably won’t work if teachers simply go through the steps.

Instead, we have to understand the cognitive science behind retrieval practice. Why does it work? What are the boundary conditions limiting its effectiveness? How do we ensure that the research-based practice fits the very specific demands of our classes, subjects, and students?

Retrieval practice isn’t just something to do; it’s a way to think about creating desirable difficulty. Without the thinking, the doing won’t help.

To Sum Up

What’s the best checklist for explaining a concept clearly? There is no checklist: think differently about working memory and schema theory.

What’s the best daily schedule for a school? There is no best schedule: think differently about attention.

What steps are most powerful in helping students manage stress? Before we work through the steps, we have to think differently about students’ emotional and cognitive systems.

To-do lists are straightforward and easy. Teaching is complex and hard. Think different.

“Successive Relearning”: 1 + 1 = +10%
Andrew Watson
Andrew Watson

We teachers get LOTS of advice from cognitive science. Research tells us to…

…monitor and manage our students’ stress levels.

…use mid-class exercise to enhance attention.

…interleave topics to create desirable difficulties.

Each individual suggestion has lots of research behind it, and we’re glad to get these ideas.

But: what happens when we start thinking about combinations?

If we use more than one technique at a time, do the benefits add onto each other? Do they conflict with each other? How can we link up research-informed strategies to create the best overall learning experience?

Focus on Memory

In the last ten years, we’ve seen a real explosion in research about long-term memory formation (aka, learning).

We’ve seen that retrieval practice takes no more time than simple review, but results in lots more learning.

We’ve seen that spreading practice out (aka, spacing) helps students learn better than bunching practice together.

So, here’s the big question: what happens if we do both? Is retrieval practice + spacing more powerful than each technique by itself? Is 1+1 greater than 2?

A research team at Kent State recently explored this question.

In this study, researchers developed a complex study paradigm that created both retrieval practice and spacing. Unlike some retrieval practice exercises — which simply require students to try to remember the answer — this paradigm required students to get questions correct before they were done.

Researchers called this combination successive relearning. Students used successive relearning for some of the topics they learned in an advanced biopsychology course. They used their regular (“business-as-usual”) study techniques for the others.
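As I understand the paradigm, it can be sketched as a simple loop: each session is retrieval practice to criterion, and repeating sessions over time supplies the spacing. The function names and the flashcard interface below are my own illustrative assumptions, not the researchers’ actual procedure.

```python
import random

def relearn_to_criterion(cards, answer_fn):
    """One retrieval session: keep re-asking each item until the
    student has recalled it correctly once (criterion = 1)."""
    remaining = list(cards)        # (question, answer) pairs
    attempts = 0
    while remaining:
        random.shuffle(remaining)
        still_wrong = []
        for question, answer in remaining:
            attempts += 1
            if answer_fn(question) != answer:   # a retrieval attempt
                still_wrong.append((question, answer))
        remaining = still_wrong    # missed items come back around
    return attempts

def successive_relearning(cards, answer_fn, sessions=3):
    """Spacing: repeat the relearn-to-criterion session on several
    separated occasions (for example, days apart)."""
    return [relearn_to_criterion(cards, answer_fn) for _ in range(sessions)]
```

The combination is the point: the inner loop is retrieval practice (students must produce correct answers, not merely reread), and the outer loop spreads that effort across multiple sessions.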

Did successive relearning help students learn?

Answers, and More Questions

To some degree, the answer to that question is: it depends on what you compare to what.

Should the researchers compare this year’s students to last year’s students in the same course?

Should they compare students’ memory of topics where they did use successive relearning to topics where they didn’t?

Being thorough, this research team compared lots of variables to lots of other variables. Quite consistently, they found that “successive relearning” helped.

That is: this year’s students learned more than last year’s. Among this year’s students, successive relearning helped students remember more than their “business-as-usual” strategies.

Roughly speaking, students remembered at least 10% more using successive relearning than other strategies.

So: 1 + 1 = +10%

Case closed.

Case Reopened?

But wait just a minute here.

If you’ve got a good memory, this blog post might be ringing some bells.

Back in February of this year, I wrote about a study in which spacing helped students learn, but — in the long run — retrieval practice didn’t.

If you’ve got an AMAZING memory, you might recall a blog post from June of 2018. Researchers polled students about their study habits. They found that students did benefit from their own retrieval practice, but did not benefit from their spacing — the exact opposite result.

What’s going on here? Why did the two studies differ from each other? Why are they different from this study I’ve just described?

My hypothesis: specifics matter.

In those previous studies, the classes already included one of these techniques.

That is: the study I wrote about in February looked at a math class. Math classes already include lots of retrieval practice, because problem solving is a kind of RP. So, additional retrieval practice didn’t help. But the spacing did.

And, in the class I wrote about in 2018, the professor’s syllabus already included lots of spacing: cumulative review sheets and cumulative exams. So additional spacing done by the students didn’t help. But their retrieval practice did.

In this most recent study, students benefitted from both because the biopsychology course didn’t include either.

In other words: the best combination of retrieval practice and spacing will depend — in part — on the structure and content of the course you’re teaching.

Final Thoughts

Here’s how I concluded my post back in February:

In my own view, we can ask/expect our students to join us in retrieval practice strategies. Once they reach a certain age or grade, they should be able to make flashcards, or use Quizlet, or test one another.

However, I think spacing requires a different perspective on the full scope of a course. That is: it requires a teacher’s perspective. We have the long view, and see how all the pieces best fit together.

For those reasons, I think we can (and should) ask students to do retrieval practice (in addition to the retrieval practice we create). But, we ourselves should take responsibility for spacing. We — much more than they — have the big picture in mind. We should take that task off their to-do list, and keep it squarely on ours.

That’s an opinion, not a research conclusion. But I still think it’s true.

The Benefits of “Testing” Depend on the DEFINITION of “Testing.” And the TIMING. And…
Andrew Watson

Whenever I want to start a food fight at a faculty meeting (admit it: you know the feeling), I contemplate shouting: “What do we all think about testing?”

Almost certainly, several colleagues will decry the use of discriminatory high-stakes tests that stress and mis-categorize our students.

Others will angrily retort that research into retrieval practice shows that well-designed tests help students learn. Heck, “retrieval practice” yields great benefits because of “the testing effect.”

Still others will heatedly contrast formative assessment with summative assessment.

Like the Goddess of Discord with my Golden Apple, I will have inspired an enduring battle.

Of course, this battle results from confusion about definitions.

Scholars who champion retrieval practice or formative assessment might be said to “favor testing,” but they favor very specific kinds of testing. The testing they champion (probably) looks nothing like the high-stakes test that my first respondents so loathe.

In other words: my colleagues don’t necessarily disagree with each other. Because this one word has different meanings, they’re probably arguing about different topics without realizing it.

Confusion Gets Clearer

Even if I narrow my question to “pre-testing,” I’ve still created many opportunities for confusion.

I might, for instance, “pre-test” my students about the myth of the Golden Apple because I want to know what they already know.

My goal, in other words, isn’t to evaluate them. Instead, I want to align my lessons with their current knowledge. After all, I need one lesson plan for a class which has never heard of Zeus, Aphrodite, or Troy; and a completely different lesson plan for a class that read The Iliad last year.

I’m “pre-testing” as an early step in my own lesson planning.

On the other hand, we have a fascinating research pool suggesting that “pre-testing” itself might help students learn. That is: the act of taking that pretest can improve their ultimate understanding of the material later in the unit.

Amazingly, according to this research pool, these pre-tests benefit students even if they get all the answers wrong. (Of course they get the answers wrong. They haven’t yet done the unit with me.)

For instance, Dr. Lindsay Richland and Co. pre-tested students on a passage about color-blindness. They found that students who took a pretest (and got all the answers wrong) ultimately learned more than students who used that extra time to study the passage. (As I’ve written earlier, I love this study because Richland works SO HARD to disprove her own hypothesis.)

Getting the Specifics JUST RIGHT

So far, we’ve seen that the benefits of testing depend on the definition of testing. While we know that some tests stress and demotivate students, we’ve also got research suggesting that a very specific kind of pre-testing might help students learn.

Of course, we know that psychology research always includes boundary conditions. A teaching technique that works in one set of circumstances might not work in others.

So, for instance, a teaching technique that helps 3rd graders learn might not help college students. Or, a strategy to help students write well might not help them learn to pirouette in dance class.

We know that there will be boundaries for this (very specific kind of) pre-testing. What are they?

As is so often the case, this question has led to complexity and controversy. For instance: several scholars have found that pretesting helps students make new connections only if they already know a little bit about the material.

A study from 2014, however, suggests that pre-testing helps students even if they don’t know anything about the material.

For teachers, this distinction matters.

If students need at least a little prior knowledge for pretesting to be helpful, we will use this technique less often. If, however, they don’t need that prior knowledge, then our decision to limit its use robs them of a chance to learn.

To use this technique correctly, we really need to know the right answer.

Today’s Research: Activating Prior Knowledge

A recent study, led by Dr. Tina Seabrooke, tries to sort through this intricate and important question. In fact, because the details require such nuanced distinctions, she ended up running five different experiments. Only by considering all five together could she and her team reach a conclusion.

Of course, with five different studies underway, Seabrooke’s work has lots of nooks and subtleties to explore. Rather than wander through them all, let me cut to the chase:

TL; DR

Pre-testing will probably help students learn and understand a topic better if they already know something about it.

If students don’t know anything about the subject, pretests don’t help much.

More specifically: pretests might help them remember some words from the questions, and some words from the answers. But — crucially — pretests won’t help students make connections between the questions and the answers.

Of course, we really want students to make those connections. Any definition of “understanding” a topic will include the ability to answer meaningful questions about it.

You might think about pretesting this way:

Pretests help students activate useful prior knowledge. If they don’t have relevant prior knowledge, then pretests don’t have anything useful to draw upon.

Final Word

Research into “pretesting” is still ongoing, and we’re still learning new and useful information.

I suggest that teachers use this technique from time to time as a way to activate prior knowledge. I wouldn’t require it as part of my daily routine.

And: I would keep my eyes peeled for the next relevant study. We’ve still got lots more to learn on this subject…

Can We Improve Our Students’ Executive Function? Will That Help Them Read Better?
Andrew Watson

Here’s a fun exercise. I’m going to give you a list of words. You try to sort them into two groups, based on the sound they begin with:

cup, bag, bread, can, box, cookie, cake, bucket, corn, beans, crate, banana

Presumably you came up with /k/ sounds and /b/ sounds:

cup, can, crate, cookie, cake, corn

beans, bread, banana, box, bag, bucket

Okay, now go back and RESORT those words into two groups, based on the category they belong to.

Presumably you came up with containers and foods:

cup, can, crate, box, bag, bucket

cookie, cake, corn, beans, banana, bread

If you succeeded, BRAVO! You demonstrated cognitive flexibility: an executive function that allows you to change your thought process mid-stream.
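The two sorting rules above can be written out explicitly. Note two assumptions of mine: using the initial letter as a stand-in for the /k/ and /b/ sounds, and the particular container list — both happen to hold for this word set but are not from any research instrument.

```python
WORDS = ["cup", "bag", "bread", "can", "box", "cookie",
         "cake", "bucket", "corn", "beans", "crate", "banana"]

# For this particular list, the first letter is a reliable proxy
# for the initial sound: every /k/ word starts with "c".
def sort_by_sound(words):
    """Rule 1: group by initial sound (/k/ vs. /b/)."""
    k_words = [w for w in words if w.startswith("c")]
    b_words = [w for w in words if w.startswith("b")]
    return k_words, b_words

CONTAINERS = {"cup", "can", "crate", "box", "bag", "bucket"}

def sort_by_category(words):
    """Rule 2: same words, grouped instead by category (container vs. food)."""
    containers = [w for w in words if w in CONTAINERS]
    foods = [w for w in words if w not in CONTAINERS]
    return containers, foods
```

The switch is what matters: the words don’t change, but the sorting rule does, and applying the new rule means inhibiting the old one.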

Believe it or not, we have to learn this particular skill.

In the video below, for instance, 3-year-olds sort cards according to their color (“red goes here, blue goes there”). They’re usually good at that. However, when the rules change to focus on shape (“trucks go here, flowers go there”), they struggle to follow the different instructions.

https://www.youtube.com/watch?v=tXZau5VIIvU

Why? Because they haven’t yet developed the executive function of cognitive flexibility.

New Research: Improving Reading

For a number of reasons, we might think that this general executive function (cognitive flexibility) might support a specific academic skill (reading).

If that’s true, then maybe we can help struggling readers by training their cognitive flexibility. (This possibility relies on several assumptions; the scholars who did this work have lots of research supporting each one.)

To test this possibility, Kelly Cartwright & Co. had teachers spend several weeks training a group of 2nd–5th graders in cognitive flexibility.

Basically, those students repeated that word-sorting/resorting exercise you did at the top of this post. And, they tried a more complicated fill-in-the-blank version of that task as well.

The results?

Compared with other struggling readers, these students got better at cognitive flexibility. And — here’s the big news — they got better at reading as well. (More specifically: they didn’t get better at individual word recognition, but they got better at reading comprehension and grade level reading.)

So, in this research, Cartwright’s team found that training a particular executive function helps struggling readers do better.

Technically speaking, that’s awesome.

As Always, the Caveats

First: as Dan Willingham says in his Twitter bio, “One study is just one study, folks.” Even if Cartwright and Co. did everything right, it’s possible their results are a fluke. We won’t know until many other scholars succeed in replicating and extending this finding.

Second: We shouldn’t extrapolate too far based on this study. We don’t know if training other executive functions would help struggling readers. We don’t know if training EF benefits typical readers; or, people first learning to read; or, improves the performance of sophisticated readers.

Those questions are important — but not addressed directly by this research.

Third: Both reading instruction and executive function are hotly controversial topics. (Heck, I wrote a post about a month ago questioning the very idea of a “general” executive function.) I wouldn’t be surprised if this research (or my summary of it) prompted stern rebukes from scholars/practitioners with a different understanding of reading/EF processes.

I wouldn’t even be surprised if those stern rebukes were correct. If you’ve got an alternative perspective (and some research behind it), please let me know.

But, with those caveats in mind, this research strikes me as exciting and potentially powerful. Any strategy to help struggling readers should get our attention. One that a) costs essentially no money, b) doesn’t take very long, and c) can be done so easily might be a real boon to schools, students, and readers.

Watch this space…

The Best Length of Time for a Class [Repost]
Andrew Watson

Quite consistently, this post has been among the most searched for and most popular on the blog.

Teachers and administrators REALLY want to know: What is the optimal amount of time for our students to meet? What’s the very best schedule?

Here’s the best answer I have:


I met yesterday with several thoughtful teachers who had resonant questions about education research.


How do we balance factual learning and deep thinking?

What’s “the right amount of stress” during a test?

How can we promote collaboration while honoring individual differences?

And:

What’s the optimal class length?

This question comes up often. Should we have lots of short classes, so every subject meets every day? Should we have a few longer classes, so that we can dig deeply into a particular topic without interruption?

Debates sometimes fall along disciplinary lines. Foreign language and math teachers often want frequent class meetings; English and History teachers typically like bigger chunks of time for discussions.

Science teachers just gotta have 80 minutes to run a lab well.

But: what does research show?

Class Length: What Research Tells Us

As far as I know, we just don’t have a clear answer to that question.

Over at the Education Endowment Foundation, for example, they’ve investigated the benefits of block scheduling: that is, a few long periods rather than several short ones.

The finding: we can’t really say. Or, to quote EEF: “There is no consistent pattern in the evidence.”

More precisely:

The evidence suggests that how teachers use the time they are allocated is more important than the length of lesson or the schedule of lessons, and hence that the introduction of block scheduling is unlikely to raise attainment by itself.

By implication, a change away from block scheduling shouldn’t raise attainment either.

The point is not how long we teach but how well we teach with the time we’ve got.

For this reason, I often counsel schools and teachers: before you change your schedule, study human attention systems.

Once teachers know how attention works — and it’s A LOT more complicated than we might have thought — we’ll be much better at helping students learn. (If you have the chance to attend a Learning and the Brain session about attention: RUN, don’t walk.)

Class Length: What Research Can’t Tell Us

Research doesn’t answer this question, I think, because it can’t. There’s no one correct answer.

If you teach 2nd graders or 7th graders or 11th graders, you’ll probably find that different lengths of time work better.

If you teach in cultures that inculcate patience and concentration, longer classes will work better than in cultures with a more get-up-and-go kind of pace.

The number of students in the class might matter.

The experience of the teacher almost certainly matters.

When your school starts investigating schedules, therefore, I suggest you start with these essentials:

First: study human attention.

Second: don’t design “the optimal schedule.” Design the optimal schedule for your school and your students. It might not work at anyone else’s school, but it doesn’t need to.

A schedule that works for you and your students is the closest to optimal that you can get.

Laptop Notes or Handwritten Notes? Even the New York Times Has It Wrong [Reposted]
Andrew Watson

You’ll often hear the claim: “research says students remember more when they take notes by hand than when they use laptops.”

The best-known research on the topic was done in 2014.

You’ll be surprised to discover that this conclusion in fact CONTRADICTS the researchers’ own findings. Here’s the story, which I wrote about back in 2018…


Here’s a hypothetical situation:

Let’s say that psychology researchers clearly demonstrate that retrieval practice helps students form long-term memories better than rereading the textbook does.

However, despite this clear evidence, these researchers emphatically tell students to avoid retrieval practice and instead reread the textbook. These researchers have two justifications for their perverse recommendation:

First: students aren’t currently doing retrieval practice, and

Second: they can’t possibly learn how to do so.

Because we are teachers, we are likely to respond this way: “Wait a minute! Students learn how to do new things all the time. If retrieval practice is better, we should teach them how to do it, and then they’ll learn more. This solution is perfectly obvious.”

Of course it is. It’s PERFECTLY OBVIOUS.

Believe It Or Not…

This hypothetical situation is, in fact, all too real.

In 2014, Pam Mueller and Dan Oppenheimer did a blockbuster study comparing the learning advantages of handwritten notes to laptop notes.

Their data clearly suggest that laptop notes ought to be superior to handwritten notes as long as students learn to take notes the correct way.

(The correct way is: students should reword the professor’s lecture, rather than simply copy the words down verbatim.)

However — amazingly — the study concludes

First: students aren’t currently rewording their professor’s lecture, and

Second: they can’t possibly learn how to do so.

Because of these two beliefs, Mueller and Oppenheimer argue that — in their witty title — “The Pen is Mightier than the Laptop.”

But, as we’ve seen in the hypothetical above, this conclusion is PERFECTLY OBVIOUSLY incorrect.

Students can learn how to do new things. They do so all the time. Learning to do new things is the point of school.

If students can learn to reword the professor’s lecture when taking notes on a laptop, then Mueller and Oppenheimer’s own data suggest that they’ll learn more. And yes, I do mean “learn more than people who take handwritten notes.”

(Why? Because laptop note-takers can write more words than handwriters, and in M&O’s research, more words lead to more learning.)

And yet, despite the self-evident logic of this argument, the belief that handwritten notes are superior to laptop notes has won the day.

That argument is commonplace in the field of psychology. (Here’s a recent example.)

Even the New York Times has embraced it.

The Fine Print

I do need to be clear about the limits of my argument:

First: I do NOT argue that a study has been done supporting my specific hypothesis. That is: as far as I know, no one has trained students to take reworded laptop notes, and found a learning benefit over reworded handwritten notes. That conclusion is the logical hypothesis based on Mueller and Oppenheimer’s research, but we have no explicit research support yet.

Second: I do NOT discount the importance of internet distractions. Of course students using laptops might be easily distracted by Twinsta-face-gram-book. (Like everyone else, I cite Faria Sana’s research to emphasize this point.)

However, that’s not the argument that Mueller and Oppenheimer are making. Their research isn’t about internet distractions; it’s about the importance of reworded notes vs. verbatim notes.

Third: I often hear the argument that the physical act of writing helps encode learning more richly than the physical act of typing. When I ask for research supporting that contention, people send me articles about 1st and 2nd graders learning to write.

It is, I suppose, possible that this research about 1st graders applies to college students taking notes. But, that’s a very substantial extrapolation — much grander than my own modest extrapolation of Mueller and Oppenheimer’s research.

And, again, it’s NOT the argument that M&O are making.

To believe that the kinesthetics of handwriting make an essential difference to learning, I want to find a study showing that the physical act of writing helps high school/college students who are taking handwritten notes learn more. Absent that research, this argument is even more hypothetical than my own.

Hopeful Conclusion

The field of Mind, Brain, & Education promises that the whole will be greater than the sum of the parts.

That is: if psychologists and neuroscientists and teachers work together, we can all help each other understand how to do our work better.

Frequently, advice from the world of psychology gives teachers wise guidance. (For example: retrieval practice.)

In this case, we teachers can give psychology wise guidance. The founding assumption of the Mueller and Oppenheimer study — that students can’t learn to do new things — simply isn’t true. No one knows that better than teachers do.

If we can keep this essential truth at the front of psychology and neuroscience research, we can benefit the work that they do, and improve the advice that they give.

Growing Mindsets in Argentina? [Repost]
Andrew Watson

Since I first published this post a year ago, there’s been an important change to its argument: the study I’m writing about now HAS been published in a peer-reviewed journal.

As the Mindset skepticism movement gains further steam, I was struck by a comment on this study from the invaluable Dan Willingham. If I remember this correctly, he tweeted (roughly): “The mystery is that we haven’t been able to make this theory work in the classroom.”

Note the elegant middle ground this comment finds. Willingham acknowledges both the decades of scrupulous work that Dweck and her colleagues undertook, and that classroom interventions haven’t had the effect we’d like (for most students).

He doesn’t say (as others mean-spiritedly imply) that Dweck is a fraud. He doesn’t say (as others blithely imply) that we don’t need to worry about the rising number of classroom non-replications.

Instead, he says: “this intervention works under some circumstances, but not under others. We don’t yet know why. If we did, that would be SUPER helpful.”

I myself — as I argue below — think that we went too far thinking that upbeat posters on the wall would radically change students’ motivation, and now we’re going too far in arguing the whole Mindset thing is bunk.

OF COURSE one-time interventions don’t work. Are we truly surprised by this? PERHAPS creating a different school climate will work. Is that so preposterous an argument?

In any case: here’s what I wrote in July of 2019…


Mindset theory has faced increasing skepticism in recent years.

For four decades — literally! — Carol Dweck and other researchers ran thoughtful studies with thousands of students. Over and over, they found that students who think about their work in particular ways (shorthand, “growth mindset”) do better than those who don’t (“fixed mindset”).

Like other areas of psychology (think “power poses”), Mindset Theory has been caught up in the “replication crisis.”

In brief: if Mindset theory is true, then a mindset intervention should help no matter who does the intervening. It should work when Dweck’s team does it with her students, and when I do so with mine.

If it works only for Dweck, well, that doesn’t really help the rest of us.

And, several researchers have found that various strategies didn’t replicate.

A much publicized meta-analysis, published last summer, suggests that Mindset interventions had very small effects. (I myself think this meta-analysis has been over-interpreted; you can see my analysis here.)

Today’s News

Researcher and NYU professor Alejandro Ganimian has published research about a large-scale mindset intervention in Argentina.

Ganimian had 12th graders at 100 (!) schools read a passage arguing that “persisting through difficult challenges can develop the brain.”

The 12th graders then wrote “a letter to a classmate of their choice on the three main lessons from the reading and how they might help him/her.”

To keep the growth mindset message fresh, those letters were posted in the classroom.

He compared these students to 12th graders at 102 other schools that had not used this intervention.

The results? Nada. Nothin’. Bupkis.

Specifically:

This intervention had “no effect on students’ propensity to find challenging tasks less intimidating.”

It didn’t increase the likelihood that they would pay attention in class.

By some rough/indirect measures, it didn’t have an effect on the participants’ academic success.

As Ganimian sums up his results:

In nearly all outcomes, I can rule out even small effects. …

This study suggests that the benefits of growth mindset interventions may be more challenging to replicate and scale in developing countries than anticipated.

What Should Teachers Do?

First: two clarifying points. a) Ganimian’s research hasn’t been peer reviewed and published in a journal. It is currently a working paper, hosted on his website. [Ed. 8/2020: Ganimian’s research now has been published: see link at the top of this post.]

And b) I myself am not a neutral source in this debate. I’ve written a book about mindset research, and so I read Ganimian’s work through that lens.

Second: I think mindset strategies are likeliest to have an effect when used all together as a consistent, unified approach to student motivation.

That is: I’m not at all surprised that a “one-shot” intervention doesn’t have big results. (Some research has found success with “one-shot” interventions; I’ve always been skeptical.)

So, if you want to use mindset research in your classrooms, don’t do just one thing, once. A motivational poster really won’t accomplish much of anything.

Instead, understand the interconnecting strategies that promote a growth-mindset climate, and use them consistently and subtly. Heck, I can even recommend a book that will show you the way.

Third: Here’s what I wrote last October:

We should not, of course, ask mindset to solve all our problems. Nor should we ask retrieval practice to solve all problems. Or short bursts of in-class exercise.

No one change fixes everything.

Instead, we should see Mindset Theory as one useful tool that can help many of our students.

Obsessed with Working Memory [Reposted]
Andrew Watson
Andrew Watson

I’m on vacation for the month of August, and so we’ll be reposting some of our most-viewed articles.

We’re starting with our series on working memory: one of the most essential concepts from the field of cognitive science.


When I attended my first Learning and the Brain conference, I had never even heard of working memory.

Now, I obsess over working memory. And, I think all classroom teachers should join me.

Heck, I think everyone who cares about learning, curriculum, teacher training, and education should think about working memory. All. The. Time.

In this series of posts, I’ll start by defining working memory (WM) today. And in succeeding posts, I’ll talk about using that knowledge most helpfully.

Trust me: the more we think about WM, the more our students learn.

Working Memory: An Example

As an example of WM in action, I’m going to give you a list of 5 words. Please put those words in alphabetical order. IN YOUR HEAD. (That’s right: don’t write anything down…)

Okay, here’s the list:

Think of the five workdays of the week. (Hint: if you live in a Western society, the first one is ‘Monday.’)

Now, go ahead and put those five words into alphabetical order. Don’t peek. I’ll wait…

 

Probably you came up with this list:

Friday, Monday, Thursday, Tuesday, Wednesday

I do this exercise with teachers often. For most everyone, that’s fairly simple to do. I’m guessing you got it right quite easily.

Working Memory: A Definition

To succeed at that task, you undertook four mental processes.

First, you selected relevant information. Specifically, you selected the instructions that you read. And, you looked into your long-term memory to select the workdays of the week.

Next, you held that information. If you had let go of the instructions, or of the days of the week, you couldn’t have completed the task.

Third, you reorganized the days of the week according to the instructions. You started with a chronological list (Monday, Tuesday, Wednesday…), and converted it into an alphabetical list (Friday, Monday, Thursday…).

In many WM tasks (but not this one), you might not only reorganize, but also combine information. If, for instance, you added up 7+12+4+18+6 in your head, you selected, held, and combined those numbers into a new number.
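For readers who like to see those verbs made concrete, here's a minimal Python sketch mirroring both tasks above. The variable names and step comments are mine, purely illustrative; this is an analogy for the four mental processes, not a cognitive model:

```python
# Select: pull the five workdays from "long-term memory".
workdays = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"]

# Hold + reorganize: keep the list available and sort it alphabetically.
alphabetical = sorted(workdays)
print(alphabetical)  # ['Friday', 'Monday', 'Thursday', 'Tuesday', 'Wednesday']

# Combine: the mental-arithmetic variant from the text,
# merging several numbers into one new number.
total = sum([7, 12, 4, 18, 6])
print(total)  # 47
```

The point of the analogy: a computer does this effortlessly, but a human juggling the instructions, the word list, and the partial ordering all at once is taxing a sharply limited resource.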

So:

Working memory is a limited, short-term memory capacity that selects, holds, reorganizes, and combines information from multiple sources.

In a later post, I’ll talk about some finer points in the definition of WM. For the time being, focus on those four verbs: select, hold, reorganize, combine.

Working Memory: An Acronym

Because WM is so important, it would be great if there were a handy acronym. Happily, there is!

Select

Hold

REorganize

Kombine

What does that get you? SHREK! (I know: I misspelled ‘combine.’ But: I lived in Prague for a year, so you can forgive me for that useful alteration.)

Working Memory in the Classroom

Now, ask yourself: which of these classroom tasks requires working memory?

That is: in which of these cases do your students have to select, hold, reorganize, and/or combine information?

Solving a word problem.

Comparing W.E.B. Du Bois and Booker T. Washington.

Transposing a song into a new key.

Applying a new phonics rule to various combinations of letters.

Choreographing a dance routine.

The correct answer is: ALL OF THEM.

In fact, practically everything we do in school classrooms requires working memory. Often, it requires A LOT of working memory.

To Sum Up

We use WM to select, hold, reorganize, and combine (SHREK) information.

Students use WM constantly in classrooms, for practically everything they do.

Simply put: no academic information gets into long-term memory except through working memory. It’s that important.

Up next: we’ll highlight key facts about WM. Then we’ll talk about using that knowledge in your teaching.


This series continues:

Part II: Three Core Ideas for Working Memory

Part III: Anticipating Overload

Part IV: Identifying Overload

Part V: Working Memory Solutions

Part VI: Working Memory Resources

Deliberate Practice Doesn’t Align with Schooling (Well: Not Precisely)
Andrew Watson
Andrew Watson

With his research into expertise – concert-level violinists, world-ranked chess players, elite runners – Anders Ericsson more-or-less created a new field of study.

How can we become amazingly awesome at challenging tasks? Ericsson has a system: deliberate practice.

As described in his book Peak (written with Robert Pool), deliberate practice has four key components:

Well defined, specific goals,

Focus,

Feedback (often from an expert, or an experienced teacher), and

Getting out of your comfort zone.

Gosh, that sounds a lot like school, doesn’t it? If we could structure our school thinking according to Ericsson’s research, perhaps we could help all our students become concert-level chemists, world-ranked fraction multipliers, and elite poetry analysts.

In fact, we already try to do so much of this, don’t we? We write goals on the board, encourage students to concentrate, give lots o’ feedback, and encourage students to try new things.

In other words: deliberate practice seems a perfect fit for schools. Obviously…

Or then again: maybe not.

The Popular Mistakes

Ericsson’s work has been most popularized by Malcolm Gladwell’s book Outliers. You might oversimplify that book with this sentence: “The Beatles succeeded so spectacularly because they practiced 10,000 hours in Hamburg.”

Peak briskly summarizes Gladwell’s inaccuracies:

First: 10,000 hours is a catchy round number, but lots of other numbers would have been just as accurate. 10,000 hours applies to one category of budding experts (musicians) at a particular stage (the age of 20) of learning one specific skill (the violin).

Second: even this much-touted number is correct only as an average. Half of the violinists whose data went into this number had practiced LESS than 10,000 hours.

Third: the Beatles weren’t practicing. They were performing. Ericsson’s research shows clearly: deliberate practice looks substantially different from ultimate successful performance.

These inaccuracies – important in themselves – also remind us: if we want to apply Ericsson’s research to our school work, we have to be more careful than Gladwell.

With that guidance in mind, let’s consider the fit between deliberate practice and education.

The GOALS Are Different

Research into deliberate practice focuses quite narrowly on specific kinds of learning.

Ericsson studied people wanting to be world champions in one (and only one) very specialized skill: chess, or hurdling, or concert piano playing.

He did NOT study what most teachers do: helping students be good enough at one skill to move on to the next.

For instance: I don’t want my students to win the “Angle-Side-Angle World Mathlympic Championship Gold Medal.” I want them to understand angle-side-angle well enough to move on to side-angle-side; and, ultimately, well enough to solve complex geometry proofs.

I don’t want them to win more National Man Booker Nobel Book Prizes than anyone else. I want them to write good enough Macbeth essays so they’ll write even better Kindred essays.

In fact, I don’t want them to focus single-mindedly on any one thing. I want them to make gradual progress in all sorts of disciplines and skills: pottery, cooperation, Spanish, history, citizenship, driver’s ed.

It’s possible that deliberate practice will improve all kinds of learning – including school learning. But: let’s not be like Gladwell and simply make that assumption.

Our UNDERSTANDING OF TEACHING Is Different

Ericsson puts it this way:

One of the things that differentiates violin training from training in other areas – soccer, for example, or algebra – is that the set of skills expected of a violinist is quite standardized, as are many of the instruction techniques.

Because most violin techniques are decades or even centuries old, the field has had the chance to zero in on the proper or “best” way to hold the violin, to move the hand during vibrato, to move the bow during spiccato, and so on.

The various techniques may not be easy to master, but a student can be shown exactly what to do and how to do it. (Peak, p. 91)

Does that sound like education to you? Heck, we can’t get the field to agree on teaching strategies for one of education’s most foundational skills: learning how to read. Almost everything in our world is up for contentious debate.

Note that Ericsson is explicit: instruction techniques for algebra do not fit the pattern he studies. We don’t have decades-old tried-and-true techniques for teaching algebra (or grammar, or bunting).

That’s why education is hanging out with psychology and neuroscience: to develop and understand new techniques.

The Role of FEEDBACK Is Different

Ericsson’s model follows a precise feedback pattern:

The student practices a discrete skill.

The teacher provides specific feedback.

The student tries again, and improves.

The student recognizes her immediate progress, and continues to grow.

In education, however, the cause/effect relationship between feedback and progress gets MUCH more complicated.

Specifically, we know that short-term performance does not reliably predict long-term learning. In a research review that I cite often, Nick Soderstrom makes this important claim:

“Improvements in [short-term] performance can fail to yield significant [long-term] learning—and, in fact, … certain manipulations can have opposite effects on learning and performance.” (Emphasis added)

In fact, we’ve got an entire field of memory research that focuses on “desirable difficulties.” The relevant headline: if students get everything right immediately, their work isn’t difficult enough. We need them to be struggling more to ensure long-term learning.

If Soderstrom and the “desirable difficulties” team are right – and I certainly think they are – then the feedback pattern essential to deliberate practice doesn’t align with the kind of teaching and learning that schools prioritize.

We Think Differently about FUN

Throughout Peak, Ericsson and Pool emphasize that deliberate practice requires determination and focus, and rarely results in fun.

Experts don’t become experts because they enjoy this work more. They keep going despite their lack of enjoyment.

For instance, he describes a study of participants taking a singing lesson. Those participants who were NOT professional singers felt relaxed, energized, and elated after the lesson; it allowed them to express themselves in a way they didn’t usually get to do.

However, the participants who WERE professional singers felt relaxed and energized, but NOT elated. They were working, not expressing themselves. In Ericsson’s words, “there was focus but no joy” (p. 151).

Schools, however, want at least a little fun – maybe even a little joy – during the day. We needn’t focus excessively on making everything delightful. But, more than a deliberate practice model does, we should keep in mind our students’ rightful need for connection and even elation.

In Conclusion

First: although I’m arguing that deliberate practice doesn’t necessarily promote the kind of learning that schools undertake, I do (of course!) admire this research pool, and Ericsson’s towering role in it.

Second: Education suffers from a strange problem right now: we’ve got too many varieties of plausible-sounding guidance.

The problem isn’t finding something to try. It’s deciding which of the dozens (hundreds?) of options to choose.

I certainly think that a deliberate practice model might be useful for teachers to know – especially teachers who focus on creating world-level experts.

But: I don’t think it should be the primary educational model for most of us.

We should think about managing working memory overload. And fostering attention. And creating the optimal level of desirable difficulty.

Let’s not be like Gladwell and simplistically apply Ericsson’s model to our work. Let’s find the parts that fit us perfectly, and use those to help students reach their Peak.


If you’re the sort of person who reads this blog, you’re also the sort of person likely to know that Anders Ericsson passed away at the beginning of July. In addition to being a world-renowned scientist, he was also famous for being an immensely kind person.

Certainly that was our experience here at Learning and the Brain. We have so many reasons to miss him.

Retrieval Practice is GREAT. Can We Make It Better?
Andrew Watson
Andrew Watson

By now you know that retrieval practice has lots (and lots) (and LOTS) of research behind it. (If you’d like a handy comprehensive resource, check out this website. Or this book.)

The short version: don’t have students review by putting information back into their brains — say, by rereading a chapter. Instead, have them pull information out of their brains — say, by quizzing themselves on that chapter.

It’s REALLY effective.

When we know that a technique works in general, we start asking increasingly precise questions about it.

Does it work for children and adult learners? (Yes.)

Does it work for facts and concepts? (Yes.)

Does it work for physical skills? (Yes.)

Does it work when students do badly on their retrieval practice exercises? Um. This is awkward. Not so much.

That is: when students score below 50% on a retrieval practice exercise, then retrieval practice is less helpful than simple review.

How do we fix this problem?

“Diminishing Cues” and Common Sense

Let’s say I want to explain Posner and Rothbart’s “Tripartite Theory of Attention.” In their research, attention results from three cognitive sub-processes: “alertness,” “orienting,” and “executive attention.”

Depending on the complexity of the information I provide, this explanation might get confusing. If a retrieval practice exercise simply asks students to name those three processes, they might not do very well.

Common sense suggests a simple strategy: diminishing cues.

The first time I do a retrieval practice exercise on this topic, I provide substantial cues:

“Fill in these blanks: Posner and Rothbart say that attention results from al______, or_____, and ex_______ at______.”

A few days later, I might ask:

“Fill in these blanks: Posner and Rothbart say that attention results from ______, _____, and _______  ______.”

A week later:

“What three sub-processes create attention, in Posner and Rothbart’s view?”

And finally:

“Describe how attention works.”

The first instance requires students to retrieve, but offers lots of support for that retrieval. Over time, they have to do more and more of the cognitive work. By the end, I’m asking a pure retrieval question.
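To show how mechanically the cue schedule can shrink, here's a hypothetical Python helper. The function name, the cue levels, and the underscore formatting are all my own inventions for illustration; no study prescribes this exact scheme:

```python
def cue(term: str, letters: int) -> str:
    """Show the first `letters` letters of each word in a term;
    replace the remaining letters with underscores."""
    blanked = []
    for word in term.split():
        shown = word[:letters]
        blanked.append(shown + "_" * (len(word) - len(shown)))
    return " ".join(blanked)

terms = ["alertness", "orienting", "executive attention"]

# Session 1: generous cues (first two letters shown).
print([cue(t, 2) for t in terms])
# ['al_______', 'or_______', 'ex_______ at_______']

# Later session: blanks only -- students retrieve with no letter support.
print([cue(t, 0) for t in terms])
```

The final stages of the sequence (a direct question, then an open prompt) drop the blanks entirely, so the scaffolding disappears on a schedule the teacher controls.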

“Diminishing Cues” and Research

So, common sense tells us this strategy might work. In fact, I know teachers who have stumbled across this approach on their own.

Here at Learning and the Brain, we like common sense and we REALLY like research. Do we have research to support our instincts?

Yes.

In 2017, two researchers put together an impressive combination of studies.

They looked at different study strategies: review, retrieval practice, diminishing-cues retrieval practice.

They tested participants after different lengths of time: right away, 24 hours later, a week later.

They tested different amounts of studying: 3 sessions, 6 sessions…

You get the idea.

Because they ran SO MANY studies, they’ve got LOTS of data to report.

The short version: “diminishing cues retrieval practice” ALWAYS helped more than traditional review (rereading the chapter). And it OFTEN helped more than plain-old retrieval practice (self-quizzing on the chapter).

If you want the details, you can check out the study yourself; it’s not terribly jargony. The process is a bit complicated, but the key concepts are easy to grasp.

To Sum Up

Retrieval practice helps students learn.

If we want to ensure that it works optimally, we should use it multiple times — and successively remove more and more scaffolding from the retrieval practice questions we ask.

Common sense and research agree.