How Does Self-Control Really Work? Introducing a Debate
Andrew Watson

Every teacher I know wishes that our students could control themselves just a little bit better. Or, occasionally, a whole lot better.

Rarely do we worry that students have too much self-control.

All these observations prompt us to ask: how does this thing called self-control really work?

In the field of psychology, that question has led to a fierce debate. If you’d like to enter into that debate, well, I’ve got some resources for you!

A Very Brief Introduction

Roy Baumeister has developed a well-known theory about self-control. You can read about it in depth in his book Willpower: Rediscovering the Greatest Human Strength, written with John Tierney.

Think of self-control as a kind of inner reservoir. My reservoir starts the day full. However, when I come down for breakfast, I see lots of bacon. I know I…MUST…RESIST…BACON, and that self-control effort drains my reservoir a bit.

However, once I finish my oatmeal and leave the kitchen, the bacon no longer tempts me so strongly. I’ve stopped draining the reservoir, and it can refill.

Baumeister’s theory focuses on all the things that drain the reservoir, and all the strategies we can use to a) refill it, or b) expand it.

Baumeister calls this process by a somewhat puzzling name: “ego depletion.” The “depletion” part makes good sense: my reservoir is depleted. The “ego” part isn’t as intuitive, but we’ll get used to that over time.

The key point: in recent years, the theory of ego depletion has come under debate — especially as part of the larger “replication crisis” in psychology.

Some say the theory has (literally) hundreds of studies supporting it. Others note methodological problems, and worry that non-replications languish in file drawers.

Welcome Aboard

Because self-control is so important to teachers, you just might be intrigued and want to learn more.

One great resource is a podcast, charmingly titled “Two Psychologists, Four Beers.” A couple times a month, Yoel Inbar and Michael Inzlicht get together over a few brews and chat about a topic.

In this episode, they talk about this controversy at length and in detail. SO MUCH interesting and helpful information here.

One key point to know: Inzlicht himself is a key doubter of Baumeister’s research. He’s not a dispassionate observer, but an important critic.

Friendly On Ramp

However interested you are in the topic of self-control, you might not have 80 minutes to devote to it.

Or, you might worry it will be overly complex to understand the first time through.

Good news! Ahmad Assinnari has put together a point-by-point summary of the podcast. 

You could read it as an introduction to an upcoming debate, and/or follow along to be sure you’re tracking the argument clearly. (BTW: Assinnari refers to Inzlicht both as “Inzlicht” and as “Michael.” And, beware: it’s easy to confuse “Michael” with “Michel,” another scholar in the field.)

So, if you’d like to learn more, but you’re not sure you want to read Baumeister’s book, this post serves as an introduction to Assinnari’s summary. And, Assinnari’s summary introduces the podcast.

With these few steps, you’ll be up to speed on a very important debate.

A Fresh Approach to Evaluating Working Memory Training
Andrew Watson

Because working memory is SO IMPORTANT for learning, we would love to enhance our students’ WM capacity.

Alas, over and over, we find that WM training programs just don’t work (here and here and here). I’ve written about this question so often that I’ve called an informal moratorium. Unless there’s something new to say, or a resurgence of attempts to promote such products, I’ll stop repeating this point.

Recently I’ve come across a book chapter that does offer something new. A research team led by Claudia C. von Bastian used a very powerful statistical method to analyze the effectiveness of WM training programs.

This new methodology (which I’ll talk about below) encourages us to approach the question with fresh eyes. That is: before I read von Bastian’s work, I reminded myself that it might well contradict my prior beliefs.

It might show that WM training does work. And, if it shows that, I need to announce that conclusion as loudly as I’ve announced earlier doubts.

In other words: there’s no point in reading this chapter simply to confirm what I already believe. And, reader, the same applies for you. I hereby encourage you: prepare to have your beliefs about WM training challenged. You shouldn’t read the rest of this post unless you’re open to that possibility.

New Methodology

One problem with arguments about WM training is that sample sizes are so small. In one recent meta-analysis, the average sample size per study was 20 participants.

In a recent book on cognitive training, von Bastian, Guye, and De Simoni note that small sample sizes lead to quirky p-values. In other words, we struggle to be sure that the findings of small studies don’t result from chance or error.

Instead, von Bastian & Co. propose using Bayes factors: an alternate technique for evaluating the reliability of a finding, especially with small sample sizes. The specifics here go WAY beyond the level of this blog, but the authors summarize handy tags for interpreting Bayes factors:

1-3: Ambiguous
3-10: Substantial
10-30: Strong
30-100: Very Strong
100+: Decisive

They then calculate Bayes factors for 28 studies of WM training.
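The interpretive scale above is easy to apply mechanically. Here’s a minimal sketch in Python; the function name, the boundary handling, and the label for values below 1 are my own assumptions, not the authors’:

```python
def interpret_bayes_factor(bf: float) -> str:
    """Map a Bayes factor (evidence for a hypothesis) onto the
    interpretive tags summarized by von Bastian and colleagues.
    Boundary values go to the higher category here, and the label
    for bf < 1 is my own addition."""
    if bf < 1:
        return "Evidence against the hypothesis"
    if bf < 3:
        return "Ambiguous"
    if bf < 10:
        return "Substantial"
    if bf < 30:
        return "Strong"
    if bf < 100:
        return "Very Strong"
    return "Decisive"

print(interpret_bayes_factor(2.5))   # → Ambiguous
print(interpret_bayes_factor(42))    # → Very Strong
```

So a study whose Bayes factor lands between 1 and 3 tells us essentially nothing either way — which, as we’ll see, describes most of the studies.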

Drum Roll, Please…

We’ve braced ourselves for the possibility that a new analytical method will overturn our prior convictions. Does it?

Well, two of the 28 studies “very strongly” suggest WM training works. One of the 28 “substantially” supports WM training. Nineteen are “ambiguous.” And six “substantially” suggest that WM training has no effect.

In other words: three of the 28 meaningfully support the hypothesis. The other 25 are neutral or negative.

So, in a word: “no.” Whichever method you use to evaluate the success of WM training, we just don’t have good reason to believe that it works.

Especially when such training takes a long time, and costs lots of money, schools should continue to be wary.

Three Final Notes

First: I’ve focused on p-values and Bayes factors in this blog post. But, von Bastian’s team emphasizes a number of problems in this field. For instance: WM training research frequently lacks an “active” control group. And, it often lacks a substantial theory, beyond “cognitive capacities should be trainable.”

Second: This research team is itself working on an intriguing hypothesis right now. They wonder if working memory capacity cannot be trained, but working memory efficiency can be trained. That’s a subtle but meaningful distinction, and I’m glad to see they’re exploring this question.

So far they’re getting mixed results, and don’t make strong claims. But, I’ll keep an eye on this possibility — and I’ll report back if they develop helpful strategies.

Third: I encouraged you to read von Bastian’s chapter because it might change your mind. As it turns out, the chapter probably didn’t. Instead it confirmed what you (and certainly I) already thought.

Nonetheless, that was an important mental exercise. Those of us committed to relying on research for teaching guidance should be prepared to change our approach when research leads us in a new direction.

Because, you know, some day a new WM training paradigm just might work.


von Bastian, C. C., Guye, S., & De Simoni, C. (2019). How strong is the evidence for the effectiveness of working memory training? In M. F. Bunting, J. M. Novick, M. R. Dougherty & R. W. Engle (Eds.), Cognitive and Working Memory Training: Perspectives from Psychology, Neuroscience, and Human Development (pp. 58–75). Oxford University Press.

Where Should Students Study?
Andrew Watson

We’ve got lots of advice for the students in our lives:

How to study: retrieval practice

When to study: spacing effect

Why study: so many answers

Where to study: …um, hold please, your call is very important to us…

As can happen, research provides counter-intuitive — and sometimes contradictory — answers to that last question.

I grew up hearing the confident proclamation that we should create a perfect study environment in one place, and always study there. (The word “library” was spoken in reverent tones.)

As I think about the research I’ve seen in the last ten years, my own recommendations to students have been evolving.

Classic Beginnings

In a deservedly famous study, Smith, Glenberg and Bjork (1978) tried to measure the effect of environment on memory.

They found that, in the short run, I associate the words that I learn in this room with the room itself. That is: if I learn words in room 27, I’ll do better on a test of those words in room 27 than in room 52.

One way to interpret those findings is that we should teach in the place where students will be tested.

If the final exam, inevitably, is in the gym, I should teach my students in the gym. And they should study in the gym. This approach ensures that they’ll associate their new knowledge with the place they have to demonstrate that knowledge.

In this theory, students should learn and study in the place they’ll ultimately be tested.

Priority Fix #1

This interpretation of Smith’s work makes sense if — and only if — the goal of learning is to do well on tests.

Of course, that’s not my goal. I don’t want my students to think carefully about literature for the test; I want them to think carefully about literature for life.

I want them to have excellent writing skills now, and whenever in the future they need to write effectively and clearly.

We might reasonably worry that a strong association between the room and the content would limit transfer. That is: if I connect the material I’ve learned so strongly with room 27, or the gym, I might struggle to remember or use it anywhere else.

Smith worried about that too. And, sure enough, when he tested that hypothesis, his research supported it.

In other words, he found that students who study material in different locations can use it more flexibly elsewhere. Students who study material in only one location can’t transfer their learning so easily. (By the way: Smith’s research has been replicated. You can read about this in Benedict Carey’s How We Learn. Check out chapter 3.)

This finding leads to a wholly different piece of advice. Don’t do what my teachers told me to do when I was a student. Instead, study material in as many different places as reasonably possible. That breadth of study will spread learning associations as widely as possible, and benefit transfer.

That’s what I’ve been telling students for the last several years.

Voila. Generations of teaching advice overturned by research!

Priority Fix #2

Frequent readers have heard me say: “Researchers work by isolating variables. Schools work by combining variables.”

The longer I do this work, the longer I think that this “where to study” advice makes sense only if I focus exclusively on that one variable.

If I start adding in other variables, well, maybe not so much.

True enough, research shows that I’ll remember a topic better if I study it in different places … as long as all other variables are held constant. But, in life, other variables aren’t constant.

Specifically, some study locations are noisier than others. Starbucks is louder than the library: it just is. And, some locations are visually busier than others.

And, as you would expect, noise — such as music — distracts from learning. So, too, do visually busy environments.

So, a more honest set of guidelines for students goes like this:

You should review material in different places. But, you want each of those places to be quiet. And, you don’t want them to have much by way of visual distraction.

You know what that sounds like to me? The library.

I suppose it’s possible for students to come up with several different study locations that are equally quiet and visually bland. Speaking as a high school teacher, I think it’s unlikely they’ll actually do that.

So, unless they’ve got the bandwidth to manage all those demands even before they sit down to study, then I think the traditional advice (“library!”) is as good as anything.

Final Thoughts

People occasionally ask me where I am in the “traditional vs. progressive” education debate.

The honest answer is: I’m indifferent to it. I (try to) focus on practical interpretations of pertinent psychology and neuroscience research.

If that research leads to a seemingly innovative suggestion (“study in many locations!”), that’s fine. If it leads to a traditional position (“library”), that’s equally fine.

I think that, for the most part, having teams in education (prog vs. trad) doesn’t help. If we measure results as best we can, and think humbly and open-mindedly about the teaching implications, we’ll serve our students best.

Today’s Humble Pie: 206 Bones
Andrew Watson

Back in early November, I wrote about a study seeming to contrast direct instruction with more constructivist approaches.

I argued that those labels simply didn’t apply to the actual teaching methodologies measured in the research.

So, the “inquiry and problem-based pedagogy” [IPP] used all sorts of direct instruction. Here’s the authors’ summary of that method; I’ve put some words in bold:

“When done well, IPP includes elements of explicit instruction and scaffolding.

Teachers facilitate learning by guiding students through a series of steps and explicitly relating learning to students’ prior knowledge and experiences.

Teachers guide learners through complex tasks with explicit instructions that are relevant to the problems at hand.

They provide structure and scaffolding that help students not only carry out specific activities, but also comprehend why they are doing those activities and how they are related to the set of core concepts they are exploring.”

So, as you can see “direct instruction” techniques are built into this method.

And, the method described as “traditional” seems to me an obvious straw man. Again, quoting from the research:

“Students copy facts about bone tissues and the names of 206 bones of the human skeleton that teachers have written on the blackboard into notebooks.”

I mean, seriously, who does that? Copies the names of 206 bones? Except for Charles Dickens’s Gradgrind — “facts, facts, facts!” — who would do such a thing?

Slice of Pie

I was discussing this study with a friend recently, and it turns out: her college professor would do such a thing. Right here in Massachusetts — home of the very first Learning and the Brain conference! — her vertebrate anatomy professor put up slides of skeletons and labeled all the bones.

Slide after slide after slide. After slide. (Apparently he interspersed them with funny stories, just to keep the students awake. In my friend’s telling: his students remembered the stories, but not the anatomy.)

Except for the funny stories, Gradgrind would be proud indeed.

In any case, it’s clear that this “traditional” method is NOT a straw man, and at least one professor seems to think it a good idea.

So, to be clear: I do think asking students to memorize some core facts is not only plausible but beneficial. Without essential information in long-term memory, working memory will be overloaded by too much external information.

But: I can’t think of any research-based argument for an entire class — much less an entire course! — devoted to listing bones. That’s not direct instruction. That’s purgatory.

Two Core Points

Point from November’s post: as I wrote back in November, we can’t use this research to champion a pure constructivist approach to learning, because IPP includes lots o’ direct instruction.

Point from today’s post: “direct instruction” does not mean “presenting unbroken lists of facts, and then calling for them to be repeated.” Even if that really happens [shudder], that’s a profound misunderstanding of research and terminology.

“Direct instruction” does mean introducing enough factual or conceptual information to allow students to work thoughtfully — and increasingly independently — on a well-scaffolded series of problems.

Of course, this definition can be expanded and formalized. But: whatever you call “copy the names of 206 bones,” please don’t call it direct instruction.

Is it Better to be a “Natural” or a “Striver”?
Andrew Watson

Consider top performers in a given field: inventors, artists, athletes, academics, and so forth.

Presumably, their elite performance results from some mysterious combination of innate ability and effortful practice.

But which of those two variables matters more, the ability or the practice?

And — here’s a super interesting question — does my answer to that explicit question line up with the implicit value judgments that I make in real life?

In other words: I might say I prefer ability (or practice), but end up valuing the practice (or ability).

How might we measure such a troubling possibility?

Expert Evaluators

Two researchers — Chia-Jung Tsay and Mahzarin Banaji — developed a clever strategy to answer this question.

Tsay and Banaji gave professional musicians brief bios of two pianists. One bio emphasized all the hard work that the pianist had put into her growth as a musician. (In the researchers’ language, she was a “striver.”)

The other bio emphasized the innate ability that the pianist had. (She was a “natural.”)

The expert musicians then heard brief excerpts of recordings of these two musicians. They rated the performances on various scales, including their “musical achievement,” and whether or not they would like to hear the performance again.

Finally, they answered questions asking them directly whether they valued “effortful training” or “natural talent.”

What did the researchers learn from all these questions and evaluations?

The Envelope, Please

Tsay and Banaji’s research paradigm includes a surprise: the two brief musical excerpts came from the same pianist playing the same piece. Heck, they were from the same recording.

In other words: they were of identical musical achievement. And, we would predict that the expert evaluators would be equally eager to hear these two performances again — because they were the same performance.

When asked explicitly, the evaluators said they valued practice more than talent. (The d value here is 0.57, which is noteworthy.) So, presumably, given this set of circumstances, they might prefer the performance by the striver.

But, nope.

They preferred the natural. (The d value here is 0.79. That’s really big.)

So, even though the performances were equally accomplished, and the evaluators said they valued effort, their evaluations suggest that they actually valued talent.
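If those d values are unfamiliar: Cohen’s d expresses the difference between two group means in units of their pooled standard deviation, so d = 0.79 means the ratings differed by roughly four-fifths of a standard deviation. A minimal sketch of the standard formula (the toy numbers below are mine, not the study’s):

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: the difference between two group means,
    divided by the pooled (sample) standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Toy data: two groups whose means differ by half a pooled
# standard deviation.
print(cohens_d([2, 4, 6], [1, 3, 5]))  # → 0.5
```

As a rough convention, 0.2 counts as a small effect, 0.5 as medium, and 0.8 as large — which is why 0.79 is “really big.”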

Teaching Implications

First: we shouldn’t panic. This is one study looking at a specific evaluation of a specific kind of expert performance. Yes: Tsay and Banaji did all the responsible things to test their hypothesis in different ways — I haven’t summarized two related experiments they did.

But: before we extrapolate too zealously, we should be curious about other research into this question.

Second: Specifically, I wonder how much this preference for “naturals” over “strivers” reflects cultural influence. This research was done in an American cultural context. Are Americans unusually keen on talent over effort? What do we find when we look within other cultural norms?

Third: Even with these caveats, I myself will be even more skeptical about my ability to judge between talent and effort objectively. I’m sure that, if you ask me, I’ll tell you I value the effort. But, this research suggests I’ll make decisions based on my appreciation of your talent.

To take a provocative example: when I talk with people who manage “Gifted and Talented” programs, I often hear they value hard work as much as “gifts and talents.” In the future, I will encourage people with those (laudable) values to look under the hood.

Do they have systems in place to measure hard work? Do those measurements, in fact, influence program decisions? Do they — more specifically — benefit people who truly work harder?

In sum: if we in fact value striving, then we should be sure we reward striving — even though it might not feel natural to do so.

“How We Learn”: Wise Teaching Guidance from a Really Brainy Guy
Andrew Watson

Imagine that you ask a neuro-expert: “What’s the most important brain information for teachers to know?”

The answer you get will depend on the expertise of the person you ask.

If you ask Stanislas Dehaene, well, you’ll get LOTS of answers — because he has so many areas of brain expertise.

He is, for example, a professor of experimental cognitive psychology at the Collège de France, and Director of the NeuroSpin Center, where they’re building the largest MRI gizmo in the world. (Yup, you read that right. IN THE WORLD.)

He has in fact written several books on neuroscience: neuroscience and reading, neuroscience and math, even neuroscience and human consciousness.

He’s also President of a newly established council to ensure that teacher education in all of France has scientific backing: the Scientific Council for Education. (If the United States had such a committee, we could expunge Learning Styles myths from teacher training overnight.)

If that’s not enough, Dehaene is interested in artificial intelligence. And statistics. And evolution.

So, when he writes a book called How We Learn: Why Brains Learn Better than Any Machine…for Now, you know you’re going to get all sorts of wise advice.

Practical Teaching Advice

Dehaene wants teachers to think about “four pillars” central to the learning process.

Pillar 1: Attention

Pillar 2: Active engagement

Pillar 3: Error feedback

Pillar 4: Consolidation

As you can see, this blueprint offers practical and flexible guidance for our work. If we know how to help students pay attention (#1), how to help them engage substantively with the ideas under discussion (#2), how to offer the right kind of feedback at the right time (#3), and how to shape practice that fosters consolidation (#4), we’ll have masterful classrooms indeed.

Learning, of course, begins with Attention: we can’t learn about things we don’t pay attention to. Following Michael Posner’s framework, Dehaene sees attention not as one cognitive process, but as a combination of three distinct cognitive processes.

Helpfully, he simplifies these processes into three intuitive steps. Students have to know:

when to pay attention

what to pay attention to, and

how to pay attention.

Once teachers start thinking about attention this way, we can see all sorts of new possibilities for our craft. Happily, he has suggestions.

Like other writers, Dehaene wants teachers to focus on active engagement (pillar #2). More than other writers, he emphasizes that “active” doesn’t necessarily mean moving. In other words, active engagement requires not physical engagement but cognitive engagement.

This misunderstanding has led to many needlessly chaotic classroom strategies, all in the name of “active learning.” So, Dehaene’s emphasis here is particularly helpful and important.

What’s the best way to create cognitive (not physical) engagement?

“There is no single miraculous method, but rather a whole range of approaches that force students to think for themselves, such as: practical activities, discussions in which everyone takes part, small group work, or teachers who interrupt their class to ask a difficult question.”

Error Feedback (pillar #3) and Consolidation (#4) both get equally measured and helpful chapters. As with the first two, Dehaene works to dispel myths that have muddled our approaches to teaching, and to offer practical suggestions to guide our classroom practice.

Underneath the “Four Pillars”

These four groups of suggestions all rest on a sophisticated understanding of what used to be called the “nature/nurture” debate.

Dehaene digs deeply into both sides of the question, helping teachers understand the brain’s adaptability (“nurture”) and the limits of that adaptability (“nature”).

To take but one example: research with babies makes it quite clear that brains are not “blank slates.” We come with pre-wired modules for processing language, numbers, faces, and all sorts of other things.

One example in particular surprised me: probability. Imagine that you put ten red marbles and ten green marbles in a bag. As you start drawing marbles back out of that bag, a 6-month-old will be surprised — and increasingly surprised — if you draw out green marble after green marble after green marble.

That is: the baby understands probability. They know it’s increasingly likely you’ll draw a red marble, and increasingly surprising that you don’t. Don’t believe me? Check out chapter 3: “Babies’ Invisible Knowledge.”
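The arithmetic behind the babies’ surprise is easy to make explicit. Here’s a sketch, assuming draws without replacement from a bag of ten green and ten red marbles (the function itself is just my illustration of the setup above):

```python
from fractions import Fraction

def prob_all_green(draws, greens=10, reds=10):
    """Probability of pulling `draws` green marbles in a row,
    without replacement, from a bag of greens + reds."""
    total = greens + reds
    p = Fraction(1)
    for i in range(draws):
        p *= Fraction(greens - i, total - i)
    return p

# Each successive all-green run is less likely -- hence more surprising.
for k in (1, 3, 5):
    print(k, float(prob_all_green(k)))
```

On this model, one green draw has probability 1/2, but three in a row already drops to about 10 percent — matching the babies’ mounting surprise.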

Of course, Dehaene has fascinating stories to tell about the brain’s plasticity as well. He describes several experiments — unknown to me — where traumatized rats were reconditioned to prefer the room where the traumatizing shock initially took place.

He also tells the amazing story of “neuronal recycling.” That is: the neural real-estate we train to read initially housed other (evolutionarily essential) cognitive functions.

Human Brains and Machine Learning

Dehaene opens his book by contemplating definitions of learning — and by contrasting humans and machines in their ability to do so.

By one set of measures, computers have us beat.

For instance, one computer was programmed with the rules of the game Go, and then trained to play against itself. In three hours, it became better at the game than the human Go champion. And, it got better from there.

However, Dehaene still thinks humans are the better learners. Unlike humans, machines can’t generalize their learning. In other words: that Go computer can’t play any other games. In fact, if you changed the size of the Go board even slightly, it would be utterly stumped.

And, unlike humans, it can’t explain its learning to anyone else.

And, humans need relatively little data to start learning. Machines do better than us when they can crank millions of calculations. But, when they calculate as slowly as we do, they don’t learn nearly as much as we do.

As his subtitle reassures us, brains learn better than any machine. (And, based on my conversation with him, it’s clear that “…for now” means “for the long foreseeable future.”)

Final Thoughts

At this point, you see what I mean when I wrote that Dehaene has an impressive list of brain interests, and therefore offers an impressive catalog of brain guidance.

You might, however, wonder if this much technical information ends up being a little dry.

The answer is: absolutely not.

Dehaene’s fascination with all things brain is indeed palpable in this book. And, his library of amazing studies and compelling anecdotes keeps the book fresh and easy-to-read. I simply lost track of the number of times I wrote “WOW” in the margin.

This has been a great year for brain books. Whether you’re new to the field, or looking to deepen your understanding, I recommend How We Learn enthusiastically.


Whose Online Teaching Advice Do You Trust?
Andrew Watson

Many people who offer teaching advice cite psychology and neuroscience research to support their arguments.

If you don’t have time to read that underlying research — or even the expertise to evaluate its nuances — how can you know whom to trust?

There are, of course, MANY answers to that question (for instance: here and here and here). I want to focus on a very simple one today.

My advice comes in the form of a paradox: You should be likelier to TRUST people who tell you to DOUBT them.

Example #1

I thought of this paradox last week when reading a blogger’s thoughts on Jeffrey Bowers. Here’s the third paragraph of the blogger’s post:

I am a researcher working in the field of cognitive load theory. I am also a teacher, a parent and a blogger with a lot of experience of ideological resistance to phonics teaching and some experience of how reading is taught in the wild. All of these incline me towards the systematic teaching of phonics. I am aware that Bowers’ paper will be used by phonics sceptics to bolster their argument and that predisposes me to find fault in it. Bear that in mind.

In this paragraph, the blogger gives the reader some background on his position in an ongoing argument.

He does not claim to read Bowers’s work as an above-the-fray, omniscient deity.

Instead, he comes to this post with perspectives — let’s just say it: biases — that shape his response to Bowers’s research.

And he explicitly urges his reader to “bear [those biases] in mind.”

Of course, in the world of science, “bias” doesn’t need to have a negative connotation. We all have perspectives/biases.

By reminding you of these perspectives — that is, his limitations — the blogger gives you reasons to doubt his opinion.

And my argument is: because he reminded you to doubt him, you should be willing to trust him a little bit more.

The blogger here is Greg Ashman, who writes a blog entitled Filling the Pail. Lots of people disagree with Ashman quite vehemently, and he disagrees right back.

My point in this case is not to endorse his opinions. (I never write about reading instruction, because it’s so complicated and I don’t know enough about it to have a developed opinion.)

But, anyone who highlights his own limitations and knowledge gaps in an argument gets my respect.

Example #2

Over on Twitter, a professor recently tweeted out a quotation from the executive summary of a review. (The specific topic isn’t important for the argument I’m making.)

Soon after, he tweeted this:

“When I tweeted out [X’s] new review of [Y] a few days ago, I pulled a non-representative quote from the exec summary.

It seemed to criticize [Y] by saying [Z] … [However, Z is] not the key criticism in the review. Here I’ve clipped more serious concerns.”

He then posted 4 substantive passages highlighting the review’s core critiques of Y.

In other words, this professor told you “I BLEW IT. I created an incorrect impression of the review’s objections.”

You know what I’m about to say now. Because this professor highlighted reasons you should doubt him — he blew it — I myself think you should trust him more.

We all make mistakes. Unlike many of us (?), this professor admitted the mistake publicly, and then corrected it at length.

In this case, the professor is Daniel Willingham — one of the most important scholars working to translate cognitive psychology for classroom teachers.

He’s written a book on the subject of skepticism: When Can You Trust the Experts? So, it’s entirely in character for Willingham to correct his mistake.

But even if you didn’t know he’d written such a book, you would think you could trust him because he highlighted the reasons you should not.

In Sum

Look for thinkers who highlight the limitations of the research. Who acknowledge their own biases and gaps in understanding. Who admit the strengths of opposing viewpoints.

If you hear from someone who is ENTIRELY CERTAIN that ALL THE RESEARCH shows THIS PSYCHOLOGICAL PRINCIPLE WORKS FOR ALL STUDENTS IN ALL CLASSROOMS — their lack of self-doubt should result in your lack of trust.

I’m Curious: Does Curiosity Promote Learning?
Andrew Watson

Conventional wisdom tells us that curiosity is bad for cats but good for learning.

What does psychology research tell us?

We’ve got a few decades of research showing links between curiosity and learning. A precise description of those links a) would be REALLY helpful for teachers, and b) is hard to complete.

In a recent study, Dr. Shirlene Wade and Dr. Celeste Kidd tried to fill out that description.

Four Key Variables

Wade and Kidd invited adults to take a trivia test. The test included quite challenging questions: “What U.S. president’s face graces a $100,000 bill?” (In case you haven’t handled any $100,000 bills lately, the answer is: Woodrow Wilson.)

After the participants guessed the answer, they rated a) their confidence that they got the answer right, and b) their curiosity about the actual answer. They then saw the correct answer to the question.

After being distracted for a while, they then tried to answer the same trivia questions again.

This research paradigm allowed Wade and Kidd to measure

Participants’ curiosity: how much did they want to know the answer?

Their confidence: how much did they think they already knew?

Their prior knowledge: how much did they actually know?

and

Their learning: how many additional answers did they get right?

And, of course, Wade and Kidd could start looking for relationships among these variables.

What Promotes Curiosity?

Participants, of course, weren’t equally curious about all the answers. Instead, their curiosity depended on their confidence.

Specifically, when participants were almost sure — but not completely sure — that they knew the right answer, then they were most curious.

Notice, crucially, that their actual prior knowledge didn’t predict curiosity. So, if they thought they were probably right (high confidence) but were actually quite badly wrong (low prior knowledge), they still were highly curious about the answer.

What Promotes Learning?

The early part of the study shows that confidence (not actual knowledge) predicts curiosity.

But: what predicts learning? If a participant got a question wrong initially, what helped him/her learn the correct answer and get it right on the later test?

The answer is: not curiosity. Instead, the answer is actual prior knowledge.

So, back to the question about the $100,000 bill. If I had predicted that … say … Mahatma Gandhi’s picture were on the bill, well, that’s just wildly wrong.

But, if I had predicted that William Howard Taft’s face were on the bill, well, I would have been pretty close. If nothing else, Taft served as president immediately before Wilson. And, Taft was also Chief Justice of the Supreme Court — so his historical importance might justify being on such a big bill.

So: students who think they’re almost right will be more curious; students who are almost right will learn faster.

Teaching Implications

As always, I should emphasize that this is just one study. And, in this one study, adults learned answers to trivia questions. They were tested almost right away.

This research paradigm leads to interesting findings, but it doesn’t tell us exactly how to teach our students (who might not be adults) our curriculum (which, almost certainly, isn’t answers to trivia questions). And, we can’t be 100% certain that it resulted in long-term learning.

In any case, I think the teaching implications are: we should focus both on our students’ curiosity and on their prior knowledge.

That is: we want them to reasonably believe that they’re close to learning the answer. And, we want them to have enough prior knowledge to absorb the answer when they get it.

That interpretation doesn’t sound shocking.

However, it does offer some useful warnings. If we hear of a teaching methodology that focuses entirely on curiosity, or entirely on prior knowledge, we should hesitate before embracing it.

After all: curiosity inspires students to keep working. And prior knowledge allows them to learn from their curiosity-inspired efforts.

Retrieval Grids: The Good, the Bad, and the Potential Solutions
Andrew Watson
Andrew Watson

Retrieval Practice is all the rage these days — for the very excellent reason that it works.

Study after study after study suggests that students learn more when they pull information out of their brains (“retrieval practice”) than by putting information back into their brains (“mere review”).

So, teachers naturally want to know: what specific teaching and studying strategies count as retrieval practice?

We’ve got lots (and lots) of answers.

Flashcards — believe it or not — prompt retrieval practice.

Short answer quizzes, even if ungraded. (Perhaps especially if ungraded.)

Individual white boards ensure that each student writes his/her own answer.

So, you can see that this general research finding opens many distinct avenues for classroom practice.

Retrieval Grids: The Good

One strategy in particular has been getting lots of Twitter love recently: the retrieval grid.

You can quickly see the benefits.

In the retrieval-grid quiz below, students answer short questions about Macbeth. Crucially, the grid includes questions from this week (in yellow), last week (in blue), and two weeks ago (in red). 

Because students get more points for answering older/harder questions, the form encourages retrieval of weeks-old information.

So much retrieval-practice goodness packed into so little space. (By the way: this “quiz” can be graded, but doesn’t need to be. You could frame it as a simple review exercise.)
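The scoring scheme described above can be sketched in a few lines of code. This is a toy model only: the point values, the Macbeth questions, and the grid itself are all made up for illustration, not taken from any actual retrieval grid.

```python
# A minimal model of a retrieval-grid quiz: each question carries a point
# value based on how long ago the material was taught, so answering older
# questions earns more. All values here are hypothetical illustrations.

# Hypothetical point values: older material is worth more.
POINTS = {"this week": 1, "last week": 2, "two weeks ago": 3}

# A toy grid of Macbeth questions, each tagged by the week it was taught.
grid = [
    ("Who kills Macbeth?", "this week"),
    ("What do the witches predict?", "last week"),
    ("Where does the play open?", "two weeks ago"),
]

def score(answered):
    """Total points for the (question, week) cells a student answered."""
    return sum(POINTS[week] for _, week in answered)

# A student who answers one question from each week earns 1 + 2 + 3 = 6 points.
print(score(grid))  # 6
```

Notice that the student, unlike this program, has to run the `score` tally in working memory while also retrieving the Macbeth answers — which is exactly the concern raised in the next section.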

Retrieval Grids: My Worries

Although I like everything that I’ve said so far, I do have an enduring concern about this format: the looming potential for working memory overload.

Students making their way through this grid must process two different streams of information simultaneously.

In part of their working memory, they’re processing answers to Macbeth questions.

And, with other parts of their working memory, they’re processing and holding the number of points that they’ve earned.

And, of course, those two different processing streams aren’t conceptually related to each other. One is Macbeth plot information; the other is math/number information.

As you know from my summer series on working memory, that’s a recipe for cognitive overload.

Retrieval Grids: Solutions?

To be clear: I’m not opposed to retrieval grids. All that retrieval practice could help students substantially.

I hope we can find ways to get the grid’s good parts (retrieval practice) without the bad parts (WM overload).

I don’t know of any research on this subject, but I do have some suggestions.

First: “if it isn’t a problem, it isn’t a problem.” If your students are doing just fine on retrieval grids, then obviously their WM isn’t overwhelmed. Keep on keepin’ on.

But, if your students do struggle with this format, try reducing the demands for simultaneous processing. You could…

Second: remove the math from the process. Instead of requiring 15 points (which demands addition), you could simply require that students answer two questions from each week. You could even put all the week-1 questions in the same row or column, in order to simplify the process. Or,

Third: include the math on the answer sheet. If they write down the points that they’ve earned at the same time they answer the question, they don’t have to hold that information in mind. After all, it’s right there on the paper. So, a well-designed answer sheet could reduce WM demands.

Fourth: no doubt, as you read this, you are already coming up with your own solutions. If you have an idea that sounds better than these — try it! (And, I hope you’ll share it with me.)
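The second suggestion above — replace the point tally with a simple coverage rule — can be sketched the same way. Again, the questions, week labels, and the two-per-week threshold are hypothetical stand-ins:

```python
from collections import Counter

# Hypothetical answered cells, each tagged by the week the material was taught.
answered = [
    ("Who kills Macbeth?", "this week"),
    ("Who is Banquo?", "this week"),
    ("What do the witches predict?", "last week"),
    ("Why does Lady Macbeth sleepwalk?", "last week"),
    ("Where does the play open?", "two weeks ago"),
    ("Who is king at the start?", "two weeks ago"),
]

def meets_requirement(answered, per_week=2):
    """True if the student answered at least `per_week` questions from each week."""
    counts = Counter(week for _, week in answered)
    weeks = ("this week", "last week", "two weeks ago")
    return all(counts[w] >= per_week for w in weeks)

print(meets_requirement(answered))  # True
```

The design point: checking "two from each week" asks students only to count within a category, not to add unrelated point values, so it should place a lighter load on working memory while preserving the spaced retrieval.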

To Sum Up

Researchers work by isolating variables. Teachers and classrooms work by combining variables.

So: researchers who focus on long-term memory will champion retrieval practice and retrieval grids.

Researchers who focus on working memory might worry about them.

By combining our knowledge of both topics, we teachers can reduce the dangers of WM overload in order to get all the benefits of retrieval practice.

That’s a retrieval-grid win-win.

Should Students Exercise DURING Learning? A Twitter Debate Rages…
Andrew Watson
Andrew Watson

Edu-Twitter loves a good battle, and one erupted just this week.

A teacher posted a video of students reading while pedaling exercise bikes.

Comments roared in thick and fast.

Several people responded with “AWESOME” or “<3 this” or some other emoji for upbeat enthusiasm. But — at least in my Twitter feed — the angry skeptics were as scathing as the early fans were enthusiastic. (The word “bonkers” showed up frequently.)

Twitter doesn’t allow space for nuance (one reason I still write thousand-word blog posts). In this case, I think, the Twitter “debate” would have been greatly improved by context. In fact, it really needed two distinct categories of context.

Context, Part I: The Teacher*

Skeptics who responded to this post — reasonably enough — worried that reading while exercising might interfere with the students’ ability to do either thing well. (I’ll explore this concern in the next section.)

However, I didn’t see any commentary that focused on this important fact: the teacher who created this initiative is a physical education teacher. That is: it’s his job to think about and promote his students’ physical health.

In fact, he has quite a history of trying out imaginative approaches to that goal.

He’s got students playing drums with glow-in-the-dark drumsticks. (And, yes: they’re playing drums in the dark.) He’s got them doing fun obstacle courses. He’s got them kicking field-goals in the gym…by projecting goal-posts on the gym’s upper wall! If nothing else, you know his students will enjoy his class.

And, he cites lots o’ research showing the benefits of aerobic exercise for long-term memory formation. (Again, I’ll talk about this research below.)

So: we might quite reasonably worry that this biking-while-reading initiative won’t have the effect that the PE teacher wants it to. At the same time, any teacher who experiments as frequently as this teacher does will, no doubt, try some things that don’t work along the way.

But, heaven knows, I try plenty of things that don’t work in my teaching — that’s simply the price of being committed to trying new things.

And — again — this guy teaches PE. In my view, he SHOULD be trying to find ways to get more physical activity into his students’ daily schedules. Even if exercise bikes aren’t exactly the right answer, he’s questing in the right direction.

Context, Part II: The Research on Exercise and Learning

So, what does research say about exercise and learning?

In the first place, we can state with real confidence that physical fitness improves learning. We can measure this effect in many ways. But, for instance, if I increase my fitness — trust me, I’m trying! — research strongly suggests I’ll improve several cognitive functions: attention, reaction time, and so forth.

We also know a lot about the neuro-biological mechanisms. For instance: exercise boosts production of brain-derived neurotrophic factor (BDNF). And, BDNF does lots of splendid things to improve synapse formation.

To explore this general pool of research, you can start with John Ratey’s book Spark: The Revolutionary New Science of Exercise and the Brain. It’s more than a decade old now, but still a great introduction to this field.

If you’d like to watch a super-upbeat neuroscience+exercise Ted Talk, click here.

Research on Exercise DURING Learning

But, in the second place, can we conclude that exercise during learning provides benefits, above and beyond the exercise itself? I asked Dr. Steve Most, whose lab has done some really interesting work on exercise and cognition. (You can follow him on Twitter: @SBMost.)

Here’s part of an email Dr. Most sent me:

Most of the research cited [by the teacher who posted the video] seems to do with links between general exercise/fitness and cognition. I think that’s a pretty well established link, but most of it doesn’t say anything about exercise during learning…

I’m not really convinced. One could even imagine that the scenario in the clip entails divided attention (depending on how much attention kids pay to the exercise itself), in which case it could be counter-productive.

I am aware of a study here and there that suggests that mild exercise during study can increase memory, but I don’t think the findings rise to the level of a consistent body of evidence (there may even be findings here and there of the opposite effect).

Like many Twitter objectors, Dr. Most worries that the bike riding might distract from attention to the reading.

At the same time, he added an important caveat. The hypothesis that bicycle exercise during reading harms learning is plausible but also insufficiently tested.

That is, when I speculated to him that the exercise bikes would most likely divide the students’ attention and interfere with their learning, I was speculating as much as the teacher who hoped it would improve their learning. 

My speculation was reasonable, given evidence on the fragility of attention. But so too were the PE teacher’s hopes, given evidence about physical fitness and learning.

And, to be clear, we don’t have lots of research on this precise question, but we do have some. This study and this study both found that moderate-to-vigorous exercise during lessons improved learning.

There are important differences between those research paradigms and the exercise bikes used in the video. (The exercises themselves reinforced the concepts being learned.) And, some of the research cited by the teacher is conspicuously lightweight. (No, “crossing the three mid-lines” doesn’t do anything special for your brain. It really doesn’t.)

But to me, at least, the tentative evidence we have suggests that the teacher’s hopes were far from “bonkers.”

I am, to be clear, skeptical myself. But I do think the idea worthy of study, for a number of reasons.

To Sum Up

First: we know quite confidently that exercise and fitness generally improve learning.

Second: we don’t have much research on the more specific question of exercise during learning. And, the research we do have doesn’t provide a consistent pattern of results.

That is: reading while riding an exercise bike might improve understanding, or impede it, or have no effect. We just don’t have enough research to say with Twitterable confidence.

Third: that being true, I think we should encourage teachers — especially PE teachers — to try plausible (if unproven) hypotheses in their classrooms. If they have plans in place to gather data, they can offer us real insight into new teaching possibilities.

Fourth: Twitter battles — especially those that devolve to emojis and insults — benefit from context. If you see a hot debate, look beyond it for research to guide your understanding.

 


*At the time that I’m writing this post (January 9), the teacher who posted the video has taken it down from Twitter. I’m assuming (but I do not know) that the strong negative reaction prompted him to do so.

For that reason, I am not identifying him in this post, and am not linking to his account. Basically, I’m inferring a request for some degree of peace and privacy in his decision to take the video down.

I have reached out to the teacher to get his perspective on a) the goals of the initiative, and b) his students’ response to it. If I hear from him, I’ll write a follow-up post.