
About Andrew Watson

Andrew began his classroom life as a high-school English teacher in 1988, and has been working in or near schools ever since. In 2008, Andrew began exploring the practical application of psychology and neuroscience in his classroom. In 2011, he earned his M. Ed. from the “Mind, Brain, Education” program at Harvard University. As President of “Translate the Brain,” Andrew now works with teachers, students, administrators, and parents to make learning easier and teaching more effective. He has presented at schools and workshops across the country; he also serves as an adviser to several organizations, including “The People’s Science.” Andrew is the author of "Learning Begins: The Science of Working Memory and Attention for the Classroom Teacher."

Book Review: Teaching Secondary Science, by Adam Boxer
Andrew Watson

Let’s start by making this simple:

First: You should absolutely buy Adam Boxer’s Teaching Secondary Science: A Complete Guide. Sooner is better than later.

Second: You will probably not READ Boxer’s book so much as you will STUDY it. Have a pen handy; some sticky notes; your favorite memory app. Whatever system you use to keep track of big ideas and vital details — have it ready to work.

Now that I’ve been bossy, let me explain why.

Two Big Surprises

Surprise #1:

I myself don’t teach high-school science. (I taught 10th and 12th grade English, and worked at a summer camp for 8-14 year olds.)

So, the title (Teaching Secondary Science) might suggest that the book isn’t for me.

Well, Boxer’s book (and the precision of his thinking) will absolutely make me a better English teacher; I suspect his approach will benefit almost any teacher.

Here’s why…

Surprise #2:

Longtime readers know my mantra: “don’t just do this thing; instead, think this way.”

That is: cognitive science research cannot provide us with a script (“do this thing”). Instead, that research CAN give us ways to think about memory and attention and motivation and stress. When we “think this way” about those topics, we’ll have better ideas about our teaching.

Well, Boxer’s book comes as close as any to effectively defying this mantra.

His book includes a GREAT MANY “do this thing” kind of instructions.

Phrase your question this way, not that way.

Present topics in this order, not that order.

Calculate cognitive load with this formula, not that formula.

You might think, given my mantra, I’d resist the specificity of his advice.

And yet, over and over, I found myself agreeing with his logic, and believing that I’ll do better classroom work if I understand and follow several of his scripts.

To my astonishment, I’m highly tempted to “do things Boxer’s way.” Why? Because he’s already done so much thinking for me.

Case in Point

I recently discussed Boxer’s book with a group of friends. All of us had highlighted this specific advice:

When introducing a concept, start with examples, not definitions.

Why?

Because definitions are necessarily abstract, and abstraction increases working memory load.

Examples, in contrast, live comfortably in the familiar, concrete world. This very familiarity and concreteness reduce WM load, and thereby make learning easier.

When my friends and I tried to apply this advice to our own teaching world, we immediately saw its usefulness.

The Spanish teacher said: don’t start with the abstract definition of the subjunctive; start with familiar examples in English.

The PD provider said: don’t start with abstract definitions of “declarative” and “procedural” memory; start with concrete classroom examples.

And so forth.

Two points merit notice here.

First: although Boxer writes about science instruction, his guidance applies widely across disciplines and age groups.

Second: although Boxer’s advice stems from (abstract) cognitive psychology, he frames it in (concrete) teaching suggestions.

That is: over and over, Boxer’s book practices what it preaches. His book does what he tells us teachers should do.

You perhaps have heard a conference speaker give passionate teaching advice (“never talk for more than ten minutes!”), only to defy this advice in his hour-long talk. Boxer carefully avoids such hypocrisy.

The Big One

A few of my opinions in this interdisciplinary field approach heresy. Here’s one:

In my view, cognitive load theory helps experts talk with other experts about working memory load in the classroom.

Paradoxically, however, cognitive load theory almost certainly overwhelms the working memory of non-experts. It is, after all, complicated and jargony. (Quick: define “element interactivity” and “germane load.”)

For that reason, cognitive load theory probably isn’t useful as a framework for discussing working memory load with teachers. (Several people whom I admire are howling as they read these paragraphs.)

Boxer does not articulate this heretical claim directly. However, he enacts its conclusion quite directly.

That is: he translates the abstractions of cognitive load theory into a concrete formula — a proportionality formula using words anyone can understand.

Rather than reproduce the mathematical version of the formula here, I’ll summarize it this way:

Task complexity and abstraction increase working memory load.

The student’s background knowledge and the teacher’s support reduce working memory load.

Therefore, to optimize working memory load, we should look out for those four variables and manage them appropriately. (He’s got CHAPTERS on each of those topics.)
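If it helps to see the shape of such a proportionality, here is a sketch of my own. To be clear: this is my rendering of the summary above, not Boxer’s actual notation, and even the choice to multiply (rather than, say, add) the variables is my assumption.

```latex
% My sketch, not Boxer's notation: the numerator gathers the two
% load-increasing variables; the denominator, the two load-reducing ones.
\[
\text{working memory load} \;\propto\;
  \frac{\text{task complexity} \times \text{abstraction}}
       {\text{background knowledge} \times \text{teacher support}}
\]
```

Read this way, the teacher’s levers are plain: shrink the numerator (simplify and concretize the task), or grow the denominator (build background knowledge and scaffold support).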

If you speak cognitive load theory, you see exactly how Boxer has translated its abstractions into this concrete formulation.

But — crucially — you don’t need to speak cognitive load theory to get its benefits.

Boxer, again, has taken his own advice. He has started with concrete examples rather than abstract definitions; he has thereby made it MUCH easier to learn from this book.

Always with the Limitations

Having raved for several hundred words, let me add a few quick notes of caution.

First: I don’t agree with absolutely everything Boxer writes. (I don’t agree with absolutely everything I write.) For instance: he emphatically champions mini-whiteboards; I don’t think they’ll work in my context.

Second: Boxer’s examples draw on science teaching, in high school, in England. All three of those particulars require some degree of translation as you apply his ideas to your own work.

The English education system thrives on mysterious acronyms; you’ll just have to figure them out. When the SLT talks with the NQT about Supply, well, I can’t help you there.

Third: Full disclosure, I should point out that Boxer’s publisher is also my publisher — so I might have a conflict of interest in writing such an enthusiastic review. I certainly don’t think this connection has skewed my perspective, but you should have that information to make your own decisions.

These few points aside, I return to my initial hearty recommendation.

When you read and study Boxer’s Teaching Secondary Science, you’ll get specific and wise guidance for applying the abstractions of cognitive science to your classroom.

You’ll enjoy it, and your students will learn more.

The Potential Benefits of Doodling and Chatting
Andrew Watson

This post will cover two topics simultaneously.

First, I’m going to describe recent research into the benefits (?) of doodling.

Second, I’m going to use a cool new artificial intelligence thingy to explore that research.

I found both fascinating; perhaps you will too.

The Story Begins

I don’t doodle. However, I know that some folks like to doodle; they say it helps them relax and focus. So, over the years, I’ve gotten increasingly zen about Team Doodle.

If doodlers don’t distract non-doodlers, I say: let the sketching begin!


Recently I stumbled across actual research about doodling. I’m always happy to update my prior (mis)conceptions, so I reviewed this study enthusiastically.

At the same time, I thought it would be a fun opportunity to try out a new artificial intelligence platform: chatPDF.

In theory, this online tool:

… reads the PDFs you drop into it, and

… answers your questions about them.

Now, my earlier experience with chatGPT was — to put it mildly — frustrating. As I wrote then, it “hallucinated” citations for its claims.

Honestly, our field has enough difficulties without HAL just making up sources.

However, I thought, maybe chatPDF’s narrower mission will make it more effective. Why not give it a try?

So: here’s what I learned about doodling in the classroom, and about chatPDF…

First Steps

Using chatPDF couldn’t be simpler. Open the platform, drop in a PDF, start asking questions.
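(A side note for the technically inclined: chatPDF also offers a REST API, so you could script this same workflow. Below is a minimal Python sketch. The endpoint paths, the x-api-key header, and the sourceId/content response fields reflect my reading of chatPDF’s API documentation; treat them as assumptions and check the current docs before relying on this.)

```python
# Minimal sketch: querying a study PDF via chatPDF's REST API.
# Endpoints, header name, and response fields are assumptions based on
# chatPDF's public API docs; verify against current documentation.
import requests

API_KEY = "sec_..."  # placeholder: your chatPDF API key

# 1. Upload the PDF of the study.
with open("doodling_study.pdf", "rb") as f:
    upload = requests.post(
        "https://api.chatpdf.com/v1/sources/add-file",
        headers={"x-api-key": API_KEY},
        files={"file": f},
    )
upload.raise_for_status()
source_id = upload.json()["sourceId"]  # assumed response field

# 2. Ask the same kind of question I asked in the browser.
reply = requests.post(
    "https://api.chatpdf.com/v1/chats/message",
    headers={"x-api-key": API_KEY},
    json={
        "sourceId": source_id,
        "messages": [
            {"role": "user",
             "content": "Does the study mention boundary conditions?"},
        ],
    },
)
reply.raise_for_status()
print(reply.json()["content"])  # assumed response field
```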

ChatPDF starts with a quick summary of the study:

Welcome to this informative PDF file about note-taking and doodling! In this article, you will learn about the surprising truth behind the effectiveness of doodling in reducing boredom and enhancing attention and retention of lecture material.

It also suggests a few questions you might want to ask.

What is the traditional view of doodling and fidgeting in educational contexts?
How many experiments were in this study and what was the total number of participants?
What were the findings of the study regarding the effectiveness of doodling in improving performance?

So, let’s start with the basics. I asked: “according to this study, what is the surprising truth about the effectiveness of doodling?” ChatPDF answered:

It does not appear to improve attention, retention of information, or reduce boredom or mind-wandering compared to other conditions such as note-taking or simply listening.

Wow!

In fact, in Experiment 1, participants who doodled did nominally worse on the memory assessment than those who solely listened. These findings contradict previous studies that have suggested that doodling can be an effective tool for enhancing attention and reducing boredom.

When I asked it about those contradictory studies, it responded:

Previous studies have suggested that doodling can be an effective tool for enhancing attention and reducing boredom. For example, a study by Andrade (2010) found that participants who doodled while listening to a phone message had better recall of the message content than those who did not doodle.

I followed up with some more specific questions: who were the participants? What exactly did they do in each experiment? How did the researchers measure “boredom” and “retention of information”? And so forth…

In every case, chatPDF provided prompt and clear answers. Helpfully, it also provided the page number on which it found the information — very useful for double checking!

Digging Deeper

So far, I have to say, I’m really impressed! When I asked the sorts of questions important to evaluating research, chatPDF found the answers and explained them clearly.

I started pushing a bit, asking the kinds of skeptical questions important in my work.

When I asked, “does the study mention boundary conditions?”, it responded:

Yes…The authors suggest that the lack of effect of doodling on attention and retention of information may be specific to certain types of tasks or situations.

For example, [the authors] note that previous studies have found that doodling can be effective in reducing anxiety during a stressful task (e.g., taking a test), and that it may be more effective for individuals who are more prone to mind-wandering or have difficulty focusing their attention.

A question about limitations got a list of 4 answers: everything from sample size to the fact that researchers measured retention right away — and so can’t make claims about long-term effects.

I tried to get chatPDF to speculate, but (happily) I didn’t succeed. For instance, when I asked it about the benefits of doodling for students with ADHD, it responded:

The study does not specifically address whether doodling would be helpful for students with ADHD….

Given that ADHD is characterized by difficulties with attention and focus, it is possible that some students with ADHD may benefit from doodling as a tool for improving their attention and retention of information.

However, it is important to note that the current study found no evidence that doodling improves attention or retention of information.

In brief, chatPDF offered a nuanced balance: given previous research, it’s possible that doodling would benefit students with ADHD, but this research doesn’t answer that question.

In fact, it doesn’t even ASK that question.

Changing Gears

Up to this point, I’ve been so focused on chatPDF that I haven’t responded strongly to the study itself.

If you had asked me before I read this study, I would have predicted that:

Doodling probably distracts most neurotypical students in most cases (and is therefore bad for learning), but

It just might improve focus for many students with ADHD or other learning differences (and is therefore good for learning in some cases).

Having spent a while scanning available research, I’ve updated my thinking.

As far as I can tell, there’s no obvious consensus on any point – especially because the question includes so many variables.

For instance, a few studies look at the difference between doodling-as-drawing and doodling-as-shading-in-drawings (!).

The best known study — the Andrade study mentioned by chatPDF — focuses on doodling while listening to brief phone messages. While that’s an interesting place to start, it hardly sounds like a substantial analog for classroom experience.

So, this most recent study adds to a complex field…but the field remains (frustratingly) complex!

TL;DR

Based on this research adventure, I tentatively conclude that…

… chatPDF could be REALLY USEFUL for teachers who want to explore the specifics of education and psychology research, and

… despite strong claims to the contrary, we don’t have a good enough research basis to draw conclusions about doodling.

For the time being, we — teachers and students — should rely on our best judgment.


Spencer Mueller, E. K. (2019). Note-Taking for the Win: Doodling Does not Reduce Boredom or Improve Retention of Lecture Material (Doctoral dissertation, University of Guelph).

Andrade, J. (2010). What does doodling do? Applied Cognitive Psychology: The Official Journal of the Society for Applied Research in Memory and Cognition, 24(1), 100-106.

Have I Been Spectacularly Wrong for Years? Part 1
Andrew Watson

Over the years, I’ve used this blog to make several persistent arguments.

One of those arguments features in almost every post I write: context always matters.

That is: research might suggest that a particular classroom strategy works well. However, teachers should always consider the fit between that research-informed suggestion and their own teaching world.

Recently I read an article suggesting that my advice — whether or not it’s true — is almost certainly useless.

In brief: this article argues that too much focus on context and nuance increases complexity, and thereby makes the advice unpersuasive and easy to ignore. Wow.

Naturally, I followed up with the article’s author: Dr. Morgan Polikoff. This transcript of our conversation has been edited for brevity and clarity.


Andrew Watson:

Thank you Dr. Polikoff for taking the time to talk with me this afternoon.

I’ve been following your work on Twitter for a while because I like the topics you write about, and I’m always interested in your point of view.

So I was surprised recently to read an article you wrote that I disagreed with quite strongly.

I do want to get to the disagreement part, but before we get there, I wonder if you could tell me what it is that you do and why you do it.

Morgan Polikoff:

I’m an associate professor of K-12 policy at the USC Rossier School of Education. My research primarily focuses on standards, assessment, and accountability policy, and on curriculum policy. That’s been my bread and butter over the last 13 years at USC.


I think of myself as a very public-facing academic. In addition to traditional publishing in peer-reviewed journals and getting grants, I’m really interested in bringing my research, and other people’s research, to bear on pressing policy topics — so that it actually has an impact on policy and practice.

That was the spirit in which I wrote the piece for Larry Ferlazzo’s blog.

Andrew Watson:

Your Education Week article includes many ideas that I agree with.

For instance, you start by saying:

“When communicating with the public, researchers should have one message and stick to it.”

That idea makes lots of sense to me.

However, the next two points that you make post launch our disagreement.

The first is: when communicating with the public, researchers should be more prescriptive in the advice that they give:

“Academics in general are loath to offer specific recommendations about what policymakers and practitioners should do… This is unfortunately a recipe for your work not to matter, because if you don’t offer specific, actionable guidance, someone else (probably someone less informed than you) will do it.”

And second, while researchers should provide some context and nuance for that research, there’s always the danger of providing too much context and nuance:

“Researchers tend to want to nuance and contextualize their work to an excessive degree. And, of course, it’s true that every child is different and every school is different, so no one intervention is going to work in every context. But the reality is that research is never going to be sufficiently contextualized for every individual setting and what works ‘on average’ is, in most cases, going to work in individual settings, too.”

Have I summarized that correctly? Do you want to say more about those two points?

Morgan Polikoff:

No, I think that’s right.

Andrew Watson:

Okay. I’ll tell you why those two points troubled me and then maybe we can figure out if we still disagree.

In the work that I do – explaining psychology and neuroscience research to teachers – I find that most people boil research down into highly prescriptive, almost script-like messages.

These scripts oversimplify the research and make it “one size fits all.”

So I am regularly encouraging people to be less prescriptive and to make allowance for context as much as possible.

Teachers shouldn’t listen to the script that the speaker is providing; instead, they should think about teaching the way researchers understand cognition, and then adapt that message to their own work.

Morgan Polikoff:

I think part of our disagreement is an audience difference.

You’re thinking primarily about teachers, I’m thinking primarily about policy makers at various levels from state leaders down to district folks.

And the second difference: this article was about impact.

How should researchers think about maximizing their impact on things that actually go on?

Now that said, I think that a lot of what I wrote about “being more prescriptive” does apply to working with teachers.

There are lots of instances in which I think we avoid being prescriptive because we think we’re not supposed to, or because we’re fetishizing local control, or because we think that the domain of the classroom is the teacher’s, or because we think that every child is utterly unique.

And some of those things I think are true, but I also think that if you really want to move practices, you have to move systems. And if you want to move systems, you have to tell people what to do a little more than I think we do now.

When I say “tell people what to do,” I don’t mean command them. I mean give them very specific advice about what you think is the best thing to do.

Andrew Watson:

Can you give me an example where prescription is beneficial, at least at the policy level?

Morgan Polikoff:

I gave the example on Twitter of high-quality instructional materials. I believe – and I will say this to any State Department of Ed person or any school district leader:

… that every teacher should have a high-quality core instructional material in every subject that they teach,

… having those materials should be a minimum expectation,

… for the most part, teachers should use those materials as the core of their instruction.

So that’s me being quite prescriptive.

But I think if I were to instead say something less prescriptive: like,

“Well, we think that curriculum materials on average are effective, but we really don’t have evidence on this, or evidence is stronger in math than it is in other subjects,”

That’s just a recipe for irrelevance. The purpose of the piece is about relevance.

Andrew Watson:

Just to play that out a little:

If it’s true that the evidence for high-quality instructional materials is better in math than in English (or foreign language teaching, or soccer coaching), I would hesitate to give prescriptive advice to anyone who isn’t teaching math because I don’t have evidence to support it.

Do you share that hesitation, or you’re all in on giving broadly prescriptive advice?

Morgan Polikoff:

I get what you’re saying.

I also think that the reality is that evidence in education is so weak that, if people listened to the advice you just gave, no one could really give advice about anything.

We just don’t have high quality causal studies with convincing evidence and lots of replication on virtually any topic. That’s certainly the case on curriculum, even in math, which is the place where we have the most evidence.

The best I can say, based on the highest quality evidence, is that I think that there’s convincing evidence that some curriculum materials are more effective than other curriculum materials.

So yeah, you’ve got to make a lot of leaps and you can appropriately caveat.

I think that you shouldn’t lie about what the strength of the evidence is. Nonetheless, people have to make decisions and they have to make decisions now. There are children in schools right now who don’t have curriculum materials, and one of the reasons for that is squishy people don’t want to tell school districts that they must provide children with quality curriculum materials.

And that is harming those children, I am certain of it – even though I don’t think that the convincing causal evidence is there.

Andrew Watson:

So let’s consider an example that you and I discussed in our email exchange.

In cognitive science world, people have insisted on something called the 10-minute rule: “people can’t pay attention to something they’re not intrinsically interested in for more than 10 minutes.”

So, if you’re a classroom teacher, you should design your lesson plan in 10-minute chunks.

Sadly, when we look at the research behind the “10-minute rule,” it’s comically weak. [link]

Would I be applying your way of thinking correctly were I just to say,

“Well, the 10-minute rule doesn’t have great evidence, but it might be the only attention rule we’ve got. So I’m going to proclaim that rule loud and strong. My claim isn’t a hundred percent true, but it’s better for teachers to plan in 10-minute chunks than 30-minute chunks.”

Morgan Polikoff:

Well, you’re the expert on that topic and I’m not, so I can’t evaluate the evidence. If you think that the evidence doesn’t support the 10-minute rule, then I think that you should say loud and clear that the evidence doesn’t support that: the teacher should not do that.

On the other hand, some people, I guess, think that the evidence is good, and I think they should feel free to … they’re trying to have impact by saying that.

The reality is: people who are wrong can use these strategies, just like people who are right can use these strategies, and there’s not that much I can do about that.

And I think lots of times, people who are wrong or people who are trying to sell you something actually use a strategy like this because they recognize that it’s effective.

On the other hand, people who might understand the literature and the evidence more are reluctant to use this kind of direct language.

Therefore, in some ways, this reluctance facilitates the people who are wrong and their negative impact on what goes on in classrooms.

Andrew Watson:

To contradict a false claim, should I say, “they’re wrong!” or should I provide my evidence for what’s right?

Morgan Polikoff:

I’m not an expert on how to refute wrongness.

My understanding of a refutation approach is: you say what the misconception is, you provide direct evidence that it’s wrong, and then you tell people what they should do instead.

I think being direct is the way to do that; not saying, “Well, it’s right under some circumstances, but it’s not right under others.”

No, you need to be equally as direct as the people who are giving teachers bad advice — advice that you say is bad.

Andrew Watson:

Which is complicated if the correct advice is, in fact, highly nuanced and context dependent.

Morgan Polikoff:

If the advice you think is correct really is very complicated and you need to give different advice to every individual teacher or in every individual setting, then yes, I think that’s going to be a hard message to get across. That’s just the reality of the situation.

If the message gets so complicated that you can’t explain it in a few sentences, or a one page concrete piece of guidance that you can give to a teacher and say, “Here’s what I think you should do” – then I think it’s going to be hard to get them to do what you want them to do, what you think is good.

Andrew Watson:

That’s dismaying but intriguing, isn’t it?

Morgan Polikoff:

Listen, it’s harder to get people to do complicated things than it is to get them to do simple things, and it’s harder to get people to understand complicated things than it is to get them to understand simple things. Those seem like obvious statements.

Andrew Watson:

To some degree, those obvious statements create a challenge in this sort of work that I do, because the truth is:

… second graders aren’t 12th graders, and

… math teaching isn’t history teaching, and isn’t soccer coaching and

… the cultural context in Massachusetts isn’t the cultural context in Reykjavík, isn’t the cultural context in Korea.

So it’s just true that the specifics of teaching advice will be highly context dependent for all of those things, and many others.

Morgan Polikoff:

I don’t know if I agree with you.

I definitely agree with you, for instance, that kids develop, and so second graders are different from high schoolers.

But as for cultural differences, I don’t talk about Iceland or Korea; I talk only about the United States. The more relevant question is: are kids in Massachusetts that different from kids in Vermont or that different from kids in Nebraska?

There are certainly cultural differences. Obviously there are racial and ethnic differences, there might be language differences.

But I strongly suspect — and I think the literature is pretty clear on this — that much more of the variation in those kinds of things is within schools than between schools.

And so the advice that you give about how to teach, I don’t think it’s going to be fundamentally different. Math or reading in Massachusetts is not fundamentally different from math or reading in Nebraska or Texas — at least in terms of the cultural dimensions you’re talking about.

Andrew Watson:

I think within the States, the cultural differences would be more — for instance — teachers at a Montessori school and teachers at a Catholic school and teachers at a military academy.

Although the surrounding culture itself is substantially the same, these school cultures invite different ways of thinking about teaching and learning, and about the relationships between teachers and learners — differences that resemble, if they don’t equal, cross-national differences like those between Massachusetts and Iceland.

Morgan Polikoff:

I’m not sure how important that point is, but that makes sense.

Andrew Watson:

Okay. I guess to wrap things up: is there anything that you haven’t had a chance to say that you’d like to say?

Morgan Polikoff:

Sure.

Both in this conversation and also in the writing, I’m doing what I advise people to do: oversimplifying things.

And also, I’m pushing back at you to be provocative. That’s intentional. That’s a strategy that I’m using.

I think that provocation is useful to people. Oftentimes people are uncomfortable or unwilling to say things that are even as provocative as this discussion…and this is frankly pretty banal.

I’m playing up disagreement because I think that doing so is provocative and engaging to people.

Andrew Watson:

Well, it’s certainly been engaging to me – and I suspect it will be to our readers as well. Thank you for taking the time to chat today.


This post’s title asks if I’ve been wrong for years. I don’t (yet) have an answer to that question. But, here are my quick take-aways from this conversation:

A: Dr. Polikoff and I definitely disagree!

B: I don’t think I changed his mind.

C: I’ll let you know in a later post whether or not he has changed mine…

Is Teaching Golf Like Teaching Algebra?
Andrew Watson

My work in this field starts with a simple logical argument:

A: Learning happens in the brain and the mind.

B: Therefore, teachers might benefit from knowing more about the brain and the mind.

C: Therefore, we should hang out with people who study brains (neuroscientists) and who study minds (psychologists). We can learn from them, and they can learn from us.

So far, so good.

That seemingly simple logic, however, gets complicated quickly.

First — as I argue frequently — we benefit MUCH MORE from studying psychology than neuroscience.

Second — again, a refrain here on the blog — we need always to remember context and nuance.

For example:

Teaching 1st graders might require different skills and techniques than teaching 8th graders, or college students.

Sometimes neurotypical students benefit from different teaching strategies than non-neurotypical students.

Cultural differences shape classroom expectations, and might thus require or forbid various teaching strategies.

In other words, my simple idea — “improve my teaching by learning brainy stuff!!” — quickly requires all sorts of subtleties.

In fact, I’ve just stumbled across a new one. Let me try to explain.

To The Classroom, and Beyond

As an English teacher, I live in a wordy world.

We study poems and write essays and read Zora Neale Hurston and revel in grammar. (Well, I revel. My students graciously put up with me.)

As the Prince of Denmark once said: “Words, words, words.”

To teach these English-y topics, I’ve got lots of strategies:

Retrieval practice: “What’s the difference between a direct object and a predicate nominative…Sylvia?”

Managing alertness: “Alistair and Yazmeen, please write your answers on the board.”

Working memory load reduction: “What’s our acronym for the 4 key verbs?”

Research suggests all this wordiness will help my students learn.

HOWEVER, not everything that students learn boils down to words.

Yes, SOME knowledge is “declarative”: I can say it out loud.

Yet OTHER knowledge is “procedural”: something I do, not something I can say.

Imagine, then, that I’m teaching someone how to play golf. As they practice, should I use those same teaching strategies? Will my players benefit from translating their physical activity into words?

For instance:

Retrieval practice: “Describe the best stance for a putt…Sylvia.”

Working memory load reduction: “What’s our acronym for the ideal golf swing?”

Will words, words, words help golfers?

Plot Twist

Just this last week I’ve started finding research raising intriguing doubts.

The research suggests:

Some kinds of knowledge aren’t really verbal: say, for example, a golf swing.

Asking students to put not-verbal knowledge into words as they learn actually gets in the way of learning.

In other words, if I ask a golfer to describe her swing while learning, I’m asking her to cram procedural knowledge into declarative form.


That translation — put “not words” into “words” — makes learning harder.

I’ve been using golf as an example because the studies I’ve found focus on golf skills.

In this study, novice golfers learned less when asked to describe their golf strokes.

In this study, expert golfers improved less under similar circumstances.

But off the top of my head, I can think of all sorts of school topics that might (MIGHT!!) fit this category:

Pottery and painting and dancing

Handwriting

Manipulating microscopes or pipettes or other science-y tools

Shop

If the golf research applies to these procedural skills, then many of my word-based teaching strategies need a substantial rethink.

Not So Fast

In this highly speculative post, I should rush to include several cautions:

First: I haven’t yet found any research applying this idea to the school subjects I’ve mentioned. I’m extrapolating — always a perilous thing to do. (Most of the research, in fact, focuses on facial recognition.)

Second: this line of reasoning might lure some folks into “learning-styles” flavored teaching theories. Beware that siren song!

Third: I might be overstating the changes that flow from this possible conclusion. For example, my pottery students should still do retrieval practice — but they should respond to questions by showing me rather than telling me the answers.

As you can tell, I’m still working out these ideas in my head. If you have insights — or research suggestions — I hope you’ll share them with me.


By the way: this research topic is called the “verbal overshadowing effect.” That is: when I translate procedural knowledge into declarative terms, the mistranslation into words (“verbal”) overshadows the actual content knowledge — which is at its root procedural.


Flegal, K. E., & Anderson, M. C. (2008). Overthinking skilled motor performance: Or why those who teach can’t do. Psychonomic Bulletin & Review, 15, 927-932.

Chauvel, G., Maquestiaux, F., Ruthruff, E., Didierjean, A., & Hartley, A. A. (2013). Novice motor performance: Better not to verbalize. Psychonomic Bulletin & Review, 20, 177-183.

“You Can Find Research that Proves Anything”
Andrew Watson

Sometimes teachers hear about research that SUPPORTS our current beliefs and teaching practice.

Honestly, that experience feels great. “Look,” says my interval voice, “I’ve been doing it right all along.”

And sometimes, we hear about research that CONTRADICTS our beliefs and practice.

Honestly, that’s a punch to the gut. “Wait,” says that voice, “I’ve been wrong all this time?”

This discomfort often prompts us to use a handy rejoinder: “well, you can find research that proves anything…”

The not so subtle implication: “yes, this research says that — but this research doesn’t really matter because even absurd positions can find research backing…”

So, what should we do when a colleague rejects our research-based advice with this claim? Or, what should we do when we find ourselves saying it?

Step #0: Let’s Check

In the first place, I’m honestly not so sure that we can find research that says anything.

Let’s take a common piece of teaching advice: “teachers should shake hands with their students as they enter the classroom.”

Can we find research supporting, or contradicting, this claim? If “we can find research that says anything,” we certainly should be able to.


Well, so far I haven’t found any research examining this question.

As I’ve written before, Dr. Clay Cook found that “positive greetings” at the door produced specific benefits for specific students.

But his research doesn’t remotely suggest that all teachers should “shake-in” at all times. (For one thing: “positive greetings” don’t have to be handshakes.)

I just asked Elicit.org this handshake-at-the-door question. The closest answer I found is a study showing that female professors get higher ratings on the first day of class when they shake in, whereas male professors get lower ratings.

But again: that study neither confirms nor contradicts the larger claim about daily handshakes.

It seems that we can’t always “find research that proves anything.”

Step #1: Start Reading

But let’s agree that we can find research supporting lots of strange conclusions — or, at least, conclusions I disagree with. What should we do when that happens?

Imagine that a friend tells me: “chewing gum increases learning.”

When I ask him if he’s found research supporting that position, he grins broadly and says: “check this out.”

So, let’s check it out!

A cursory glance suggests that — yes — my friend has found research supporting his position, but it’s not terribly persuasive research.

It includes exactly 16 participants.

It’s published in a journal that focuses on engineering (not, say, memory, or learning).

Its method of measuring attention is … well … HIGHLY unscientific.

In other words, my friend found research supporting this claim; however, I didn’t need to look very hard to find reasons to doubt it.

That is: it doesn’t really matter if I can find research that “proves anything.” What matters is if I can find GOOD research supporting a particular claim.

Step #2: Get Curious

But, is GOOD research enough? If I find one well-done study, should I accept that chewing gum does promote memory?

When I got started in this field, I noticed that the scholars I admired most shared a surprising intellectual habit:

They shift to curiosity.

That is:

Person 1 says: “research shows that chewing gum improves learning!”

Person 2 says: “nope; research shows that chewing gum has no effect at all on learning.”

At this point, person 1 might say: “well, you can find research that shows anything. You’re obviously wrong. My research is correct.”

Or, person 1 might say: “wow, I’m curious that we have seen research that arrives at contradictory conclusions! Let’s explore…”

Over the years, I’ve come to rely on two sources when I feel curious and want to explore.

Scite.ai asks how often a particular study has been cited overall; how many times its findings have been confirmed; and how many times its findings have been contradicted.

Connectedpapers.com looks at the most frequently cited papers related to the topic, and creates a cool spiderweb diagram to show their connections.

Using these websites, person 1 and person 2 can plug in their studies, and see how many OTHER studies arrive at their conclusions.

That is: rather than rely on just ONE study, we can look at a WHOLE GROUP of studies to reach our conclusion.

When I use these tools to explore the chewing gum claim, as I’ve written before, I arrive at several conclusions:

First: researchers have done a surprising amount of work on this topic. (It seems like SUCH a niche-y question that I’m surprised folks have investigated it substantively.)

Second: even the quality research in this field (i.e., more than 16 participants) arrives at contradictory results.

This overview, noting that we can find clear evidence of both benefits and detriments, concludes that “the robustness of reported effects of gum chewing on cognition has to be questioned.”

So, at this point I don’t think we can claim we have a decisive, research-informed answer to this question.

In other words: the question is not “can we find research that proves anything?”; or even “can we find GOOD research that points in a clear direction?”; but “can we find SEVERAL studies all pointing in a clear direction — and more studies pointing this way than that way?”

Only if the answer to that last question is “yes” should we teachers start changing our practice because “research says so.”

TL;DR

Can we really find research that supports any claim about education?

First: no.

Second: we don’t want research, we want good research.

Third: we don’t just want good research, we want several good studies pointing roughly toward the same conclusion.

Until we have met these criteria, we can’t really say that a particular claim merits our attention and respect.


Cook, C. R., Fiat, A., Larson, M., Daikos, C., Slemrod, T., Holland, E. A., … & Renshaw, T. (2018). Positive greetings at the door: Evaluation of a low-cost, high-yield proactive classroom management strategy. Journal of Positive Behavior Interventions, 20(3), 149-159.

Wilson, J., Stadler, J., Schwartz, B., & Goff, D. (2009). Touching your students: The impact of a handshake on the first day of class. Journal of the Scholarship of Teaching and Learning, 108-117.

Read This Post with Your Right Brain First…
Andrew Watson

My Twitter feed is suddenly awash with one of those “how does your brain work?” tests. (I should say, “tests.”)

If you look at the picture and see an angel, you’re right-brained.

If you see a helicopter, you’re left-brained.

This “test” has several important flaws.

Flaw #1: it’s not a helicopter or an angel — it’s obviously a dog.

Flaw #2: left-brain/right-brain is one of those zombie myths that just keeps coming back, no matter how many times we kill it.

Of all the myths in this field, this one puzzles me the most. Let me try to unpack my confusion.

Not True: The Brain

At the most basic level, this brain myth lacks any meaningful basis in neurobiological truth. In the world of theories about the brain, that’s a big flaw.

We can’t in any meaningful way find people who “use more of the right brain,” or “rely on left-brain thinking.”

If you’d like a detailed explanation of the wrongness here, I recommend Urban Myths about Learning and Education by de Bruyckere, Kirschner, and Hulshof.

If you’d rather click a link, check out this study. In the mild language of research, it concludes:

Our data are not consistent with a whole-brain phenotype of greater “left-brained” or greater “right-brained” network strength across individuals.

Translation: “people and brains just don’t operate that way. No seriously. They just don’t.”

Yes, yes: a few mental functions typically take place more on one side than another.


Back in grad school, we learned that 95% of right-handed people rely more on the left side of the brain for some reading functions. But 95% =/= 100%. And [checks notes] left-handed people do exist.

In any case, this finding doesn’t support the LB/RB claim — which is that some people rely more on these synapses, and others rely on those synapses.

Honestly: at the basic level of “how we use our brains,” we’re all “whole brained.” *

Not True: The Mind

Okay, so maybe the LB/RB claim isn’t exactly about “the brain” and more about “the mind.”

That is: some folks are more analytical (“left-brained”) and others are more creative (“right-brained”).

This version of the myth doesn’t use the word “brain” literally. (“Who knows precisely where those mental functions happen in the brain? We were just joshing, kind of poetically.”)

It simply argues that people think differently — and we can tidily divide them into two groups.

In other words, this version simply repeats the “learning styles” argument. These theories say we can divide students into distinct groups (visual/auditory/kinesthetic; or, creative/analytical; or, happy/grumpy/sleepy) and then teach them differently.

Of course, the LB/RB version of “learning styles” is no truer than the other versions; they all lack solid evidence to support them.

The Myers-Briggs Type Indicator sort of claims to measure this distinction (“thinking vs. feeling”). But here again, we just don’t have good evidence supporting this test. **

So, whether we’re talking about neuroscience or psychology, LB/RB ain’t true.

Beyond “True”

One of my favorite quotations is attributed to George Box:

All models are false; some models are useful.

In other words: psychologists can offer a good model for how — say — working memory works. That model is “useful” because it helps us teach better.

However, that model is a model. The staggering complexities of working memory itself defy reduction into a model.

So, maybe LB/RB isn’t true, but is useful?

Honestly, I just don’t see how it could be useful.

If the model were true (it’s not) and I could divide my students into left and right brained groups (I can’t), what would I then do differently?

Just maybe I could devise a “creative” lesson plan for one group and an “analytical” lesson plan for the other. (I’m not sure how, but I’m trying to make this work.)

Yet: doing so would be an enormous waste of time.

Neither group would learn any more than they would with the same lesson plan. And all that time I dumped into my dual planning can’t be used to create an effective lesson plan.

That sound you hear is George Box weeping.

TL;DR

Left-brain/right-brain claims are NEITHER true NOR useful.

Do not take teaching advice from people who make them.


* Yes, it’s true, some people have only one hemisphere. But that’s really rare, and not at all what the LB/RB myth rests upon.

** Some time ago, I tried quite earnestly to find evidence supporting the MBTI. To do so, I emailed the company that produces it asking for published research. They did not send me any research; they did, however, sign me up for their emails.


Nielsen, J. A., Zielinski, B. A., Ferguson, M. A., Lainhart, J. E., & Anderson, J. S. (2013). An evaluation of the left-brain vs. right-brain hypothesis with resting state functional connectivity magnetic resonance imaging. PLoS ONE, 8(8), e71275.

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105-119.

You Should Not (or Should) Let Your Students Take Pictures of Slides
Andrew Watson

Back in October, I wrote a blog post about a surprise: it turns out that students REMEMBER STUFF BETTER when they take photos of lecture slides.

For several reasons — including common sense — I would have predicted the opposite. In fact, so did the researchers (led by Dr. Annie Ditta) who arrived at this conclusion.

But when Team Ditta ran their study and crunched their numbers, they found that slide photos improved students’ recall.


Having written that pro-photo blog post, I was genuinely alarmed to see a tweet from Prof. Dan Willingham — one of the greats in this field. He describes taking photos as “a terrible way to take notes.”

And Dr. Willingham should know. He’s just written a book focusing on study strategies — including note-taking.

What’s going on here? Have I given you terrible advice?

It turns out: Professor Willingham’s advice derives from this study, published in 2021 by Wong and Lim.

My blog post came from the Ditta study, published in 2022.

How do we explain — and choose between — studies that ask the same question and arrive at entirely different answers?

Untangling the Knot

Step 1: don’t panic.

It might seem that contradictory results explode the field of psychology. If THIS study shows “yes” and THAT study shows “no,” then the whole enterprise looks foolish and broken.

But here’s the thing:

Psychology is complicated.

Teaching and learning are complicated.

PEOPLE are complicated.

When psychology researchers study people who are teaching and learning, they’re studying FANTASTICALLY complicated topics.

For that reason, psychology researchers regularly produce contradictory results. That’s just how they roll.

And for that reason, no one study answers a question for good. To quote Dr. Willingham once again: “One study is just one study, folks.”

We should look not for one study to answer a question definitively, but for clusters of studies to point in a consistent direction.

If 10 studies show YES, and 2 studies show NO, and 2 more show CONFUSION — well then, “yes” strikes me as a plausible conclusion. (At least for now.)

Start Here

How can we know if most researchers have arrived at Wong’s 2021 conclusion (“photos = bad”) or at Ditta’s 2022 conclusion (“photos = good”)?

Step 2: Get curious.

Replace advocacy (“I know for sure that photos are good/bad!”) with curiosity (“I wonder what I’ll find? This should be fun…”)

For my curiosity projects, I rely on three websites: scite.ai, connectedpapers.com, and elicit.org. *

They all have different approaches and yield different kinds of results. And, they all help answer the question: “do we yet have a cluster of studies that mostly point to the same conclusion?”

So, what did I find when I asked those resources about the Wong (“photos = bad”) study?

When I looked on connectedpapers.com … it identified exactly ZERO other studies that asked questions about taking photos of lecture slides.

When I asked elicit.org a question on the topic … it came up with nothing.

Scite.ai did identify one other study responding to Wong. Sure enough, it’s the Ditta study: “photos = good.”

So, unless I’m missing something, we just don’t have much research on this topic. We can’t know where a “cluster of studies” might point because we don’t have anything remotely like a cluster.

Getting Specific

We’ve got at least one more research avenue to pursue:

Step 3: explore the boundaries.

Let’s imagine for a minute that Wong did her study with 3rd graders, and found that photos = bad; and (still imagining), Ditta did her study with college students, and found that photos = good.

In that case, we could reasonably imagine that they got different results because they studied participants in different grades.

Or (more imagining) maybe Wong studied photos of slides during a music class, and Ditta studied photos during an art history class.

Here again we could make a reasonable guess: slide photos will help in some disciplines (art!) but not others (music).

Researchers call these “boundary conditions”: as in, “this finding applies to people within these boundaries, but not outside them.”

Potential examples: a conclusion applies to …

… math class but not history class, or

… a Montessori school but not a military academy, or

… for dyslexic students, but not for neurotypical readers, or

… in Icelandic culture, but not Brazilian culture.

You get the idea.

When we look at Wong’s and Ditta’s studies, however, we find they’re very similar. Adults watch short-ish videos, and do (or don’t) take photos or notes.

The studies differ slightly — Wong looks at mind wandering as an important variable, for instance — but not enough to draw strong conclusions.

At this point, neither our online resources nor our exploration of boundary conditions gives us any reason to prefer one study to the other.

End at the Beginning

No matter how the journey goes up to this point, we always end with …

Step 4: Look to your experience, and your colleagues.

In other words: we teachers should be curious (step 2) and informed (step 3). And, we always ultimately rely on our own judgement.

In this case — in my view — we simply don’t have a good research consensus to push us strongly one way or another. So, relying on my experience, here’s the policy I would follow with my 10th grade English students:

You may take pictures of photos or complex diagrams — anything that would be hard to put into words.

However, if you can put the material into words, I’m going to ask you to do so.

Why?

Because the more time you spend processing the information, the likelier it is you will understand and remember it.

This policy would, of course, have nuances and exceptions. (E.g.: dysgraphic students shouldn’t have to write as much.)

I want to emphasize, however, that your policy needn’t resemble my policy.

If you teach different kinds of students, or teach in a photo-friendly discipline (art history!), or if your experience tells you something else…you should follow your own wisdom.

TL;DR

Should students take photos of slides as a way to remember the material?

At present, we have so little research on the topic that it simply can’t answer that question — ESPECIALLY because the few studies we do have contradict one another.

Instead, we should rely on our research-informed judgement.


* As I’ve written elsewhere, I would not use ChatGPT for this kind of inquiry. In my first forays into that world, the website simply MADE UP citations. Ugh.


Ditta, A. S., Soares, J. S., & Storm, B. C. (2022). What happens to memory for lecture content when students take photos of the lecture slides?. Journal of Applied Research in Memory and Cognition.

Wong, S. S. H., & Lim, S. W. H. (2021). Take notes, not photos: Mind-wandering mediates the impact of note-taking strategies on video-recorded lecture learning performance. Journal of Experimental Psychology: Applied.

Beware the Experts: The Danger of Popular Science Writing
Andrew Watson

Here’s a little expert advice on nutrition:

Michael Phelps — the most decorated Olympic athlete in any sport ever — obviously had to take EXCELLENT care of his body. He thought A LOT about fitness and nutrition.

While he was training for the Olympics, he ate roughly 10,000 calories a day.

So: if I want to attain peak fitness, I too should eat 10,000 calories a day.

If it’s good enough for Olympic medal winners, it’s good enough for me.

Wait a minute. [insert sound of record scratch]

That’s terrible advice.

10,000 calories per day might have been a good idea for Phelps. However — physically speaking — he and I have very little in common.

During his Olympic career, Phelps was in his teens and 20s.  I’m 57.

He was in peak human physical condition. I am — well — in very average physical condition.

He (I assume) undertook ferociously vigorous physical exercise — and burned calories — most of the day. I spend much of my day sitting here writing blog posts.

Basing my nutritional plan on Phelps’s example just makes no sense.

Simply put: stories of extreme human performance fascinate us. Alas, they rarely produce useful models for everyday life — or for teaching.

Danger, Will Robinson

That last paragraph, sadly, creates real problems for popular science writers.

In my experience, their formula goes something like this:

“Here’s a fascinating story about something EXTRAORDINARY that happened.

Now that I’ve got your attention, notice this AMAZING X FACTOR in my story.

Here’s some wonky research roughly related to Amazing X.

You should enact Amazing X in your life, too.”

Whether the extraordinary story focuses on burning planes or impossible inventions or heroic feats, those stories — we’re asked to believe — all have something to tell us about improving our lives.


But if it’s true, as I wrote above, that “stories of extreme human performance rarely produce useful models for teaching,” then the narrative structure above invites — heck, demands — our skepticism.

Amazing X might benefit extraordinary folks in outlier conditions. But, by definition, few of us teach in outlier conditions. Amazing X just won’t help us much. It might, in fact, be a very bad idea in our classrooms. (10,000 calories, anyone?)

Don’t Start Here

You have, perhaps, heard the story of the Mann Gulch Fire. (If not, you should check it out. It’s an AMAZING story.)

In brief:

Back in 1949, a group of trained “smoke jumpers” battled a wildfire that was burning toward the Missouri River. The fire abruptly turned towards them, and they realized they were trapped … and likely doomed.

In an instant, the group’s leader — “Wag” Dodge — came up with an astonishing solution. He set his own fire, and then stepped into its “shadow”: the area that his fire had burned clear. The wildfire burned around him — but not over the area that his fire had scorched.

Sadly, none of his men followed him into the shadow. Two other men outran the fire; most died.

This story appears in more than one book I know. The message: we want our students to think the way Dodge thought. We want them to be creative thinkers, who can come up with novel solutions to important problems.

I agree with those goals. I want my students to be able to think for themselves, and think past the knowledge that I have.

However: Dodge’s example tells us exactly nothing about helping students develop that capacity.

Dodge was a highly experienced smoke jumper. And his life was in immediate danger.

Our students are not highly experienced in the topic we’re teaching them. (If they were, we wouldn’t need to be teaching them.) And — except in very rare circumstances — they don’t face immediate peril.

Dodge’s thought process, in other words, has almost nothing to do with our students’ thinking. Until they know as much as Dodge knew, and have roughly as much experience as he had, we should have no expectation that they can “think the way he thought.”

We shouldn’t use his example to inform our work — even if it’s a great story.

Familiar Problems

Another example, from another popular science book:

Dr. K reads X-rays for a living. He found that he got bored and tired as the day progressed. He worried — reasonably enough — that he was getting sloppy as a result.

So, he installed a “walking desk” in his office. He walked at a moderate pace as he read the X-rays, and felt much more alert and perceptive.

Dr. K wondered: does this technique benefit others?

He ran a study, and — sure enough!! — found that Doctors Who Walked spotted suspicious masses more often than Doctors Who Sat.

Clearly, walking is good for thinking. Therefore, teachers should have students walk as they learn.

Please insert a second [record scratch] here.

Once again: a great story about experts doesn’t meaningfully apply to the work we do in schools.

Doctors who read X-rays are highly trained experts. They’ve been in school for roughly two decades.

And: reading X-rays is a perception task.

If walking helps highly trained experts stay alert enough to perceive patterns better, we can ask if walking helps students learn better.

But both the people involved (experts vs. novices) and the cognitive task (perceiving established patterns vs. learning new patterns) are meaningfully different.

We really need research looking at this question directly before we make strong recommendations.

Based on the research I know — and my experience as a classroom teacher:

Yes: exercise is good for the body, and good for the brain.

Yes: physical activity before learning provides lots of benefits. (Link)

No: physical activity during learning hasn’t been studied much. (Link)

And: based on my classroom experience, walking my students around outside while trying to discuss Macbeth with them seems like a deeply bad idea.*

Dr. K’s treadmill might help him and his colleagues; I don’t think it does much of anything for teachers and students.

TL;DR

When reading popular science books that include teaching advice, be aware:

The stories about extraordinary people doing extraordinary things fascinate and compel us.

However:

Before we make changes to our teaching practice, we should see research that looks at students like ours studying a topic like ours.

If we don’t, we’ll end up doing the teaching equivalent of eating 10,000 calories a day.


* Yes, of course, if students are studying something that is in fact outside, it makes sense to go outside and look at it.

For instance: when I taught Where the Crawdads Sing — a book that relies heavily on the symbolism of marshes and swamps — I took my class out to see the marshes on school property.

I’m not saying: never take students for a walk. I am saying: do so with a very specific pedagogical purpose in mind.

Think, Pair, Share: Does It Help? If Yes, Why?
Andrew Watson
Andrew Watson

On some days, I find myself drawn to esoteric research studies.

A few months ago, for example, I wrote about the effect of earworms on sleep. (Yes, scholars really do research earworms.)

Today, I’ve found as straightforwardly practical a study as I’ve seen.

Teachers everywhere have been encouraged to have students “think, pair, and share.”

That is:

I ask my students a question: “what are the metaphors in this poem?”

We all pause for several seconds, so that students can think about their individual answers.

Students then pair up: “okay, everyone, chat with the person next to you about your answers.”

Finally, I ask students to share their thoughts: “who has spotted a metaphor they want to discuss?”

Voila: they thought, they paired up, they shared.

The Bigger Picture

In truth, LOTS of classroom strategies have such popular currency that we don’t really think to question them.

Is it a good idea to have students write answers on the board? (I’ve never thought to ask; after all, ALL teachers have students write on the board.)

Should I really do those handshake dances at the door? (My colleagues LOVE a good handshake dance.)

College professors everywhere are banning laptops, because handwritten notes are superior. (Aren’t they? Like, obviously?)

In other words, we don’t need to have a research basis for absolutely everything we do in the classroom.

At the same time, when a scholar does explore my classroom practice in a research-y way, I ought to be at least a little curious about the results. (If you clicked on those links above, you’ll notice that our teacherly instincts might be wrong…)

So, what happens when researchers turn to “Think, Pair, Share”?

Will our beloved habit get the research seal of approval? Or, do we need to rethink this standard practice…?

Worth Checking

Researchers in Erfurt, Germany — led by Lukas Mundelsee — undertook a straightforward study with 9th graders.

Researchers introduced students to a topic, and then asked questions.

In some cases, they just asked students to raise their hands (SHARE only).

In some cases, students THOUGHT individually, and then SHARED.

And, of course, they sometimes THOUGHT and PAIRED and SHARED.

Finally, the researchers measured other useful variables — for instance, how shy do students report themselves to be?

So, what do you think Team Mundelsee found?

Sure enough, “think, pair, share” led to more handraising than “share” alone.

And, in particular, this strategy helped students who reported higher levels of shyness.

In other words: researchers got the result that (I suspect) most of us predicted. And, the underlying explanation makes sense.

If I’m shy, I don’t want to get the answer wrong in public. But if I can pair up to discuss my answer first, then I’m less worried about my potential wrongness.

Beyond the Research

Since “sharing” is under discussion here, I’ll share my own approach to “think, pair, share.”

When it comes to the final step — “who wants to share with the group” — I myself cold call.

That is, I don’t ask for someone to volunteer; I call on a student at random.

Now, I should be clear:

First: Mundelsee’s research does NOT investigate this approach. He’s looking at voluntary sharing.

Second: “cold-calling” does generate some controversy. Some folks consider it stress-inducing, even (brace yourself) ‘carceral.’

Now, I don’t doubt that cold-calling can be done badly. (If pizza can be bad, anything can be bad.)

But I explain my thinking to my students at the beginning of the year, and they seem to get in the groove fairly easily.

In this case, I worry that students need a little incentive to think. After all, if a student knows the pair step is coming up, s/he can simply borrow the other student’s idea as a share.

ESPECIALLY because students have time to test-drive their ideas when they pair, I think cold-calling should be low stakes enough for them to feel plenty comfortable sharing.

Of course, my classroom climate might not match yours; teachers always adapt and balance to get their teaching strategies just right.

TL;DR

Yes, “think, pair, share” helps students feel comfortable sharing.

And, yes, it does so by reducing anxiety.


Hat tip to Alex Quigley, who first pointed out this study to me.


Mundelsee, L., & Jurkowski, S. (2021). Think and pair before share: Effects of collaboration on students’ in-class participation. Learning and Individual Differences, 88, 102015.

Cultural Field Trips: Do They Really Enhance SEL?
Andrew Watson
Andrew Watson

Here at Learning and the Brain, we like research-informed teaching suggestions.

At the same time, we remember Prof. Dan Willingham’s timeless motto: “one study is just one study, folks.”

That is: one study might show a particular conclusion – but one study isn’t entirely persuasive.

Instead, we’d like SEVERAL studies looking at roughly the same question. If those studies all point the same direction, then we can feel increasingly confident that this conclusion has merit.

For instance:

Several years ago I blogged about an impressive study by Dr. Jay Greene. It suggested …

… that attendance at live theater improved students’ knowledge of the play (not a big surprise)

… and, it improved students’ tolerance (surprise!)

… and, it enhanced their desire to see the world from another’s perspective.

Because I REALLY WANT those statements to be true, I looked at this research with extra skepticism. But I was ultimately persuaded that the study design made lots of sense.

Of course, “one study is just one study.”

It would be GREAT to see more research on this topic. Well, I have good news…

Take Two

I’ve recently come across a second study looking at the benefits of live theater — this one led by Angela Watson (no relation that I know of).*

As is often the case, this research team looked at some of the same questions, and some additional questions as well.

So, they asked:

Does live theater provide benefit for students? How about art museums? How about symphonies?

Do these cultural field trips benefit students in different demographic groups?

The Greene study looked at tolerance and “social perspective taking” – the desire to see the world from another’s perspective. Watson’s research also asked questions about conscientiousness and empathy.

So, what did they find?

Methods and Conclusions

As you can imagine, answering these questions requires lots of logistical work.

We can study – say – “retrieval practice” fairly easily in a classroom. But, schlepping large groups of students to plays, museums, and symphonies takes a lot more coordination, planning, and cooperation.

Basically, these researchers had students attend different numbers of plays, symphonies, and museums: as few as one trip, or as many as six.

And, they had students fill out questionnaires. And, they followed up for two years!

So, do additional “cultural” field trips make a difference for these students?

As is so often the case, the answer is both “yes” and “no.”

So, for instance, attending the theater/symphony/museum did NOT make students more interested in going back.

And — despite Greene’s encouraging results — it did NOT make students more “empathetic.”

Specifically, students responded to statements like this one: “After seeing a play or a movie, I have felt as though I were one of the characters.”

Their responses did not vary depending on the number of field trips they went on.

At the same time, Team Watson did find some benefits.

Attending field trips did increase “social perspective taking.”

That is: students were asked questions like “When you are angry at someone, how often do you try to ‘put yourself in his or her shoes’?”

Students who attended more field trips put on those shoes more often.

Also, female students — although not males — demonstrated higher levels of conscientiousness after participating in more cultural field trips. (Alas, this change didn’t last as long as the others.)

The news isn’t all good, but it isn’t all bad either.

What About Tolerance?

The case of “tolerance” is especially tricky.

When Watson & Co. ran the numbers, these field trips didn’t affect the students’ answers to questions like:

“Some people have views you oppose very strongly. Do you agree that these people should be allowed to come to your school and give a speech?”

So, strictly speaking, field trips didn’t increase tolerance.

However, Watson speculates that the wording of these questions was especially complicated. And, she notes that the students in this cohort had relatively low reading scores.

So, perhaps, students really did feel more tolerant, but the question’s complexity masked that change.

Watson supports this hypothesis by noting that the students at the high end of reading comprehension DID have higher tolerance scores.

In other words: the students who understood the question better expressed higher levels of tolerance — and their answers might reflect the field trips’ true effects.

TL;DR

We now have even more research showing some SEL benefits for field trips to cultural events.

These trips…

… consistently help students “put themselves in someone else’s shoes,”

… might increase students’ tolerance of others’ opinions,

… enhance girls’ conscientiousness (albeit temporarily).

We have less certain evidence that field trips enhance empathy, or that they encourage students to attend more cultural events.

Equally important: having MORE studies on one topic allows us to contemplate these SEL benefits with greater nuance and sophistication.


* To be precise, Watson’s study is a “working paper.” It hasn’t been peer reviewed in an academic journal.


Watson, A., Greene, J., Holmes Erickson, H., & Beck, M. I. (2019). Altered attitudes and actions: Social-emotional effects of multiple arts field trips.