Andrew Watson – Education & Teacher Conferences

About Andrew Watson

Andrew began his classroom life as a high-school English teacher in 1988, and has been working in or near schools ever since. In 2008, Andrew began exploring the practical application of psychology and neuroscience in his classroom. In 2011, he earned his M. Ed. from the “Mind, Brain, Education” program at Harvard University. As President of “Translate the Brain,” Andrew now works with teachers, students, administrators, and parents to make learning easier and teaching more effective. He has presented at schools and workshops across the country; he also serves as an adviser to several organizations, including “The People’s Science.” Andrew is the author of "Learning Begins: The Science of Working Memory and Attention for the Classroom Teacher."

Teaching with Images: Worth the Effort?
Andrew Watson

According to Richard Mayer’s “multimedia principle,”

People learn better from words and pictures than from words alone.

If that’s true, then we should — obviously — be sure to include pictures in our teaching.

However…

Whenever we see a broad principle like that, we should always look for specific limitations.

That is…

… does this principle apply to kindergarteners as well as 5th graders and adult learners?

… does it apply for students with an ADHD diagnosis?

… is it true when teaching Civil War history, theorems about similar triangles, and bunting strategies?

And so forth.

Researchers call such limits “boundary conditions,” and we should ALWAYS look for boundary conditions.

So, let’s look at that broad principle (“pictures + words” > “words”) and ask this boundary question:

Does the content of the picture matter?

Possibilities and Perils

Happily, one of the people asking that question is…Richard Mayer himself.

In his career, he’s come up with a whole suite of useful principles. And, he spends lots of time looking for boundary conditions.

Specifically, in a usefully straightforward study, he and Eunmo Sung examined several different kinds of images:

Instructive images: “directly relevant to the instructional goal.”

I’m teaching Macbeth right now, and focusing on the play’s tension between order and chaos. So, I might show students a picture of Scotland’s craggy wildernesses (chaos) and one of a highly structured royal ceremony (order).

Seductive images: “highly interesting but not directly relevant to the instructional goal.”

A movie version of Macbeth — starring Denzel Washington and Frances McDormand — just came out. I could show my students a picture of these two movie stars on the Red Carpet at an Oscar ceremony.

Decorative images: “neutral but not directly relevant to the instructional goal.”

Macbeth can be a grim play: so much beheading, so much unseaming. So: I could include pictures of waterfalls and sunrises on my handouts to raise my students’ spirits a bit.

Once we start exploring these potential boundary conditions — perhaps not all images benefit learning equally — we might get even more useful guidance about combining words and images.

Predictions and Results

Sung and Mayer measured the effects of such images on students’ learning AND on their enjoyment of a lesson.

Take a moment to make some predictions on your own.

Which, if any, of those graphics will help students learn more?

Which, if any, will help students enjoy the lesson more?

[I’ll pause while you think about those questions.]

 

 

Perhaps you, like Sung and Mayer, predicted that ALL the images would increase students’ enjoyment.

And perhaps you predicted that the INSTRUCTIVE images would help students learn, but not the others.

Sure enough, you and they were right. Students LIKE images, but LEARN FROM images that focus their attention on the learning goal. (If you’re interested in the specific numbers, look at the 6th page of the study.)

We should, I think, focus on this key finding: students do not always learn more when they enjoy a lesson more.

We shouldn’t deliberately make our lessons dull.

But: we shouldn’t assume that an enjoyable lesson necessarily results in more learning. In this case, those photos of Macbeth movie stars piqued my students’ curiosity and interest, but didn’t help them learn anything about the play.

Three Final Points

First: the benefits of dual coding have gotten lots of attention in recent years.

To get those benefits, we should remember these boundary conditions. Dual coding helps if — and only if — the images highlight the learning goal.

Second: a recent meta-analysis about “seductive details” nicely complements this study.

Third: Like many teachers, I see the good and the vile in Twitter.

Yes (YES!!), it can be a sink of repulsive yuckiness.

And (surprise!!), it can also be supportive and helpful.

I bring up this point because: a wise soul on Twitter mentioned this Sung & Mayer study recently, and reminded me of its importance.

I can’t remember who brought it up (I would credit that tweep if I did), but I’m grateful for the nudge.

Such useful research! Such helpful guidance!


Sung, E., & Mayer, R. E. (2012). When graphics improve liking but not learning from online lessons. Computers in Human Behavior, 28(5), 1618–1625.

Let’s Get Practical: How Fast Should Videos Be?
Andrew Watson

Research often operates at a highly abstract level.

Psychologists and neuroscientists study cognitive “tasks” that stand in for school work. If we’re being honest, however, we often struggle to see the connection between the research task and actual classroom learning.

HOWEVER…

Every now and then, a study comes along that asks a very practical question, and offers some very practical answers.

Even better: it explores the limits of its own answers.

I’ve recently found a study looking at this (incredibly practical) question:

Because students can easily play videos at different speeds, we need to know: which video speed benefits learning the most?

So: what advice should we give our students about learning from videos?

Exploring The Question

Let’s start with a specific example:

If a student watches a video at double speed, she (obviously) spends only half as much time mentally interacting with its information.

Does that reduction in time lead to an equal reduction in learning? Will she learn half as much as if she had watched it at regular speed?

Dr. Dillon Murphy starts with that question, and then quickly gets interested in crucial related questions:

What about other video speeds? That is: what about watching the video at 1.5x speed? What about 3x speed?

Does the topic of the video matter?

And, here’s a biggie: what should students do with the time they save?

Even before we look at the results of this study, I think we can admire its design.

Murphy’s team ran multiple versions of this study looking at all these different variables (and several others).

They did not, in other words, test one hypothesis and then — based on that one test — tell teachers what to do. (“Best practices require…”)

Instead, they invited us into a complex set of questions and possibilities.

Maybe 1.5x is the most efficient speed for learning.

Maybe 3x is the best speed if students use the time they saved to rewatch the video.

Maybe regular speed is best after all.

Because Murphy’s team explores so many possibilities with such open-minded curiosity, we have a MUCH better chance of figuring out which results apply to us. *

The Envelope Please

Rather than walk you through each of the studies, I’ll start with the study’s overall conclusions.

First: watching videos at higher speeds does reduce learning, but not as much as you might think.

That is: spending half as much time with the video (because a student watched it at double speed) does NOT result in half as much learning.

To be specific: students watched ~14-minute videos (about real-estate appraisals, or about Roman history).

A week later, those who watched them at regular speed scored a 59% on a quiz. Those who watched at 2x speed scored a 53%.

59% is higher than 53%, but it’s not twice as high. **

Second: students can use that “saved” time productively.

What should a student do with the 7 minutes she saved? She’s got two helpful choices.

Choice 1: rewatch the video right away.

Students who used their “saved” time to rewatch the video right away recaptured those “lost” points. That is: they had the same score as students who watched the video once at regular speed.

Choice 2: bank the time and rewatch the video later.

In another version of the study, students who watched the 1x video once scored a 55% on a quiz one week later.

Other students watched the 2x video once, and then once again a week later. They scored a 63% on that quiz. (For stats types, the d value is 0.55 — a number that gets my attention.)

In other words: rewatching at double speed a week later leads to MORE LEARNING in THE SAME AMOUNT OF TIME (14 minutes).
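For stats types who want to see where a d value like the 0.55 above comes from: Cohen’s d is just the difference between two group means divided by their pooled standard deviation. Here’s a minimal Python sketch. The means echo the quiz scores in this post, but the standard deviations and group sizes are hypothetical stand-ins (chosen so the result lands near 0.55), not numbers from Murphy’s study.

```python
# Illustrative sketch only: Cohen's d expresses the gap between two group
# means in standard-deviation units. SDs and group sizes below are
# hypothetical; only the means (63 and 55) come from the post.

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Effect size using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (mean1 - mean2) / pooled_var**0.5

# Hypothetical: 63% vs. 55% quiz averages, SD ~14.5 points, 50 students each
d = cohens_d(63, 14.5, 50, 55, 14.5, 50)
print(round(d, 2))
```

Rough benchmarks: a d near 0.2 is usually called small, 0.5 medium, and 0.8 large — which is why 0.55 “gets my attention.”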

Practical + Practical

Murphy takes great care to look at specific combinations.

His example encourages us to take care as well. For instance:

His team worked with college students. Will this result hold for 8th graders, or 2nd graders?

You can look to your teacherly experience and judgment to answer that question.

Will this effect hold for longer videos: 30 minutes, or one hour?

We don’t know yet.

These videos included a talking head and slides with words — but not closed captions. Will some other combination (no talking head? closed captions on?) lead to different results?

We don’t know yet.

In other words: Murphy’s study gives us practical guidance. We should use our judgment and experience to apply it to our specific teaching circumstances.


* I should note: This study is unusually easy to read. If the topic interests you, you might look it over yourself.

** Important note: I’ve seen news reports about this study saying that watching once at double speed results in the same amount of learning as watching once at regular speed. That claim is untrue. And: Murphy’s study does not make that claim.

Murphy, D. H., Hoover, K. M., Agadzhanyan, K., Kuehn, J. C., & Castel, A. D. (2021). Learning in double time: The effect of lecture video speed on immediate and delayed comprehension. Applied Cognitive Psychology.

The Benefits of Direct Instruction: Balancing Theory with Practice
Andrew Watson

When teachers hear that “research shows we should do X,” we have at least two broad questions:

First Question: what’s the research?

Second Question: what EXACTLY does X look like in the classroom?

People who have the expertise to answer the first question (researchers) might not have the K-12 classroom experience to answer the second question.

And, of course, people who can make it work in the classroom (teachers) might not know or understand the research.

Wouldn’t it be great if we could find one book that answers both sets of questions?

In fact, it would be especially great if that book focused on a controversial topic. In that case, we could see a complete argument – both the why and the how – before we make a judgment about the controversy.

Does that sound tempting? I have good news…

Embracing Controversy

A feisty battle has raged in edu-circles for many years now: “direct instruction” vs. “constructivist pedagogy.” *

In one corner, “constructivists” argue that problems or projects or independent inquiries help students discover and build enduring understanding. And, such exploration fosters authentic motivation as well.

In the other corner, “direct instruction” advocates argue that working memory limitations sharply constrain students’ cognitive workspace. For that reason, teachers must explicitly shape learning experiences with small steps and carefully-designed practice.

Both approaches can be – and frequently are – parodied, misunderstood, and badly practiced. So, a book explaining the WHY (research) and the HOW (classroom practice) would be greatly helpful.

Sage on the Page

Adam Boxer teaches chemistry at a school in London, and has been blogging about his work for some time now. (If you follow our twitter account, @LearningandtheB, you’ve seen links to his work before.)

In his book Explicit & Direct Instruction: An Evidence-Informed Guide for Teachers, Boxer gathers eleven essays that explain the research background and then get SUPER specific with classroom suggestions.

In the first chapter, Kris Boulton tells the history of “Project Follow Through,” a multi-decade program to discover the best way of teaching children.

Researchers tracked more than 200,000 children in 13 different programs over several years, and compared their learning across three dimensions: basic skills, cognitive skills, and affective skills.

Which approach proved most effective?

Direct Instruction, created by Siegfried Engelmann.** It was, in fact, the only program of the 13 that benefitted students in all three dimensions.

When advocates of Direct Instruction (and direct instruction) insist that research shows its effectiveness, they reasonably enough point to Project Follow Through. (Can others critique this study? Of course…)

Both Boulton and Greg Ashman (in the second chapter) then emphasize the alignment of direct instruction with psychology models: cognitive load theory, schema theory, and so forth.

In brief: we’ve got LOTS of research explaining why direct instruction should work, and showing that it does work.

Let’s Get Practical

After Boulton and Ashman explain the why, the next several chapters deliver on the classroom how.

For me, the book’s great success lies in the number, variety, and specificity of these chapters.

What does direct instruction look like for teaching math?

How about science?

How about writing?

What’s the best number of examples to use?

And so forth.

I especially enjoyed Sarah Cullen’s chapter on fading. Cullen begins with an important question/critique:

How, then, can a teaching method that so depends on instruction – on teachers leading learning and controlling the content to which pupils are exposed – foster autonomy?

Her answer focuses on having scaffolds and removing scaffolds – aka, “fading.”

In particular, Cullen wisely conceptualizes fading over many different time spans: fading across grades (which requires planning across years), fading within a term’s curriculum (requiring planning across months), and fading within a lesson (requiring skill, insight, and practice).

Like the book’s other chapters, Cullen’s offers many specific examples for each of her categories. In other words, she grounds theoretical understanding in highly specific classroom realities.

In Brief

If you already think direct instruction sounds right, you’ll be glad to have a how-to guide.

If you think it sounds suspect (or even oppressive), you’ll be glad to read a straightforward explanation of the research behind the approach. (You might not be persuaded, but you’ll understand both sides of the argument more clearly.)

And, if you want realistic classroom examples explained with loving detail, this book will launch 2022 just right.


* I’ve put those labels in quotation marks because both are familiar, but neither one really works.

** Direct Instruction (with capital letters) is the name of Engelmann’s specific program. On the other hand, direct instruction (without capital letters) is a broader approach to thinking about teaching and learning.

The Best Kind of Practice for Students Depends on the Learning Goal
Andrew Watson

In some ways, teaching ought to be straightforward. Teachers introduce new material (by some method or another), and we have our students practice (by some method or another).

Result: THEY (should) LEARN.

Alas, both classroom experience and psychology/neuroscience research suggest that the process is MUCH more complicated.

For instance:

When we “introduce new material,” should we use direct instruction or more of an inquiry/problem-based pedagogy? *

When we “have our students practice,” what’s the very BEST kind of practice?

Around here, we typically offer two answers to that 2nd question: retrieval practice and interleaving.

Retrieval practice has gotten lots of love on this blog — for instance, here. I have written less about interleaving, mostly because we have less research on the topic.

But I’ve found some ripping good — and very practical — research to share here at the end of 2021.

“What?,” “Why?,” and Other Important Questions

Let’s start with definitions.

Let’s say I teach a particular topic today: “adjectives.” And tomorrow I teach “adverbs.” Next day, “prepositions.” Next: “coordinating conjunctions.”

How should I structure students’ homework?

They could do 20 adjective practice problems tonight. Then 20 adverb problems the next night. Then 20 prepositions. And so forth.

Let’s call that homework schedule blocking.

Or, they could do 5 adjective problems a night for the next 4 nights. And 5 adverb problems a night starting tomorrow night. And so forth.

If I go with this system, students will practice multiple different topics (adjectives, adverbs, prepositions…) at the same time. So, let’s call that homework schedule interleaving.
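To make the two homework schedules concrete, here’s a minimal Python sketch. It’s a simplification: the topic names and problem counts echo the example above, but it ignores the staggered start dates (in the post, adverb practice begins only after adverbs are taught).

```python
# Sketch of the two homework schedules described above (simplified:
# every topic starts on night 1, ignoring the staggered introductions).

topics = ["adjectives", "adverbs", "prepositions", "coordinating conjunctions"]

def blocked(topics, per_night=20):
    """One topic per night: 20 adjective problems tonight,
    20 adverb problems tomorrow, and so on."""
    return [{topic: per_night} for topic in topics]

def interleaved(topics, nights=4, per_topic=5):
    """Several topics per night: 5 problems from each topic, every night."""
    return [{topic: per_topic for topic in topics} for _ in range(nights)]

print(blocked(topics)[0])      # night 1 under blocking: adjectives only
print(interleaved(topics)[0])  # night 1 under interleaving: a mix of all four
```

Either way, students do 80 problems across four nights; only the mixing differs.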

For the most part, when we compare these two approaches, we find that interleaving results in more learning than blocking. (Lots of info here. Also in this book.)

That’s an interesting conclusion, but why is it true?

In the first place, probably, interleaving is a desirable difficulty. Students must THINK HARDER when they interleave practice, so they learn more.

In the second place, well, we don’t exactly know. Our confusion, in fact, stems in part from an arresting truth: interleaving usually helps students learn, but not always.

Of course, NOTHING ALWAYS WORKS, so we’re not fully surprised. But if the exceptions helped explain the rule, that could be mightily helpful…

An Intriguing Possibility…

Two scholars — Paulo F. Carvalho and Robert Goldstone — have been studying a potential explanation.

Perhaps blocking and interleaving enhance different kinds of memories. And so, research produces contradictory results because researchers use different kinds of memory tests.

Specifically, they propose that:

During blocked study, attention and encoding are progressively directed toward the similarities among successive items belonging to the same category,

whereas during interleaved study attention and encoding are progressively directed toward the differences between successive items belonging to different categories.

In other words: blocking focuses students on the properties of a particular category (“adjectives”). Interleaving focuses students on the distinctions among different categories (“adjectives, adverbs, prepositions”).

And so: if I want students to DEFINE ONE topic or idea or category (“adjectives”), blocking will help them do that well.

If I want students to COMPARE/CONTRAST MANY topics or ideas or categories, interleaving will help them do that well.

To repeat the title of this blog post: “the best kind of practice for students depends on the learning goal.”

In their most recent study, Carvalho and Goldstone test this possibility.

Sure enough, they find that students who block practice do better at defining terms, whereas those who interleave practice do better at multiple-choice questions.

The study gets splendidly intricate — they work hard to disprove their own hypothesis. But once they can’t do so, they admit that they just might be right.

Caveats and Classroom Implications

Caveat #1: “one study is just one study, folks.” (Dan Willingham.)

Although, to be fair, Carvalho and Goldstone have been building a series of studies looking at this question.

Caveat #2: The researchers worked with adults (average age in the 30s) studying psychology topics.

Does their conclusion hold true for K-12 students learning K-12 topics? Maybe…

Caveat #3: Practically speaking, this research might focus on a distinction that evaporates over time.

In truth, I always want my students to know specific definitions — like “tragedy” — well. And, I want them to compare those well-known definitions flexibly to other definitions — like, say, “comedy.”

As an English teacher, I — of course! — want my students to define adjective. AND I — of course!! — want them to compare that definition/concept to other related ideas (adverbs; participles; prepositional phrases acting as adjectives).

In other words, I suspect the ultimate teaching implication of this research goes like this:

We should have students BLOCK practice until they know definitions to some degree of confidence, and then have them INTERLEAVE practice to bring those definitions flexibly together.

To be clear: I’m extrapolating, based on my classroom experience and on my reading in this field.

Until my interpretation gets more research behind it, Carvalho and Goldstone’s research suggests this general plan:

START BY DECIDING ON THE GOAL.

If you mostly want your students to know individual concepts, have them block their practice.

If you mostly want them to bring several topics together, have them interleave practice.

As your goal changes, their homework changes too.

As is so often the case, this research doesn’t tell teachers what to do. It helps us think more clearly about the work we’re doing.

In my view, that’s the most helpful research of all.


* I think that’s a false choice; both approaches make sense under different circumstances. More on that in another blog post.


Carvalho, P. F., & Goldstone, R. L. (2021). The most efficient sequence of study depends on the type of test. Applied Cognitive Psychology, 35(1), 82–97.

When Does Technology Distract Students? The Benefits of Research that Contradicts My Beliefs
Andrew Watson

I spoke with several hundred students last week about research-based study strategies.

As always, students were fascinated to hear about psychology and neuroscience research: for instance, the benefits of retrieval practice.

And, as always, they did not love my alarming news about multi-tasking. My advice goes like this:

“If you want to study less and learn more, do one thing at a time.”

No insta-snap-gram-tweet-flix-chat-tok while you’re studying. One thing at a time.

Since that talk, I’ve found some interesting research about the LIMITS of that advice, so I thought I’d share it here.

Tech Problems, Tech Benefits

Our reasons to worry about technology use during class seem perfectly obvious. If I am paying attention to my tweets, I am not paying attention to my academic work.

Divided attention = less learning. Obviously.

At the same time, we can easily see ways that technology benefits learning.

If — during a lecture — students text one another to reinforce their understanding of the material (“What did the prof just say?”), they might solidify their learning.

If they look up complementary information on the interwebs, their quest might boost their comprehension. (I’ve been surprised how often my students want to do this in class, and I occasionally allow them to do so.)

So, we need a more precise question than “is technology good or bad?” We need to know — under what precise circumstances does it help? Or hurt?

Technology and Higher Order Thinking

For instance: does off-topic texting during a lecture interfere with both “lower order” and “higher order” thinking, as defined by Bloom?

And, by the way, what role does note-taking play?

A study from 2018 explores this question.

The details, of course, get complicated, but the short version goes like this. Students watched a 24-minute lecture about psychiatric diagnoses: PTSD, ADHD, OCD, and so forth. They also took notes.

Some students received and answered off-topic texts during the lecture — one about every 2 minutes.

After the lecture, students took a test.

Some of those test questions focused simply on recalling details: “How long must an individual display symptoms of PTSD in order to be diagnosed?”

The researchers designed these questions to measure knowledge and comprehension — that is, “Bloom’s levels 1 & 2.”

Four questions, however, focused on deeper understanding: “Compare PTSD and ADHD. How do these disorders differ? Are there ways in which they are similar?”

That is: these questions aimed to measure application and analysis: Bloom’s levels 3 & 4.

So: what effect did the OFF-TOPIC TEXTS have on the students’ NOTES, and on their UNDERSTANDING?

The Envelope Please

The researchers’ results surprised them — and certainly surprised me.

Unsurprisingly, students distracted by texts took less complete notes.

And, also unsurprisingly, they did substantially less well on the factual questions. Texters averaged a 62 on those questions, while non-texters averaged a 71.  (If you speak stats, the Cohen’s d was 0.64. That’s an impressively large difference, at least to me.)

Here’s the surprise: researchers did NOT find a statistically significant difference between students’ scores on the application and analysis questions.

How do we explain this finding?

First: let’s admit the possibility that texting distractions do interfere with factual recall but not analysis.

Honestly, I would not have anticipated that finding, but it could be true.

Second: perhaps the timing matters. That is: these students took the test immediately after the lecture. Perhaps application and analysis — unlike mere factual recall — require more time for steeping.

That is, if the “higher order thinking skills” had been tested the next day, perhaps we would have seen a difference in those scores.

Third: perhaps the number of questions mattered. Because the researchers asked only 4 application/analysis questions, they might not have had enough data to discern a difference.

Perhaps a greater number of questions would have revealed a difference.

The Big Picture(s)

Based on this research, will I tell students “it’s okay to text during lectures”?

NO I WILL NOT.

Here’s why:

First, facts matter. If off-topic texting interferes with factual learning, that finding itself means that texting during lectures is bad.

Second, taking notes properly (almost certainly) matters. If texting  interferes with good note-taking, that finding itself should dissuade students from doing so.

Third, I’m willing to believe that texting doesn’t interfere with application/analysis, but only if other studies — with more questions and later tests — consistently demonstrate that result.

Another point also jumps out at me from this research. This study contradicts my firmly held belief that multitasking vexes learning.

I genuinely believe that IT’S A GOOD THING when research contradicts my firmly held beliefs.

If research never contradicted my beliefs, then I would never learn anything from it.

In fact, I would never need to look at research because it shows me only what I already know.

Research might prove most useful to us when it contradicts our beliefs.

Who knows, maybe I’ll go back to those students and update my advice…


Waite, B. M., Lindberg, R., Ernst, B., Bowman, L. L., & Levine, L. E. (2018). Off-task multitasking, note-taking and lower- and higher-order classroom learning. Computers & Education, 120, 98–111.

Why I Still Love Learning and the Brain Conferences
Andrew Watson

I attended my first Learning and the Brain in 2008; I believe the topic was “The Science of Attention.”

Since then, I’ve attended at least two dozen: in New York, Chicago, Washington, San Francisco. Discussing Stress, and Memory, and Ethics, and Technology. And, of course, learning.

At some point, you might reasonably think, I’d get tired of the handouts and the slides and the coffee.

But, no: I still can’t get enough.

Reason #1: Old Friends

Being an interdisciplinary endeavor, the field of Mind, Brain, and Education is dramatically large…and comfortably small. When you come back, you start recognizing folks right away.

John Almarode presents regularly (and, with his bow tie, vivaciously) about applying cognitive science to the classroom. A post-presentation chat with John is one of the great learning experiences you’ll ever have.

I met Sarah Flotten — currently the interim Director of the Peter Clark Center for Mind Brain Education — through a friend several years ago. It’s now an annual event to catch up with her insights, her school, and her center.

Joanna Christodoulou (a former professor of mine) combines knowledge of neuroscience, knowledge of reading, and enthusiasm so compellingly that I’m still learning from her. I get to catch up with her every year or so at LatB.

This list could go on at length: Pooja Agarwal and Ellen Anderson, and even David Daniel (who doesn’t like it when I mention him in the blog).

In brief: if you want to find colleagues who think the way you do about teaching and learning, you’ll find them here. Even better: you’ll build relationships and coalitions that grow over the years.

Reason #2: New Friends

Once you enter the world of Mind, Brain, and Education — on this blog, on twitter, at the conferences — you’ll start meeting people from (literally) across the globe.

At this most recent conference in Boston, I FINALLY got to meet people I’ve been online chatting with for years.

Beth Hawks (twitter handle @PhysicsHawk) — a science teacher, who blogs here — offers a rare twitter presence. She is encouraging, wise, well-informed, and unwilling to be bamboozled by uplifting-but-empty slogans. I’ve been liking her posts for years, and got to meet her in Boston.

Kristin Simmers (@KristinASimmers) — currently studying the intersection of neuroscience and education — reached out to me about my first book AGES ago, and we’ve been in e-conversation ever since. Perhaps 2 years after that first e-exchange, we got to have lunch at the conference. Where else would I get to meet her in person?

Your MBE colleagues are out there — sometimes a continent away. You can meet them at the conferences.

Reason #3: SO MUCH TO LEARN

Of course, depending on your interests, this could be reason #1.

Even after 14 years, I still have so much to learn in this field. The speakers explore their research and insights — challenging me (and each other) to rethink settled ideas in light of new information.

For instance: on the very first day of the Boston conference, two speakers (politely, curiously) squared off on this important question: can we use conscious strategies to respond to stressful environments?

If the answer is “yes,” then we can guide our students (and our colleagues, and ourselves) down one path.

If the answer is “no” — because “stress turns off the pre-frontal cortex” — then we need a different path entirely.

What’s the correct answer? Honestly: check out Judson Brewer and Bessel van der Kolk to see whose analysis you find more persuasive.

https://www.youtube.com/watch?v=gv-CmqMecVY

https://www.youtube.com/watch?v=d_YApSkqsxM

The best place I know to hear these debates and have these conversations: Learning and the Brain.

Beyond FOMO

If you’re worried that you’ve missed out, I’ve got good news: the schedule for the February Conference in San Francisco has been posted!

Is “Cell Phone Addiction” Really a Thing?
Andrew Watson

I recently read a tweet asserting “the fact that cell phones are proven to be as addictive as drugs.”

Of course, people casually use the word “addictive” about all sorts of things: chocolate, massages, pumpkin-spice lattes. (No doubt somewhere Twitter is being described as “addictive.” My mother tells me that this blog is addictive.)

But all that casual language doesn’t add up to “proving the fact” that cell phones are “as addictive as drugs.” So I started wondering: has this “fact” been “proven”?

Good News, Bad News (Good News, Meh News)

Over the years I’ve adopted a simple strategy. When someone makes a factual claim about research, I ask for the research.

In this case, I simply asked the tweep for the research behind the claim.

Good news: He answered!

I’ve been amazed over the years how often people make “research-based” claims and then refuse to provide research to support them.

In this case, he did in fact point toward research on the topic. (I found one website claiming that 16% of adolescents, and 6.3% of the population, are addicted; alas, no sources cited. Happily, they do provide treatment…)

The tweep’s willingness to cite evidence enhances his credibility. Let’s check it out…

Bad news: hmm. “Evidence,” in this case, means “links to newspaper articles.” Generally speaking, USA Today and Vice.com aren’t the best places to find research. Research is published in journals. (Heck, I’ve even criticized the New York Times for its research credulity.)

So: the tweep’s credibility clicks down slightly.

Good news: All three of the links do, in fact, point to underlying research! I didn’t get a direct connection to the promised research, but I can keep digging to find it.

Credibility clicks back up.

Meh news: it turns out that all three articles point to the same underlying research. That is: I didn’t find three studies supporting the claim that “cell phones are proven to be as addictive as drugs”; I got one.

Now: one study isn’t nothing. But [checks notes] one isn’t three.

This Just In: Correlation Isn’t…

Given how much is riding on this one study, let’s check it out.

First off, we can see right there in the title that the study focuses on correlation. As you’ve no doubt heard dozens (thousands?) of times, “correlation isn’t causation.”

In this case, the authors asked 48 people questions about their cell-phone usage. Based on their answers, they categorized some of those people as “addicted.” And they then found brain differences between the “addicted” and “not addicted” people.

This quick summary leads to several concerns.

First: one study of 48 people doesn’t “prove a fact.” It might be an interesting data point, but that’s all.

Second: this study doesn’t claim to “prove a fact.” Using a questionnaire, it DEFINES some folks as addicted and others as not addicted.

Third: “brain differences” always seems like a big deal, but trust me — they might not be.

People who throw the javelin probably have a different “average muscular profile” than people who run marathons, because they’re using different muscles.

People who play the piano probably have different neural profiles than people who dance ballet, because they’re spending more time using THIS part of the brain than THAT part.

It seems likely that people who score high on this “cell-phone addiction” questionnaire behave differently than those who don’t; so it’s not dramatically surprising that their brains are different.

Did the phone use cause the brain differences, or did the brain differences cause the phone use? We don’t know. (Because, “correlation isn’t …”)

Important to Note

One interesting point does jump out. The brain differences found by this research team do — in some ways — align with plausible predictions about addiction.

Now, the researchers don’t make strong claims here: the word “drugs” appears only once in the body of the study.

This finding isn’t a big surprise to me. Very roughly, the brain differences have to do with “our ability to control what we pay attention to.” It’s not hugely surprising that heavy cell-phone users have brain differences there (and that people addicted to drugs do too).

Don’t Stop Now

If the tweep’s study doesn’t support the claim that “cell phones are proven to be addictive,” does other research?

To answer that question, I did a simple Google search (“cell phone addiction real”). The first scholarly article that pops up says…not so much.

Here’s their summary:

Although the majority of research in the field declares that smartphones are addictive or takes the existence of smartphone addiction as granted, we did not find sufficient support from the addiction perspective to confirm the existence of smartphone addiction at this time.

The behaviors observed in the research could be better labeled as problematic or maladaptive smartphone use and their consequences do not meet the severity levels of those caused by addiction.

In brief: “maladaptive,” yes; “addictive,” no.

As I continued clicking, I found other skeptical reviews (for instance, here), and also found some that embrace the category (with some open questions, here).

Oh, and, by the way: “cell phone addiction” isn’t included in the DSM-5.

In other words, I think we can reasonably describe the category of “cell phone addiction” as an active scholarly debate. To be clear, this conclusion means we can’t reasonably describe it as “a proven fact.”

Why I Care

I am, believe it or not, open to the idea that cell phones might be addictive. If they are — if at some point research “proves that fact” — then this label might help us treat a real problem effectively.

My objection springs from another source entirely.

I worry when debate about measurable claims sinks to applying insulting labels.

If I think that asking students to memorize is a bad idea, I could study research on the topic. Or, I could dismiss it as “drill and kill.” The insulting label replaces the argument.

If I think that teacher talk is bad, I could study research on the topic. Or, I could mock it as “sage on the stage.” The dismissive label replaces the argument.

If I think that cell-phone usage is bad for teens, I could study research on the topic. Or, I could call it “an addiction.” The alarming label replaces the argument.

If we’re going to rely on research to make decisions about teaching and education (which is, after all, the GOAL of our organization) we should never replace research with labels.

Instead, let’s try something else. Let’s replace labels with research…


Horvath, J., Mundinger, C., Schmitgen, M. M., Wolf, N. D., Sambataro, F., Hirjak, D., … & Wolf, R. C. (2020). Structural and functional correlates of smartphone addiction. Addictive Behaviors, 105, 106334.

Panova, T., & Carbonell, X. (2018). Is smartphone addiction really an addiction? Journal of Behavioral Addictions, 7(2), 252-259.

Billieux, J., Maurage, P., Lopez-Fernandez, O., Kuss, D. J., & Griffiths, M. D. (2015). Can disordered mobile phone use be considered a behavioral addiction? An update on current evidence and a comprehensive model for future research. Current Addiction Reports, 2(2), 156-162.

Gutiérrez, J., Rodríguez de Fonseca, F., & Rubio, G. (2016). Cell phone addiction: A review. Frontiers in Psychiatry, 7, 175.

The Best Way to Take Class Notes
Andrew Watson
Andrew Watson

Teachers often ask me: “how should my students take notes?”

That question typically springs from a heated debate. Despite all the enthusiasm for academic technology, many teachers insist on hand-written notes. (Long-time readers know: I have a provocative opinion on this topic.)

For the time being, let’s set that debate aside.

Instead, let’s ask a more important question: what kind of mental processing should my students do while they take notes?

If students get the mental processing right, then perhaps the handwriting/laptop debate won’t matter so much.

Possibilities and Predictions

To study complicated questions, we start by simplifying them. So, here’s one simplification: in class, I want my students to…

…learn specific facts, ideas, and procedures, and

…learn connections and relationships among those facts, ideas, and procedures.

Of course, class work includes MANY more complexities, but that distinction might be a helpful place to start.

So: should students’ note-taking emphasize the specific facts? OR, should it emphasize the connections and relationships?

The answer just might depend on my teaching.

Here’s the logic:

If my teaching emphasizes facts, then students’ notes should focus on relationships.

If my teaching emphasizes relationships, then their notes should focus on factual specifics.

In these cases, the note-taking strategy complements my teaching, ensuring that students think in both ways.

Of course, if both my teaching and students’ notes focus on facts, then mental processing of relationships and connections would remain under-developed.

In other words: we might want notes to be complementary, not redundant, when it comes to mental processing.

In fact, two researchers at the University of Louisville — Dr. David Bellinger and Dr. Marci DeCaro — tested such a prediction in recent research.

Understanding Circulation

Bellinger and DeCaro had college students listen to an information-heavy lecture on blood and the circulatory system.

Some students used guided notes that emphasized factual processing. This note-taking system — called “cloze notes” — includes a transcript of the lecture, BUT leaves words out. Students filled in the words.

Other students used guided notes that emphasized conceptual/relational processing. These notes — “outline notes” — organized the lecture’s ideas into conceptual hierarchies, which the students filled out.

And, to be thorough, Bellinger and DeCaro used both “more challenging” and “less challenging” versions of these note systems. As you can see, examples A and B above leave much larger blanks than examples C and D.

So, which note-taking system helped students more?

Because the lecture was “information heavy,” a note-taking system that highlights facts (the “cloze notes”) would be “redundant,” while a system that highlights conceptual relationships (the “outline notes”) would be “complementary.”

That is: students would get facts from the lecture, and see relationships highlighted in the outline notes.

For this reason, Bellinger and DeCaro predicted that the outline notes would help more in this case.

And, sure enough, students remembered more information — and applied it more effectively — when they used the challenging form of the outline notes.

Classroom Implications

Based on this study, do I recommend that you use outline notes with your students?

NO, READER, I DO NOT.

Remember, the “outline notes” worked here because (presumably) they complemented the factual presentation of the lecture.

If, however, the lecture focused more on relationships and connections, then (presumably) “cloze notes” would help more. They would be “complementary.”

As is so often the case, I don’t think we teachers should DO what research says we should DO.

Instead, I think we should THINK the way researchers help us THINK.

In this case, I should ask myself: “will my classroom presentation focus more on facts, or more on relationships and connections?”

Honestly: that’s a difficult question.

In the first place, I lecture only rarely.

And in the second place, my presentations (I hope) focus on both facts and relationships.

But, if I can figure out an answer — “this presentation focuses on relationships among the characters” — then I should devise a complementary note system. In this case, “cloze notes” would probably help, because they highlight facts (and my presentation highlights connections).

In other words: this research — and the theory behind it — doesn’t offer a straightforward, simple answer to the question that launched this post: “how should my students take notes?”

Because learning is complicated, such a usefully intricate answer might be all the more persuasive.


Bellinger, D. B., & DeCaro, M. S. (2019). Note-taking format and difficulty impact learning from instructor-provided lecture notes. Quarterly Journal of Experimental Psychology72(12), 2807-2819.

What is “Mind, Brain, Education”? Defining the Undefinable…
Andrew Watson
Andrew Watson

Here at Learning and the Brain, we bring together psychology (the study of the MIND), neuroscience (the study of the BRAIN), and pedagogy (the study of EDUCATION).

That is: we bring together THREE complex fields, and try to make sense of their interactions, differences, and commonalities.

Such interdisciplinary work creates meaningful challenges.

In any one of those fields, scholars argue about basic definitions and concepts. So, you can imagine the debates that rage when all 3 disciplines come together. (Quick: what does the word “transfer” mean? Each field defines that word quite differently…)

So, who decides what “we” think in the field of MBE? What core beliefs hold us together, and how do we know?

One Answer: Ask Delphi

To solve this puzzle, Dr. Tracy Tokuhama-Espinosa, Dr. Ali Nouri, and Dr. David Daniel organized a “Delphi Panel.”

That is: they asked 100+ panelists to respond to several statements about the intersection of psychology, neuroscience, and education. (Full disclosure: I’m almost sure I was 1 of the 100 — but I don’t have specific memories of my contributions.)

They then crunched all those answers to determine a) the panelists’ points of agreement, and b) their enduring concerns about those points.

For instance, 95% of the panelists agreed with this statement:

Human brains are as unique as human faces. While the basic structure of most humans’ brains is the same (similar parts in similar regions), no two brains are identical. The genetic makeup unique to each person combines with life experiences and free will to shape neural pathways.

However, several participants disagreed with the inclusion of the phrase “free will” — including some who agreed with the statement overall.

This Delphi Panel method, in other words, BOTH looks for points of consensus, AND preserves nuanced disagreements about them.

21 Tenets, and Beyond…

So, what do “we” in the world of MBE believe?

The Delphi Panel supported 6 principles and 21 tenets across a wide range of topics: motivation, facial expression, tone of voice, sleep, stress, novelty, even nutrition. (91% of panelists agreed with the statement “NUTRITION influences learning. Basic nutritional needs are common to all humans, however, the frequency of food intake and some dietary needs vary by individual.”)

Taken all together, they add up to several Key Concepts — almost all of which matter to teachers who read this blog.

For instance:

Teachers should understand some basic definitions, and beware of some enduring neuromyths. (“Learning styles,” here’s looking at you.)

We should know that attention networks can improve, and so can executive functions. (I’m a little concerned about this last statement, as it implies false hopes about working memory training.)

Teachers should know that affect matters as much as cognition; that retrieval practice and spacing really work; that growth mindset is a thing; that interleaving helps.

Excellent Timing

In fact, several of this Delphi Panel’s conclusions align with our upcoming conference on Calming Anxious Brains (starting November 19).

For instance:

STRESS influences learning. However, what stresses one person and how may not stress another in the same way. (95% agreement)

ANXIETY influences learning. However, what causes anxiety in one person may not cause anxiety in another. (97% agreement)

In other words: our students aren’t little learning computers. Their emotional systems — when muddled by the stress and anxiety of Covid times — influence learning profoundly.

Teachers should attend to our students’ emotional lives not because of some misguided mushiness; instead, we do so because those lives can make learning much harder, or much more fluent and natural.

MBE research, and the Delphi Panel, say so.


As a bonus, here’s Dr. Tokuhama-Espinosa explaining the “The Difference between Mind, Brain and Education, Educational Neuroscience and the Learning Sciences”:

Changing the System: Where Do We Start?
Andrew Watson
Andrew Watson

I recently spent two hours talking with a group of splendid teachers from Singapore about Mindset Theory.

We talked about “charging” and “retreating.” We discussed “performance goals” and “learning goals.” Of course, “precise praise” merited lots of attention.

At the end of our session, several of their insightful questions focused on systemic change:

How can we help teachers (not just students) develop a growth mindset?

How can we change our grading system to promote GM goals?

What language should we use throughout the school to talk about learning and development?

These questions — and others like them — got me thinking:

We know that psychology and neuroscience research has so much to offer teachers, learners, and education. What systems should be in place to spread the word? 

Thinking Big

This question gets complicated quickly.

In the first place, teaching will (almost) always be INDIVIDUAL work taking place within a complex SYSTEM.

In some cases, we want teachers to have lots of freedom — say, to try out teaching strategies suggested by cognitive science.

In other cases, we want teachers to follow their school leaders’ guidance — say, when leaders follow wise psychology research.

How can we get that balance right?

  • In England, I believe, a national agency (OFSTED) has evaluation standards that apply to all schools and teachers.
  • France is in the process of creating a Council to vet research-based advice to schools and teachers. (LatB speaker Stanislas DeHaene is taking a leading role.)

In the US, of course, local control of schools makes such a system hard to imagine.

What might we do instead? What levers can we push?

I know of one organization — Deans for Impact — that focuses on teacher education.

Their logic makes great sense.

If we can ensure that teacher training programs incorporate cognitive science wisely, we can change the beliefs and practices of a generation of teachers.

Now THAT would — as they say — “move the needle.”

D4I has published a number of immensely useful summaries and reports. This one, for instance, briskly summarizes six core principles of learning: the research behind them, and their classroom implications.

Focus on Schools

Instead of teacher training, we might focus on schools as systems.

Eric Kalenze (blog here) has written a splendid book about creating a school within a school. What The Academy Taught Us doesn’t focus on cognitive science, but it does offer a chalk-in-hand view of building new systems from scratch.

In Kalenze’s telling, a supportive and inspiring principal created just the right combination to allow for meaningful change. (And a school district’s overly rigid policies brought this hopeful experiment to an end.)

I know of several independent schools that are doing exactly this work. The Center for Transformative Teaching and Learning at St. Andrew’s School has been guiding their faculty — and teachers across the country — for over a decade.

The Peter Clark Center for MBE at the Breck School and the Kravis Center for Excellence in Teaching at Loomis Chaffee (the school where I work) both do excellent work in this field.

Perhaps this “Center” model will spread widely throughout schools in the US. If so, these highly local “Deans for Impact”-like initiatives just might — gradually but powerfully — shape the future of teaching.

One By One

At the same time, my own experience suggests the importance of working teacher by teacher.

I attended my first Learning and the Brain conference in 2008. Inspired by the possibilities of combining psychology, neuroscience, and education, I began my own independent exploration.

Although I don’t run a school or supervise teachers, I’m able to spread the word — both as a classroom teacher, and in my work as a consultant (hello Singapore!).

And here’s where Learning and the Brain conferences continue to be so valuable.

The more individual teachers who attend — the more groups of teachers who pool together to share excitement and ideas — the more we can expand networks and create the movement we need.

Perhaps the best way to change the complex system is: one teacher at a time.

I hope you’ll join us in Boston in November!