Laptop Notes or Handwritten Notes? Even the New York Times Has It Wrong [Reposted]
Andrew Watson

You’ll often hear the claim: “research says students remember more when they take notes by hand than when they use laptops.”

The best-known research on the topic was done in 2014.

You’ll be surprised to discover that this conclusion in fact CONTRADICTS the researchers’ own findings. Here’s the story, which I wrote about back in 2018…


Here’s a hypothetical situation:

Let’s say that psychology researchers clearly demonstrate that retrieval practice helps students form long-term memories better than rereading the textbook does.

However, despite this clear evidence, these researchers emphatically tell students to avoid retrieval practice and instead reread the textbook. These researchers have two justifications for their perverse recommendation:

First: students aren’t currently doing retrieval practice, and

Second: they can’t possibly learn how to do so.

Because we are teachers, we are likely to respond this way: “Wait a minute! Students learn how to do new things all the time. If retrieval practice is better, we should teach them how to do it, and then they’ll learn more. This solution is perfectly obvious.”

Of course it is. It’s PERFECTLY OBVIOUS.

Believe It Or Not…

This hypothetical situation is, in fact, all too real.

In 2014, Pam Mueller and Dan Oppenheimer did a blockbuster study comparing the learning advantages of handwritten notes to laptop notes.

Their data clearly suggest that laptop notes ought to be superior to handwritten notes as long as students learn to take notes the correct way.

(The correct way is: students should reword the professor’s lecture, rather than simply copy the words down verbatim.)

However — amazingly — the study concludes:

First: students aren’t currently rewording their professor’s lecture, and

Second: they can’t possibly learn how to do so.

Because of these two beliefs, Mueller and Oppenheimer argue that — in their witty title — “The Pen Is Mightier Than the Keyboard.”

But, as we’ve seen in the hypothetical above, this conclusion is PERFECTLY OBVIOUSLY incorrect.

Students can learn how to do new things. They do so all the time. Learning to do new things is the point of school.

If students can learn to reword the professor’s lecture when taking notes on a laptop, then Mueller and Oppenheimer’s own data suggest that they’ll learn more. And yes, I do mean “learn more than people who take handwritten notes.”

(Why? Because laptop note-takers can write more words than handwriters, and in M&O’s research, more words lead to more learning.)

And yet, despite the self-evident logic of this argument, the belief that handwritten notes are superior to laptop notes has won the day.

That argument is commonplace in the field of psychology. (Here’s a recent example.)

Even the New York Times has embraced it.

The Fine Print

I do need to be clear about the limits of my argument:

First: I do NOT argue that a study has been done supporting my specific hypothesis. That is: as far as I know, no one has trained students to take reworded laptop notes, and found a learning benefit over reworded handwritten notes. That conclusion is the logical hypothesis based on Mueller and Oppenheimer’s research, but we have no explicit research support yet.

Second: I do NOT discount the importance of internet distractions. Of course students using laptops might be easily distracted by Twinsta-face-gram-book. (Like everyone else, I cite Faria Sana’s research to emphasize this point.)

However, that’s not the argument that Mueller and Oppenheimer are making. Their research isn’t about internet distractions; it’s about the importance of reworded notes vs. verbatim notes.

Third: I often hear the argument that the physical act of writing helps encode learning more richly than the physical act of typing. When I ask for research supporting that contention, people send me articles about 1st and 2nd graders learning to write.

It is, I suppose, possible that this research about 1st graders applies to college students taking notes. But, that’s a very substantial extrapolation, much grander than my own modest extrapolation of Mueller and Oppenheimer’s research.

And, again, it’s NOT the argument that M&O are making.

Before I believe that the kinesthetics of handwriting make an essential difference to learning, I want to see a study showing that the physical act of writing helps high school/college students who are taking handwritten notes learn more. Absent that research, this argument is even more hypothetical than my own.

Hopeful Conclusion

The field of Mind, Brain, & Education promises that the whole will be greater than the sum of the parts.

That is: if psychologists and neuroscientists and teachers work together, we can all help each other understand how to do our work better.

Frequently, advice from the world of psychology gives teachers wise guidance. (For example: retrieval practice.)

In this case, we teachers can give psychology wise guidance. The founding assumption of the Mueller and Oppenheimer study — that students can’t learn to do new things — simply isn’t true. No one knows that better than teachers do.

If we can keep this essential truth in front of psychology and neuroscience researchers, we can benefit the work they do and improve the advice they give.

“How to Study Less and Learn More”: Explaining Learning Strategies to our Students
Andrew Watson

Because cognitive science gives us such good guidance about learning, we want to share that information with our students.

“Study THIS WAY!” we cry. “Research says so!”

Alas, all too often, students don’t follow our advice.

A key part of the problem: the research that supports our advice is — ahem — really complicated and abstract. We might find it convincing, but our students’ eyes glaze over when we try to explain.

Because I frequently talk with students about brain research, I’m always on the lookout for research that…

… is methodologically sound,

… supports useful studying advice, and

… is easy to explain.

I’ve found such a study [updated link], and I think we can explain it to our students quite easily.

Two Are Better Than One

We all know the research showing that sleep helps consolidate long-term memory formation (fun studies here).

We all know the research showing that spreading practice out is better than doing it all at once (fascinating research here).

How about doing both? How about doing two study sessions, and sleeping in between them?

If we could convince our students to adopt those two strategies, that would be GREAT.

And, the research necessary to test that advice is — conceptually, at least — easy to do.

Students learned a topic: French-Swahili word pairs. (This research was done in France.)

Half of them did that at 9 am, and then tested themselves 12 hours later, at 9 pm. (Note: they did not sleep between these two sessions.)

How many times did these non-sleepers have to go through their flashcards to get all the answers right?

On average, they reviewed flashcards 5.8 times to get all those word pairs right. (For the sake of simplicity, let’s just call that 6.)

The other half learned the French-Swahili word pairs at 9 pm. They then got a good night’s sleep, and tested themselves 12 hours later, at 9 am.

How many times did the sleepers go through flashcards to get all the word pairs right? On average, they got them all right on the third attempt.

That’s right: instead of 6 review sessions, they needed 3.

Can We Do Better?

Okay, so far this study is easy to explain and shows real promise. Because they spread practice out AND slept, they cut study time IN HALF to get all the answers right.

But, so far this research measures learning 12 hours later. That’s not really learning. What happens if we test them later?

Specifically, what happens if we test them 6 months later?

Hold onto your hat.

When the researchers retested these students, the non-sleepers remembered 4 of those word pairs. The sleepers remembered 8 pairs.

So: HALF as much review resulted in TWICE as much learning 6 MONTHS later.

The Headline Please

When I talk with students about brain research, I start with this question: “Would you like to study less and learn more?”

I have yet to meet the student who doesn’t get behind that goal.

This easy-to-explain study shows students that half as much review leads to twice as much memory formation — if they both spread practice out over time and sleep between review sessions.

I think we have a winner.

The Limits of “Desirable Difficulties”: Catching Up with Sans Forgetica
Andrew Watson

We have lots of research suggesting that “desirable difficulties” enhance learning.

That is: we want our students to think just a little bit harder as they practice concepts they’re learning.

Why is retrieval practice so effective? Because it requires students to think harder than mere review.

Why do students learn more when they space practice out over time? Because they have to think back over a longer stretch — and that’s more difficult.

We’ve even had some evidence for a very strange idea: maybe the font matters. If students have to read material in a hard-to-read font, perhaps the additional effort/concentration involved will boost their learning.

As I wrote last year, a research team has developed a font designed for exactly that reason: Sans Forgetica. (Clever name, no?) According to their claims, this font creates the optimal level of reading difficulty and thereby could enhance learning.

However — as noted back then — their results weren’t published in a peer-reviewed journal. (All efforts to communicate with them go to their university’s publicity team. That’s REALLY unusual.)

So: what happens when another group of researchers tests Sans Forgetica?

Testing Sans Forgetica

Testing this question is unusually straightforward.

Researchers first asked participants to read passages in Sans Forgetica and similar passages in Arial. Sure enough, they rated Sans Forgetica harder to read.

They then ran three more studies.

First, they tested participants’ memory of word pairs.

Second, they tested memory of factual information.

Third, they tested conceptual understanding.

In other words, they were SUPER thorough. This research team didn’t just measure one thing and claim they knew the answer. To ensure they had good support behind their claims, they tested the potential benefits of Sans Forgetica in many ways.

So, after all this thorough testing, what effect did Sans Forgetica have?

Nada. Bupkis. Nuthin.

For example: when they tested recall of factual information, participants remembered 74.73% of the facts they read in Sans Forgetica. They remembered 73.24% of the facts they read in Arial.

When they tested word pairs, Sans Forgetica actually produced worse results: participants remembered 40.26% of the Sans Forgetica word pairs, but 50.51% of the Arial word pairs.

In brief, this hard-to-read font certainly doesn’t help, and it might hurt.

Practical Implications

First, don’t use Sans Forgetica. As the study’s authors write:

If students put their study materials into Sans Forgetica in the mistaken belief that the feeling of difficulty created is benefiting them, they might forgo other, effective study techniques.

Instead, we should encourage learners to rely on the robust, theoretically-grounded techniques […] that really do enhance learning.

Second, to repeat that final sentence: we have LOTS of study techniques that do work. Students should use retrieval practice. They should space practice out over time. They should manage working memory load. Obviously, they should minimize distractions — put the cell phone down!

We have good evidence that those techniques work.

Third, don’t change teaching practices based on unpublished research. Sans Forgetica has a great publicity arm — they were trumpeted on NPR! But publicity isn’t evidence.

Now more than ever, teachers should keep this rule in mind.

“Doing Science” or “Being a Scientist”: What Words Motivate Students?
Andrew Watson

Teachers often find that small changes in wording produce big benefits.

One recent example: a research team in New York explored the difference between “being a scientist” and “doing science.”

The first phrasing — “being a scientist” — might imply that scientist is a kind of fixed, exclusive identity. In the same way that dogs are dogs and can’t also be cats, so too young children might infer that people who are artists or athletes or authors can’t also be scientists.

The second phrasing — “doing science” — might clear away that rigidity. This classroom exercise is something we’re all doing. It doesn’t have immediate identity implications one way or another.

If this simple switch in phrasing helps motivate students, that would be the least expensive, least time-consuming intervention EVAH.

The Research

Three researchers prepared a science lesson about friction for pre-kindergarten students.

Half of the teachers (62) saw a training video that modeled specific language: “Today we are going to do science! The first part of doing science is observing with our senses.”

The other half (68) saw a similar video that didn’t include such modeling. (Researchers assumed that most teachers — without clear modeling — would use phrasing about ‘being a scientist’ rather than ‘doing science.’ Indeed, that’s what happened.)

Teachers then ran those friction lessons, where toy cars rolled down ramps with different surfaces: carpet, sandpaper, wrapping paper.

A few days later, these pre-K students had the chance to play a tablet-based video game that resembled their science experiment. The game was programmed in such a way that all students got the first round right (success!) and the second round wrong (struggle!).

So, how long did these children persist after struggle? And: did the “doing science” vs. “being a scientist” language matter?

The Results

Sure enough, students in the “do science” lessons persisted longer than those in the “be a scientist” lessons.

That is: when teachers spoke of science as an action we take, not an identity that we have (or don’t have), this subtle linguistic shift motivated students to keep going longer.

The effects, although statistically significant, were quite small.

Students in the “do science” lessons were 6% likelier to continue after they got one question wrong. And they were 4% likelier to keep going three problems later. (You read that right: six percent, and four percent.)

We might read these results and throw our hands up in exasperation. “Six percent! Who cares?”

My answer is: we ought to care. Here’s why.

Students experienced this linguistic change exactly once. It cost nothing to enact. It took no time whatsoever. Unlike so many educational interventions — pricey and time-consuming — this one leaves our scarcest resources intact.

Now: imagine the effect if students heard this language more than once. What if they heard it every time their teacher talked with them about science? (Or, art. Or, creativity. Or, math. Or, any of those things that feel like ‘identities’ rather than ‘activities.’)

We don’t (yet) have research to answer those questions. But it seems entirely plausible that this FREE intervention could have increasingly substantial impact over a student’s school career.

One Step More

In two ways, this research reminds me of Mindset Theory.

First: Dweck’s work has taken quite a drubbing in recent months. In some social media circles, it’s fashionable to look down on this research — especially because “the effects are so small.”

But, again: if one short mindset intervention (that is FREE and takes NO TIME) produces any effect — even a very small effect — that’s good news. Presumably we can repeat it often enough to make a greater difference over time.

I’m not arguing that promoting a growth mindset will change everything. I am arguing that even small boosts in motivation — especially motivation in schools — should be treasured, not mocked.

Second: this research rhymes with Mindset Theory. Although the researchers didn’t measure the students’ mindsets — and certainly didn’t measure any potential change in mindset — the underlying theory fits well with Dweck’s work.

That is: people who have a fixed mindset typically interpret success or failure to result from identity: I am (or am not) a “math person,” and that’s why I succeeded (or failed).

People with a growth mindset typically interpret success or failure to result from the quality of work that was done. If I work effectively, I get good results; if I don’t, I don’t.

So: this study considered students who heard that they should think about science as an identity (“being a scientist”) or as a kind of mental work (“doing science”). The results line up neatly with mindset predictions.

To Sum Up

First: small changes in language really can matter.

Second: encouraging students to “do this work” rather than “be this kind of person” can have motivational benefits.

Third: small changes in student motivation might not seem super impressive in the short term. But, if they add up over time, they might be well worth the very small investment needed to create them.

“Seductive Details”: When Do Cool Stories and Videos Interfere with Learning?
Andrew Watson

As an English teacher, I really do enjoy (almost) everything I teach. I love discussing Macbeth, and coaching strong writing, and parsing English grammar. (No, really!)

My students? Not so much with the enjoying.

They’re a good-natured lot, and so — for the most part — amiably play along with my enthusiasm. But, at times I feel I should enliven a topic with a story or a picture or a video.

If our Macbeth conversation slips into neutral, I might talk about an actor’s funny mistake during a recent production of The Crucible.

If they just can’t focus on the difference between a predicate nominative and a predicate adjective, I might start using my Godfather voice and ask them to show respect for the family. (They LOVE it when I do that…)

But here’s the question: do those funny/gross/intriguing side treks ultimately benefit or harm learning?

Seductive Details

In research world, we call these additions “seductive details.”

For instance, my lesson plan might focus on the geological forces that cause volcanoes to erupt. Past experience tells me that the lesson itself can be a bit of a slog, so I start it off with the story of Krakatoa’s eruption — which killed more than 35,000 people.

Maybe I break out some pictures of Vesuvius — an eruption that buried and preserved an entire Roman city.

In these cases, my stories enliven the general topic: “volcanoes are super deadly!” But they don’t add to the specific learning goal: geological forces that cause eruptions.

Such stories are “seductive,” but not — shall we say — “substantive” in their contribution to the lesson.

This topic gets lots of scholarly interest, and has led to many publications. Quite recently, Dr. NarayanKripa Sundararajan (Kripa Sundar) and Dr. Olusola Adesope published a meta-analysis that crunched the data of 68 different experiments.

What did they learn?

Many Questions, Helpful Answers

Given so many studies to examine, these researchers had lots of ways to parse the data:

Does it matter if the “seductive detail” is a photo or a video or an audio recording?

Does it matter how researchers measure ultimate learning?

Does it matter if the “seductive detail” comes at the beginning, middle, or end of the lesson?

Does it matter if the students had some prior knowledge of the material?

With so many variables (and lots more), Sundararajan and Adesope have LOTS of conclusions to report. Rather than list them all, I’ll highlight a few that struck me as most important.

First: seductive details matter. They do, in fact, interfere with learning. Depending on which variable they studied, the researchers found different effect sizes. But, quite consistently, additional “seductive” information ended up lowering final measurements of learning.

For video and audio. If the details were at the beginning, in the middle, or at the end. For novices and experienced students. Etc…

Second: the length of the lesson matters. Specifically, seductive details have a considerable effect in short lessons (less than 5 minutes), but no statistically significant effect on longer ones (more than 10 minutes).

Practically speaking, I think this means that typical classroom lessons (which very rarely last less than 5 minutes) won’t suffer terribly from the inclusion of seductive details.

But — and this is an important exception — our current climate of online teaching might well prompt us to create brief lessons. In such lessons, seductive details will be much more distracting.

Third: the meta-analysis suggests that seductive details create bigger problems for novices than for experts — or even for students who have some baseline knowledge of the topic. So, as Dr. Sundararajan wrote to me, “perhaps it’s not a good idea to use [seductive details] when introducing new content but perhaps not too bad in a review.”

What Should Teachers Do Now?

To start off, in the researchers’ words: “educators should minimize the use of seductive details in their instruction.”

That advice holds true especially if those details might…

… distract students from the essential ideas under discussion,

… remind students of prior misconceptions that you want to override,

… take up lots of space (say, a large diagram on the page),

… take place in a relatively short (online?) lesson.

Another strategy: rework those seductive details so that they DO connect DIRECTLY to the learning goal. So, rather than simply focus the Vesuvius story on the destruction it wrought, instead talk about Roman conceptions of the causes of volcanoes. Those mythical explanations aren’t literally true, but they point the way toward — and might even align with — the scientific content you want to cover.

Yet another strategy: don’t sweat too much. Seductive details don’t permanently destroy all possibility of understanding. If used the wrong way, yes, they can get in the way. But — as all teachers know — sometimes students need a lively distraction to perk them up. As long as we use those details sparingly and thoughtfully, we don’t have to panic about occasional side-tracks.

By the way: I’m not the only one who thinks this. When I emailed Dr. Sundararajan with questions about the meta-analysis, part of her answer was:

“I think it is also important for teachers to remember that the effect is fairly small and not to feel more guilt than they need to – avoid seductive details when possible.

If your kids need a laugh, bring [seductive details] in and be aware that you might want to revisit that content with other learning strategies to help reinforce the learning (e.g. retrieval practice, or note-taking).

Along the same lines, keep in mind that kids are kids and sometimes you just have to let them process the distraction and restart.”

In other words: seductive details matter, but other things matter too. As long as we keep this research in mind as we make our teaching decisions, we’re welcome to talk about Super Deadly Volcanoes from time to time.


By the way: Dr. Sundararajan has expressed an interest in working with teachers on questions and materials. You can reach her at her website, and find her on twitter: @kripasundar.

Beyond “Tricks-n-Tips”: What does Cog Sci Tell Us About Online Learning?
Andrew Watson

In our early scramble to get teaching online, it’s easy to focus on the immediately practical: how to auto-mute on Zoom, how to use Dropbox links, how to find the best online resources.

This emphasis on tricks and tips makes good sense in the short term.

Once we’ve gotten a few days’ experience in this new teaching world, we can take a mental step back and ask about the bigger learning picture.

What can cognitive science tell us about teaching and learning online?

As is so often the case, the answer to that question boils down to these words: “don’t just do this thing. Instead, think this way.”

In other words: research can give lots of very specific advice. But it’s probably most useful when it suggests broad, flexible principles that teachers can adapt to our own specific circumstances.

One Place to Start

Regular readers know that working memory is essential for learning. It allows us to hold and combine ideas, bits of information, mental processes, and so forth.

When we successfully hold and combine — and practice doing so the right way — that’s when learning happens.

Alas, we don’t have much working memory.

This CRUCIAL bottleneck dooms many worthy teaching endeavors. But, if we manage it well, we show real expertise in our craft.

So, if the question is:

“what can cognitive science tell us about online learning?”

one answer is:

“As much as we can, we should recognize and mitigate the working memory demands of this new learning world.”

In other words: students are using working memory not only to learn our content, but also to manage the novel physical and mental space in which this learning should happen. As much as feasible, we should help.

A Simple Example

Over on Twitter, I’ve been following David Weston (@informed_edu) for practical information about online teaching. (Some of those “tricks and tips.”)

For instance, he recently posted a video showing how teachers can show a PowerPoint presentation over Zoom.

For some of us, that’s immensely helpful information.

At the same time — depending on your prior knowledge — this video might require lots of heavy lifting in working memory.

You’ve got to use ALT+TAB (if you’re using a PC) or COMMAND+TAB (if you’re using a MAC). You’ve got to navigate one arrangement of buttons for PowerPoint, and a quite different arrangement for Zoom. You’ve got to determine whether or not you have to switch back-n-forth during the presentation to advance the PowerPoint slides.

If you know from PowerPoint and Zoom, then this combination of steps is probably quite easy to manage.

If, however, you’re a newbie to either, then you might struggle to process all those steps effectively. You’ll probably have to rewatch parts of the video. You’ll probably make several mistakes. You’ll probably get frustrated before you finally figure it out.

And — here’s my key point — our students are probably experiencing similar frustrations. They’re figuring out new systems. They’re adapting old learning models to new (bizarre) circumstances.

All that working memory stress comes on top of the working memory stress that learning always requires.

And so my advice is not “do this thing” (“Here’s how you can solve this problem…”). Instead, cognitive science encourages us to “think this way.”

We should develop the new mental habit of asking: how does this particular learning arrangement increase working memory load for me and my students? And, what can I do to fix the problem?

Two Important Points

First: almost certainly, the solutions to these working memory problems will be to…

… choose to slow down and practice the new/unfamiliar steps,

… use your teacherly instincts,

… be patient with your students and yourself.

That advice isn’t super specific. But: it’s really flexible. And, given what we know about working memory, it really will help.

Second: I’ve used Weston’s video as an example of potential working memory overload NOT because it’s badly done. Instead, Weston has created a video that will help most people; and, it will help even more if we pause to recognize its working memory demands.

That is: if technology just isn’t your thing, if you’ve never Zoomed before, if you’re not sure whether you have a PC or a Mac, assume that you’ll need to reduce working memory demands in one part of your teaching world to create some working memory headroom to deal with the technology.

That’s hard to do. But: it’s MUCH easier to do if we proactively think this way than if we try to solve working memory problems as they occur.

Cognitive science tells us that our brains work that way. We can use that knowledge to make online teaching and learning the best it can be.

Beyond the Mouse: Pointing in Online Learning [Repost]
Andrew Watson

As teachers across the country prepare to move our work online, I’ve been looking over previous posts that might offer practical guidance.

This post — from July of last year — asks a simple question: in online teaching, does pointing matter?

Happily, research by Richard Mayer points us in a useful direction.


You know, of course, that the right kind of movement can help students learn. The nascent field of “embodied cognition” works to explore the strategies that work most effectively.

Here’s a collection of resources.

And, here’s a recent blog post about kindergarteners moving to learn the number line.

You also know that online learners easily get distracted, often because they multitask. (I say “they” because you and I would never do such things.)

This recent post shows that even folding laundry — a harmless-seeming activity — reduces online learning.

What happens when we put these two research pools together?

Specifically: can movement reduce distraction, and increase learning, for online learners?

Benefits of Online Pointing?

Several researchers — including the estimable Richard Mayer — wanted to answer that question.

Specifically, they wanted to know: do pointing gestures made by the teacher help online students learn?

They had students watch an online lecture (about “neural transmission,” naturally).

For the first group of students, the teacher pointed at specific places on relevant diagrams.

For the second group, the teacher pointed generally toward the diagrams (but not at specific parts of them).

For the third, the teacher moved his hands about, without pointing specifically.

For the fourth, the teacher didn’t move his hands.

Do different pointing strategies help or hurt?

Benefits Indeed

Sure enough, pointing matters.

Students in the first group spent more time looking at the relevant parts of the diagrams.

They did better on a test that day.

And — most important — they did better than the other groups on a test a week later.

Now: a week isn’t exactly learning. We want our students to remember facts and concepts for months. (Preferably, forever.)

But, the fact that the memories had lasted a week suggests it’s MUCH likelier they’ll last longer still.

Practical Implications

If your classroom life includes online teaching, or teaching with videos, try to include specific pointing gestures to focus students on relevant information. At least with this student population, such gestures really helped.

By the way, this study doesn’t answer an interesting and important question: “does student movement as they watch online lectures help or hurt their learning?”

We know from the study cited above that irrelevant movement (like folding laundry) doesn’t help. But: should students mirror your gestures as they watch videos? Should you give them particular gestures to emulate?

We don’t know yet…but I hope future research helps us find an answer.

Overcoming Potential Perils of Online Learning [Repost]
Andrew Watson

In June of 2019, I wrote about Dr. Rachael Blasiman’s research into the effect of typical distractions on online learning.

Given the current health climate, I thought her work might be especially helpful right now.

The key take-aways here:

First: (unsurprisingly) distractions interfere with online learning, and

Second: (crucially) we can do something about that.

In brief, we should start our online classes by teaching students how to learn online…

Here’s the post from June.


Online learning offers many tempting — almost irresistible — possibilities. Almost anyone can study almost anything from almost anywhere.

What’s not to love?

A tough-minded response to that optimistic question might be:

“Yes, anyone can study anything, but will they learn it?”

More precisely: “will they learn it roughly as well as they do in person?”

If the answer to that question is “no,” then it doesn’t really matter that they undertook all that study.

Rachael Blasiman and her team wanted to know if common at-home distractions interfere with online learning.

So: can I learn online while…

…watching a nature documentary?

…texting a friend?

…folding laundry?

…playing a video game?

…watching The Princess Bride?

Helpful Study, Helpful Answers

To answer this important and practical question, Blasiman’s team first had students watch an online lecture undistracted. They took a test on that lecture, to see how much they typically learn online with undivided attention.

Team Blasiman then had students watch 2 more online lectures, each one with a distractor present.

Some students had a casual conversation while watching. Others played a simple video game. And, yes, others watched a fencing scene from Princess Bride.

Did these distractions influence their ability to learn?

On average, these distractions lowered test scores by 25 percentage points.

That is: undistracted students averaged an 87% on post-video quizzes. Distracted students averaged a 62%.

Conversation and The Princess Bride were most distracting (they lowered scores by ~30 points). The nature video was least distracting — but it still lowered scores by 15 points.

In case you’re wondering: men and women were equally muddled by these distractions.

Teaching Implications

In this case, knowledge may well help us win the battle.

Blasiman & Co. sensibly recommend that teachers share this study with their students, to emphasize the importance of working in a distraction-free environment.

And, they encourage students to make concrete plans to create — and to work in — those environments.

(This post, on “implementation intentions,” offers highly effective ways to encourage students to do so.)

I also think it’s helpful to think about this study in reverse. The BAD news is that distractions clearly hinder learning.

The GOOD news: in a distraction-free environment, students can indeed start to learn a good deal of information.

(Researchers didn’t measure how much they remembered a week or a month later, so we don’t know for sure. But: we’ve got confidence they had some initial success in encoding information.)

In other words: online classes might not be a panacea. But, under the right conditions, they might indeed benefit students who would not otherwise have an opportunity to learn.


I’ve just learned that both of Dr. Blasiman’s co-authors on this study were undergraduates at the time they did the work. That’s quite unusual in research world, and very admirable! [6-11-19]

Does Teaching HANDWRITING Help Students READ?
Andrew Watson

I recently saw a newspaper headline suggesting that teaching students HANDWRITING ultimately improves their READING ability.

As an English teacher, I was intrigued by that claim.

As a skeptic, I was … well … skeptical.

In this case, we have two good reasons to be skeptical. First, we should always be skeptical. Second, claims of transfer rarely hold up.

What is “transfer”?

Well, if you teach me calculus, then it’s likely I’ll get better at calculus. If you teach me to play the violin, it’s likely I’ll get better at playing the violin. But: if you teach me to play the violin, it’s NOT likely that this skill will transfer to another skill — like calculus. (And, no: music training in youth doesn’t reliably improve math ability later in life.)

In fact, most claims of transfer — “teaching you X makes you better at distantly-related-thing A” — end up being untrue.

So, is it true — as this newspaper headline implied — that handwriting skills transfer to reading skills?

The Research

This newspaper article pointed to research by Dr. Anabela Malpique, working in Western Australia.

Her research team worked with 154 6-7 year-olds around Perth. They measured all sorts of variables, including…

…the students’ handwriting automaticity (how fluently they can write individual letters),

…their reading skills (how accurately they read individual words),

…the amount of time the teachers reported spending in reading/writing instruction.

And, they measured handwriting automaticity and reading skills at the beginning and end of the year. For that reason, they could look for relationships among their variables over time. (As you can see, Malpique’s research focuses on many topics — not just the writing/reading question that I’m discussing in this post.)

Tentative Conclusions

To their surprise, Malpique’s team found that more fluent letter formation at the beginning of the year predicted more fluent word reading at the end of the year. In their words, this finding

suggest[s] that being able to write letters quickly and effortlessly in kindergarten facilitates pre-reading and decoding skills one year later.

In other words: this research allows the possibility that teaching writing does ultimately help students read single words.

However — and this is a big however — the researchers’ methodology does NOT allow for causal conclusions. They see a mathematical “relationship” between two things, but they cannot conclude that the writing ability caused the later reading ability.

They warn:

Experimental research is needed to confirm these findings[,] and systematically evaluate potential explanatory mechanism[s] of writing-to-reading effects over time in the early years.

They specifically note that they did NOT measure reading comprehension; they measured single word reading.

To put this in other words: we would like to know if

a) teaching letter writing leads to

b) improved letter writing fluency, which leads to

c) improved single word reading, which leads to

d) improved reading comprehension.

These findings make the b) to c) connection more plausible, but they certainly do not “prove” that a) leads to d).

Classroom Implications

This research doesn’t claim we should make big changes right away.

I do think it leads to this conclusion:

Some schools are replacing books with computers and tablets. I can imagine (although I haven’t heard this) that advocates might make this claim:

“In the future, no one will need to write by hand. Everything will be keyboarding, and so we need to get children typing as soon as possible. Let’s replace handwriting instruction with keyboarding instruction, to prepare our kids for the future!”

If we hear that argument, we can say:

“I have LOTS of objections to that logical chain. In particular, we have tentative reasons to believe that handwriting instruction improves reading. If that’s true — and we don’t yet know — we should be VERY wary of doing anything that slows our students’ ability to read. We might not be handwriting so much in the future, but we’ll be reading forever.”

In sum: I don’t think that newspaper article captured essential nuances. However, this research raises the intriguing possibility that transfer just might take place from writing instruction to single-word reading. We need more research to know with greater certainty.

But, given the importance of reading for school and life, we should be excited to find anything that can help students do better.

The Big Six: A Grand Summary
Andrew Watson

Much of the time, this blog digs into a specific example of a specific teaching practice.

Within the last two weeks, I’ve written about spacing and interleaving in math instruction, a “big challenging book” strategy for struggling readers, and the potential benefits of cold calling.

At times, however, it’s helpful to zoom the camera back and look at THE BIG PICTURE.

What does cognitive science tell us about learning?

Today’s Grand Summary

Regular readers know that The Learning Scientists do a GREAT job explaining…well…the science of learning.

In particular, they focus on “six strategies of effective learning”:

Spacing

Interleaving

Retrieval Practice

Concrete Examples

Elaboration

Dual Coding

In a recent post, Dr. Megan Sumeracki does a typically helpful job giving a thoughtful overview of those strategies. Rather than summarize her summary, I’m encouraging you to give her post a quick read. It will help put the pieces together for you.

Wise Caveats

Sumeracki introduces her summary with this helpful note:

Before digging into the specifics of each strategy, it is important to note that they are very flexible. This is a good thing, in that it means they can be used in a lot of different situations.

However, this also means that there really isn’t a specific prescription we can provide that will “always work.”

Instead, understanding the strategies and how they work can help instructors and students. [Emphasis added.]

In other words — as you often read on this blog — “don’t just do this thing; instead, think this way.”

Cognitive science really cannot provide a script for teachers to read verbatim. Instead, it offers principles that we must adapt to our own specific classrooms and students.

So, if you increase spacing and retrieval practice, your students will — almost certainly — remember more over the long term. But: exactly how to do that will differ from classroom to classroom, grade to grade, culture to culture.

In other words: teachers should draw on scientific understanding of minds and brains to shape our work. But: teaching itself isn’t a science. It’s a craft, a passion, a profession.