Why Time is a Teacher’s Greatest Commodity…and What to Do When You Don’t Have Enough of It
Guest Post

Today’s guest post is by Jim Heal, Director of New Initiatives, and Rebekah Berlin, Senior Program Director at Deans for Impact.

Long-time readers know how much I respect the work that Deans for Impact does. Their resources — clear, brief, research-informed, bracingly practical — offer insight and guidance in this ever-evolving field.


Ask any teacher to name a rare commodity in their profession and there’s a good chance they will reply with the word: “Time.” Whether it’s time to plan, grade, or even catch one’s breath in the midst of a busy school day, time matters.

Time matters most, perhaps, when it is spent focusing on the material you want students to learn. So, how do you ensure that you're making the most of the time you have with students, and that they're making the most of the way you structure their time?

Water Is Life

To answer this, let’s consider the following scenario. You’re a 7th Grade ELA teacher teaching a lesson on ‘Water is Life’ – a nonfiction text by Barbara Kingsolver. One of the objectives for this lesson is: Analyze the development of ideas over the course of a text.

You know from reading the teacher’s guide that student success will require them to compare two parts of the reading: a section describing a lush setting with an abundance of water and another describing an arid setting where rain hardly ever falls. Comparing the two will allow students to explore one of the main ideas of the text: The environmental role played by water and water sustainability.

Here is the section of the lesson[1] designed to address these aims. Take a moment to read it and consider when students are being asked to think deeply about comparing the two settings:

You arrive at school on the morning you’re due to teach this content, and there’s an unexpected announcement for students to attend club photo sessions for the yearbook during your lesson.

Big Changes, Little Time

At this point you realize that, by the time your class gets back together, you'll need to cut ten minutes from this part of the lesson. Now you have a choice to make:

If you only had twenty minutes to teach the thirty minutes of content you had planned for, how would you adapt your plan so that the most important parts of the lesson remained intact?

Let’s begin addressing this challenge with a couple of simple truths:

First: The harder and deeper we think about something, the more durable the memory will be. This means that we need students to think effortfully about the most important content in any lesson if we want it to stick.

Second: If you treat everything in the lesson as equally valuable and try to squeeze it all into less time, students are unlikely to engage in the deep thinking they need to remember the important content later.

Therefore, something’s got to give.

To help determine what goes and stays, you’re going to need to differentiate between three types of instructional tasks that can feature in any given lesson plan.

Effortful Tasks

Tasks and prompts that invite students to think hard and deep about the core content for that lesson.

In the case of ‘Water is Life’, a quick review of the plan tells us the effortful question (i.e., the part that directs students to the core knowledge they will need to think deeply about) doesn’t come until the end of the allotted thirty-minute period.

This question is this lesson’s equivalent of the ‘Aha!’ moment in which students are expected to “analyze the development of ideas over the course of the text” (the lesson objective) by exploring the way the author uses juxtaposition across the two settings.

If you reacted to the shortened lesson time by simply sticking to the first twenty minutes’ worth of content, the opportunity for students to engage in the most meaningful part of the lesson would be lost. It’s therefore crucial to ask what is most essential for student learning in each case and ensure that those parts are prioritized.

Essential Tasks

Foundational tasks and prompts that scaffold students to be able to engage with the effortful questions that follow.

Just because effortful thinking about core content is the goal, that doesn’t mean you should make a beeline for the richest part of the lesson without helping students build the essential understanding they will need in order to engage with it effortfully.

In the case of ‘Water is Life’, even though some of the tasks aren’t necessarily effortful, they are an essential stair-step that allows students to access effortful thinking opportunities.

For example, consider the moment in the lesson immediately prior to the effortful thinking prompt we just identified:

As you can see, even though we want students to go on and address the effortful task of juxtaposing the language in each of the two settings, that step won’t be possible unless they have a good understanding of the settings themselves. This part might not be effortful, but it is essential.

In this example, it isn’t essential that students share their understanding of each setting as stated in the plan, but it is essential that they do this thinking before taking on a complex question about juxtaposed settings. In other words, the instructional strategy used isn’t essential, but the thinking students do is.

Armed with this understanding, you can now shave some time off the edges of the lesson, while keeping its core intentions intact. For instance, in a time crunch, instead of having groups work on both questions the teacher could model the first paragraph and have students complete the second independently.

Strategies like these would ensure students engage more efficiently in the essential tasks – all of which means more time and attention can be paid to the effortful task that comes later on.

Non-Effortful, Non-Essential Tasks

Lower-priority tasks and prompts that focus on tangential aspects of the core content.

Lastly, there are those parts that would be nice to have if time and student attention weren’t at a premium – but they’re not effortful or essential in realizing the goals of the lesson.

If your lesson plan is an example of high-quality instructional materials (as is the case with ‘Water is Life’) you’ll be less likely to encounter these kinds of non-essential lesson components. Nevertheless, even when the lesson plan tells you that a certain section should take 30 minutes, it won’t tell you how to allocate and prioritize that time.

This is why it’s so important to identify any distractions from the ‘main event’ of the lesson. Because effortful questions are just that: they are hard and students will need more time to grapple with their answers and to revise and refine their thinking – all of which can be undermined by non-essential prompts.

For instance, it might be tempting to ask…

…“What was your favorite part of the two passages?”

…“What does water sustainability mean to you?”

…“Has anyone ever been to a particularly wet or dry place? What was it like?”

These might seem engaging – and by one definition of the term, they are – but they don’t engage students with the material you want them to learn. For that reason alone, it’s important to steer clear of adding questions not directly related to your learning target in a lesson where you’re already having to make difficult choices about what to prioritize and why.

Three Key Steps

It’s worth noting that, even though our example scenario started with a surprise announcement, this phenomenon doesn’t only play out when lesson time gets unexpectedly cut. These kinds of decisions can happen when you know your students will need more time to take on an effortful question than the curriculum calls for, or even when lesson time is simply slipping away faster than you had anticipated. In either case, you would need to adjust the pacing of the lesson to accommodate the change, and bound up within that would be the prioritization of its most important parts.

There are steps you can take to ensure the time you have becomes all the time you need. Here are three such strategies, informed by Deans for Impact’s work supporting novice and early-career teachers:

Identify the effortful tasks – aka the opportunities for effortful thinking about core content within the lesson. These effortful ‘Aha!’ moments can appear towards the end of the lesson, so don’t assume that you can trim content ‘from the bottom up’ since that could result in doing away with the most important parts for student learning.

Determine which are the essential tasks – aka the foundational scaffolds students will need in order to engage with those effortful thinking opportunities. These stepping stone tasks will often deal with the knowledge and materials students need to engage in the effortful part of the lesson. Even though they can’t be removed, they can be amended. If in doubt, concentrate on the thinking students need to do rather than the surface features of the instructional strategy.

Trim those parts of the lesson that don’t prompt effortful thinking or the foundational knowledge required to engage in it. This means that anything NOT mentioned in the previous two strategies is fair game for shrinking, trimming or doing away with altogether. Ask yourself whether this part of the lesson is instrumental in getting students to engage deeply with the content you want them to take away.

So, even if lesson time always feels like it’s running away (which it often is!) there are steps we can take to ensure teachers (and subsequently students) make the most of it.


Jim Heal is Director of New Initiatives at Deans for Impact and author of ‘How Teaching Happens’. He received his master’s in School Leadership and doctorate in Education Leadership from the Harvard Graduate School of Education.

Rebekah Berlin is Senior Director of Program at Deans for Impact. She received her Ph.D. in teaching quality and teacher education from the University of Virginia.

If you’d like to learn more about the work of Deans for Impact, you can get involved here.


[1] “Grade 7: Module 4B: Unit 1: Lesson 1” by EngageNY. Licensed under CC BY-NC-SA 3.0.

 

A Little Help, Please…
Andrew Watson

I’ve got a problem, and I’m hoping you can help me.

Here’s the situation…

I work as a high school English teacher. And I’m also a consultant – presenting psychology and neuroscience research for teachers and students and parents.

In that consulting work, I often face this problem: teachers/parents/students believe – quite confidently – in some brain myth or another.

For instance:

When I talk with teachers about managing working memory load, I regularly get this question:

“Can we reduce working memory overload by aligning instruction with students’ learning style?”

When I talk about research into attention and distraction, I often hear this rejoinder:

“Yes, but: all the research shows that an enriched environment enhances learning.”

A discussion about student motivation often defaults to this baseline:

“Around here we remind students to have a growth mindset. That will get the job done.”

A comment about note-taking strategies prompts this response:

“Of course, we know from research that handwritten notes result in more learning than laptop notes.”

In these moments, how should I – the “outside expert” – respond?

We’ve Got Two Hands

On the one hand, I should – obviously – let them know they’re wrong.

First, because they are wrong (as far as research currently shows).

No: learning styles theories have not held up over time. We just don’t have good evidence to support them.

No: ‘enriched environment’ research doesn’t apply to schools. (It was initially done with rats; lots of research suggests that busy classrooms distract from learning. I tell this story in a recent book.)

No: mindset theory is not a slam dunk. This topic churns up lots of controversy, but my own view is…

…we’ve seen enough positive results to think something is going on there,

…and enough negative results to know we don’t have a good handle on the specifics yet.

And

No: the handwriting vs. laptop debate is nowhere near settled.

The second reason to highlight these errors: we don’t want their colleagues to believe these myths.

If I don’t contradict these false beliefs right away, they can easily propagate.

These two truths, however, face an ugly “on the other hand.”

On the Other Hand

When I speak up to contradict these myths, I’m pursuing two goals:

Change the mind of the person who made the comment, and

Encourage other listeners to adopt correct beliefs.

Here’s my awkward question:

does contradicting brain myths directly actually accomplish those goals?

Imagine I say:

“I’m so glad you’ve brought up learning styles. It turns out that the research just hasn’t supported this theory.”

Will the teachers who made those comments in fact change their minds?

Will others around them believe me?

Honestly, I’m not so sure…

A Compelling Backstory

Let’s ask this surprising question: why do people believe in learning styles?

Why do they believe that elaborate classroom decoration enhances learning, or that handwritten notes rule? Why do laptop notes receive so much confident hatred?

Almost certainly, teachers believe in these myths because some other consultant told them that “research says so.”

Or, they heard these myths at a conference touting “brain science!”

That is: teachers don’t believe these myths because they reject research. Teachers believe them because they embrace research.

In many cases, I suspect, they first heard that information at a PD day organized by their principal or district. In other words: they were once professionally expected to believe this myth.

Teachers are not, for the most part, stubborn flat-earth Luddites. Instead, they have used these (seemingly) research-based strategies for years. Those strategies might even seem to help.

Why, then, should they change those beliefs? Just because some new guy (me) shows up and says “today’s research shows…”?

The Big Question

So, here’s my problem.

I really must correct brain myths.

And, I’m really unsure that “correcting brain myths” directly will work.

For the last few years, I’ve adopted a 3-step strategy in this situation:

First: I don’t contradict in public. Embarrassing people rarely inspires them to change their opinions.

Instead, I offer strong, research-based alternatives. (“Rather than focus on learning styles to reduce working memory load, I would …”)

Second: I ask that teacher curious questions in a one-on-one conversation:

“Where did you first hear about learning styles? Which version have you tried? What research have you explored? Have you looked at recent studies?”

Once rapport develops, I’ll mention that more current research hasn’t supported the learning styles hypothesis. I might even offer to send links and share resources.

Third: I include school leadership. Most principals and leadership teams I’ve worked with know common neuromyths, and want to root them out.

In-school leaders know better than I the best places to intervene: perhaps a departmental conversation, or a future faculty meeting. That is: they know how to spread the word widely without singling out and embarrassing any one teacher.

I wish I were sure these methods always work. But I simply don’t know.

And so, here are my questions to you:

What approach would be most effective with your colleagues?

What approach would be most effective with you?

If, for instance, you feel entirely certain that handwritten notes work better than laptop notes, what could I say to influence your thinking?

Would it, in fact, help to contradict you at that moment, in front of your peers? (“Let me explain why that study is so obviously flawed…”)

Did the research-based link above open new avenues for your thinking?

Would you rather have a one-on-one conversation about that research?

Honestly, I’m open for suggestions!

TL;DR

We really must correct brain myths in education. And, I’m really unsure about the best way to do so.

I’m hoping that you’ve got helpful suggestions…

Does Higher Engagement Promote Learning?
Andrew Watson

Long-time readers know: I thoroughly enjoy research that challenges my beliefs.

After all, I (probably) have lots to learn when a study makes me think anew.

In this case — even better! — I’ve found a study that (I suspect) challenges almost everybody’s beliefs.

Here’s the story…

The “Active vs. Passive” Debate

Education scholars often fiercely advocate for “active learning.”

This phrase serves as a catchy shorthand for several educational beliefs and practices.

People who champion a “constructivist” approach to schools, or embrace project pedagogies, or advocate student “voice and choice” often describe their approach this way.

And, they often point out one crucial benefit to active learning: student “engagement.” Students who shape their own learning feel invested in and energized by their efforts.

Other scholars zealously dispute this account.

Although their approach has been dismissed as merely “passive learning,” they often prefer phrases such as “direct instruction” to explain their views.

In this view of learning, limitations on working memory prevent novices from tackling overly complex problems.

Students benefit from highly structured pedagogy, in which expert teachers help students build mental models (“schema”) and thereby achieve their own expertise.

For champions of direct instruction, “engagement” might look good (“the students are all so active!”), but doesn’t necessarily result in actual learning. (Why? Because students might well experience working memory overload….)

If you attended our conference in San Francisco at the beginning of February, you heard speakers embrace both sides of this debate.

This Does Not Compute

A study published in 2019 splendidly complicates this tidy summary.

A research team led by Dr. Louis Deslauriers ran a straightforward experiment.

Researchers worked with two groups of students enrolled in an introductory physics class at Harvard.

The first group studied topic A in an “active learning” paradigm, and topic B with a “passive lecture.”

The second group switched that order: topic A was “passive lecture,” and topic B was “active learning.”

The research team found a surprising set of results.

Students learned more from the “active learning” classes, but enjoyed (and thought they learned more from) the “passive lecture.”

Paradoxically, passive learning enhanced engagement but reduced understanding. Active learning enhanced learning but reduced engagement.

Almost everyone will find that combination of results surprising, even disappointing.

Puzzle #1 (with a potential explanation)

Members of Team Active Learning, I suspect, predicted that the students would learn more when their professors followed that approach. Voilà: they did.

And (almost certainly) teachers on that team predicted that active learning would result in higher engagement. Yet — as measured in this study — it didn’t.

Students clearly preferred the “passive lecture.”

For instance, survey results show that students wanted other physics courses to be taught with passive lecture/direct instruction.

 

The researchers have a hypothesis explaining this puzzling result. They wonder if the additional cognitive challenge created by active learning resulted in “desirable difficulty.”

That is: the students had to think harder — a challenge they didn’t really enjoy.

And this extra thought resulted in more learning. (You can watch a short video here to learn more about this hypothesis.)

Puzzle #2 (with another potential explanation)

Members of Team Direct Instruction, no doubt, are delighted that students preferred the (misnamed) “passive lecture.” According to the survey results, students felt they learned more from it than from the “active learning.”

And yet, Direct Instruction advocates no doubt feel genuine puzzlement that their preferred approach resulted in less learning. How could that be?

 

I myself have a hypothesis explaining this puzzling finding.

Contrary to many stereotypes, direct instruction advocates do NOT champion uninterrupted lecture.

Instead, they suggest that teachers start with straightforward explanation of core concepts.

Once those have been presented clearly, then students should do substantial independent mental work with those ideas.

In other words, advocates of direct instruction heatedly reject the label “passive learning.” Students do plenty of active cognitive work after they get the benefit of initial priming from instructors.

And yet, in this study, students in the passive learning group had to, in the researchers’ words, “adjust to a complete elimination of any active engagement” — such as “demonstrations, … interactive quizzes, or conceptual questions.”

NO educational thinker feels surprise that students learn less in the total absence of active engagement.

That’s not “direct instruction.” That’s … well … that’s a very bad idea. (To be clear: a very bad idea that happens all too frequently.)

A (Potential) Resolution

Because the “passive learning” condition subjected the students to pure lecture, this study seems much less surprising (to me).

With “passive learning,”

Students learned LESS from uninterrupted lecture. (Why? They didn’t do any independent mental work with the material.)

Because the professor’s explanation made sense, on the other hand, they FELT they understood the material better.

With “active learning,”

Students learned MORE, because they interacted with the concepts and problems individually.

Alas, they FELT they understood less because they experienced the “difficult” half of “desirable difficulties.”

In other words: the study results seem confusing because the labels don’t mean what we thought they meant.

Until we know EXACTLY what happened in both “passive” and “active” learning, we can’t really judge how well those phrases align with our preconceptions — and with our own teaching practices.

One more thought

If a particular diet benefits, say, professional athletes, will it benefit me?

I’m going to be honest: I’m not a professional athlete.

A diet that benefits their level of physical fitness, metabolism, professional goals, etc., might not be healthy for me. (In his swimming prime, Michael Phelps ate 8,000–10,000 calories a day. I suspect my doctor would discourage me from doing so.)

If Harvard even remotely lives up to its reputation, then students in Harvard physics classes understand an impressive amount of science. They have a great deal of motivation to learn more about science. They’ve been impressively successful in academic pursuits.

If a teaching method works with Harvard physics students, will it work with my 10th grade English students? Will it work with your 2nd graders? Maybe … but also, maybe not.

In general: I’m hesitant to apply research done at Harvard (or Stanford, or Oxford, or the US Naval Academy…) to most K-12 learning.

It’s entirely possible that the method “works” not because of the method, but because of the extraordinary background of the students who participate in it.

TL;DR

Before we embrace research on “active learning” or “direct instruction,” we should know…

… EXACTLY what those labels mean in the research, and

… the GOODNESS OF FIT between those research participants and our students.

Dan Willingham has wisely written: “one study is just one study, folks.”

The Downsides of Desirable Difficulties
Andrew Watson

For several years now, we’ve been talking about the benefits of “desirable difficulties.”

For instance, we know that spreading practice out over time helps students learn more than does doing all the practice at once.

Why? Because that schedule creates greater mental challenges. Our students must think harder.

In other words: “spacing” creates “desirable difficulty.”

Likewise, we know that jumbling many topics together during practice helps students learn more than does practicing only one thing at a time.

Why? Students face greater cognitive challenges as they try to figure out which strategy to use or topic to notice.

Desirable difficulty.

And: requiring students to use retrieval practice helps them lots more than simple review.

Yup, you guessed it: d_______ d__________.

A theory that is simultaneously counter-intuitive and common-sense. What’s not to love?

Not So Desirable

I’ll tell you what’s not to love: the big silence.

The phrase “desirable difficulty” implies, obviously, that our students might face UNdesirable difficulties.

And yet, very few people ever discuss — much less research — this topic.

So, what exactly would an undesirable difficulty be? How can I predict or spot them?

I discuss this question with teachers quite often, and I have two sets of suggestions.

The First Strategy

At a Learning and the Brain conference a few years ago, Dr. Robert Bjork (who coined the phrase “desirable difficulty” with his wife, Dr. Elizabeth Ligon Bjork) explained that DDs have two core features.

First: they require students to think harder about the material.

Second: despite the difficulties, students ultimately succeed.

By implication, difficulties that don’t meet those criteria aren’t desirable.

For instance, I’ve just assigned a final project on Macbeth to my sophomores: they must think about the play, create a new something (a set design, a costume plot, a new scene, etc.), and then explain their thinking.

I’ve warned my students quite strictly: they may use technology, but they should NOT get carried away with all the cool tech possibilities at hand.

If they know how to edit videos and want to shoot a scene, that’s fine. But they should not simply throw in 1001 cool editing effects. Those edits would make them think harder, perhaps, but not think harder about the play.

The work would be difficult, but not desirably difficult.

So, too, I might ask my students to write a sentence that uses noun clauses both as the subject of the verb and as an appositive, and that also uses an introductory subordinate clause as an adverb.

In this case, my students would think harder (that’s Bjork’s first criterion), but they almost certainly wouldn’t succeed (Bjork’s second criterion).

Again: a difficulty, but not a desirable one.

In other words, we want to ramp up the difficulty — but not too far — without letting the focus subtly shift to another topic.

The Second Strategy

So, difficulties aren’t desirable if they don’t meet both of Bjork’s criteria.

Another way to recognize UNdesirable difficulties: MOST difficulties are undesirable.

So, I can make attention more challenging by — say — playing loud music while students read.

More difficult? Yes. Desirable? No.

I can vex my students’ working memory by giving them ten verbal instructions to remember and follow.

More difficult? Still yes. Desirable? Still no.

I could fiendishly reduce my students’ motivation by inculcating a fixed mindset.

You know the answer. That additional difficulty would in no way be desirable.

In other words, a few specific difficulties (spacing, interleaving, retrieval practice) can be desirable. Most others, however, simply are not.

TL;DR

Desirable difficulties — which require students to think harder before they succeed at their work — can foster deeper learning.

However, most classroom difficulties don’t meet that definition, and therefore aren’t desirable.

Whenever we champion desirable difficulties, we should be sure to mention and guard against the undesirable ones that imperil students’ learning.

Too Good to be True: When Research and Values Collide
Andrew Watson

Let’s start with some quick opinions:

Flipped classrooms…

… can transform education and foster students’ independence, or

… are often a waste of time, and at best just rename stuff we already do.

A growth mindset…

… allows students to learn and become anything, or

… is just an over-hyped fad with little research support.

Multiple-choice questions…

… help me see what my students already know (and help them learn), or

… reduce knowledge to trivia, and enforce an authoritarian view of learning.

It seems strange that our profession can contain such starkly contrasting beliefs about core practices.

But if your experience is like mine, you know that debates among teachers can quickly arrive at these extremes. (If you hang out on Twitter, you probably see these clashes at their fiercest.)

Resolving Conflict with Research (?)

When we come across such vehement debates, we might propose an obvious way to settle them: research.

If the science shows X, well then, we teachers should believe X. And, we should run our classes and our schools the X way.

Obviously.

Alas, this solution might not work as well as we would hope. A recent essay by Brendan Schuetze (Twitter handle: @BA_Schuetze) helps explain why.

As Schuetze outlines, Mindset Theory lives in a strange place in the world of education.

On the one hand: research suggests that specific growth-mindset strategies offer some students modest benefits under particular circumstances. (Better said: they sometimes or probably do.)

On the other hand: lots of teachers and school systems think that…well…a growth mindset means that “anyone who tries can succeed at anything.”

How can it be that researchers (often) have one view of an educational theory, and teachers (often) have such a dramatically different understanding of that same theory?

The Values We Hold Influence the Beliefs We Adopt

To answer this question, Schuetze focuses on “values-alignment.” That is: we (teachers specifically, people generally) are quick to endorse research that aligns with values we already hold.

If (and this is my example, not Schuetze’s) we value innovation and the transformative power of technology, we’re likelier to think that flipped classrooms will radically improve education.

We might even come across research supporting this value-aligned position.

If we value tradition and the transformative power of face-to-face conversation, we’re likelier to think that this flipped-classroom nonsense will fail quickly and spectacularly, and we’ll go back to the methods that have always worked.

We can easily discover research supporting this position as well.

In his essay, Schuetze takes the example of growth mindset.

In a well-sourced recap, Schuetze explains:

Teacher education programs tend to endorse transformative constructivist pedagogy (as opposed to more traditional pedagogy), where social justice and the socio-emotional needs of students are increasingly seen as legitimate educational concerns…

In line with this affective turn, teachers are encouraged to be concerned not only with intellectual development, but also with molding, inspiring, and caring for their students–or what might be summarized in one word as the “growth” of students.

Because teacher training programs encourage us to value students’ “growth” quite broadly, our profession tends to believe any research that holds up growth as an outcome.

And we might not ask hard questions before we embrace that belief.

More Concerns, Possible Solutions

In fact (I’m inferring this from Schuetze’s essay), we’re likelier to over-interpret the plausibility and effectiveness of that theory.

Imagine a modest, research-based suggestion aligns with our values:

Researchers say, “X might help these students a bit under these circumstances.”

We teachers hear, “X transforms students — it’s almost magic!”

In my experience — and here I’m going WAY beyond Schuetze’s essay — our hopeful beliefs then call up the very “evidence” we need to persuade ourselves:

Well-meaning teachers write hopeful books that extrapolate substantially from the research they cite.

Blog posts — in an effort to make complex research clear — gloss over the nuance and uncertainty that researchers typically highlight.

Edu-Tweeps with thousands of followers simplify complex ideas into 280 characters.

Suddenly, it seems “everybody believes” that “research shows” what we already value.

To face this problem, I think we need to combine several steps.

Too Good

In the first place, I think it helps to focus on Schuetze’s troubling insight. We might find, someday, that a teaching practice BOTH helps our students learn AND contradicts our values.

Perhaps flipped classrooms really do help students (for the most part), even though we value tradition and face-to-face pedagogy.

Or, to reverse the case,

Perhaps growth mindset strategies don’t really help, even though we value students’ overall growth above their narrow academic achievement.

In these cases, we should honestly accept the tension between research and values. If we act as if they align when they don’t, we won’t make decisions as effectively or thoughtfully as we should.

That is: we can quite appropriately say:

This intervention might not help students learn more. But it aligns with a core value in our community, so we’ll do it anyway.

In the second place, I think we should hone an odd kind of skepticism:

If a research-based teaching suggestion sounds deeply good — that is, if it aligns with our values — then we have an extra responsibility to assume it’s too good to be true.

Does “authenticity” sound good to you? You should BEWARE a pedagogical strategy called “authentic exploration.”

Does “mindfulness” sound uplifting? You should BEWARE mindfulness initiatives.

Have you (like me) always enjoyed being outdoors with the trees? You (like me) should BEWARE any teaching initiative with the words “woods” or “nature” or “wilderness” in the title.

Of course, when you warily undertake a review of the research literature, you just might find that it does in fact support this core value. (Quick: let’s all study in a forest!)

But we owe it to our profession and our students to admit: the values we hold dear might lead us to accept the next new thing too credulously.

I (hope I) value my students’ development too much to let that happen.

New Research: Unrestricted Movement Promotes (Some Kinds of) Creativity
Andrew Watson

Teachers like creativity.

We want our students to learn what has come before, certainly. And, we want them to do and think and imagine new things with that prior knowledge.

We want them, in ways big and small, to create. How can we best foster such creativity?

Over the years, I’ve often heard that walking outside promotes creativity.

Because I work at a summer camp, I’m in favor of almost anything that promotes being outside. Alas, it turns out, this data pool didn’t hold up very well.

Since that time, lots of folks have focused on the walking part of “walking outside.” Several studies do suggest that simply walking around ramps up creative output. (For instance, here.)

Can we be more specific than “walking around”? Do some kinds of walking boost creativity more than others?

Defining Creativity

Ironically, the study of creativity begins with mundane, even tedious, tasks: defining and measuring it.

Researchers often focus on two kinds of creativity.

First, my students might come up with something new and useful.

Researchers measure this flavor of creativity (“divergent”) in a fun way:

Think about, say, a brick. Now, list all the things you might do with a brick.

The answer “build a wall” doesn’t score very high, because almost everyone says “build a wall.”

The answer “raise the water level in my pool by throwing lots of bricks in” does score high, because — well — because nobody ever says that. This answer is new and (assuming you care about the water level in your pool) useful.

Second, my students might see hidden connections.

Researchers measure this flavor of creativity (“convergent”) in another fun way:

Think about these three words: cottage, swiss, and cake.

Can you think of a word that pairs with each of those to make a meaningful phrase? (Answer: “cheese.” As in, cottage cheese, etc.)

Researchers in Germany wanted to know what kind of walking might increase DIVERGENT creativity.

Here’s what they found…

It’s All About the Freedom

Researchers Supriya Murali and Barbara Händel asked participants to walk or to sit.

And, they asked them to do so in restricted or unrestricted ways.

Unrestricted walkers, for instance, could walk around a large room however they pleased. Restricted walkers had to walk back and forth down the middle of the room. (Notice: all this walking was inside.)

Unrestricted sitters sat in a solid chair (no wheels, no reclining features) with a view of the full room. Restricted sitters sat in the same chair, but with a computer screen in front of them. The “fixation cross” on the screen implied (if I understand this correctly) that the participants should remain focused on the screen.

What happened afterwards, when they took a test on divergent thinking?

Headlines:

Walkers scored higher on tests of divergent creativity than sitters.

Participants without restrictions (both walking and sitting) scored higher than their restricted peers.

Intriguingly, unrestricted movement seems to reduce restrictions on subsequent mental activity.

Classroom Implications

As I think about this research, it implies some happy, practical suggestions.

If we want our students to launch an explicitly creative assignment — start composing a poem, imagine an approach to studying a historical question, plan an environmentally-friendly city — we can give them an extra boost of physical freedom.

Walking outside might be good.

But if they can’t walk outside (that’s just not possible in many schools), then walking inside could be good.

Heck, if 25 students walking around in the classroom sounds like too much chaos, maybe they can choose a new place to sit for a while.

In other words: this research suggests that the actual movement (walking/sitting) matters, and that the relative degree of restriction also matters.

Even if students sit in solid chairs, their freedom to choose seats or move seats or sit cross-legged (or whatever) might jostle some creative energy in useful ways.

TL;DR

As long as we don’t make our claims too strong or grand, this research allows a sensible claim: “by reducing physical limitations for a while, we might help students expand their mental activity and creativity.” *


* I should note that the sample sizes in these three studies are quite small: 20, 17, and 23. Were these studies repeated with larger sample sizes (and/or in more classroom-like conditions), I’d be more confident and emphatic in drawing these conclusions.


Kuo, C. Y., & Yeh, Y. Y. (2016). Sensorimotor-conceptual integration in free walking enhances divergent thinking for young and older adults. Frontiers in Psychology, 7, 1580.

Murali, S., & Händel, B. (2022). Motor restrictions impair divergent thinking during walking and during sitting. Psychological Research, 1–14.

The First Three Steps
Andrew Watson

Early in January, The Times (of London) quoted author Kate Silverton (on Twitter: @KateSilverton) saying:

It’s the schools that have the strictest discipline that have the highest mental health problems.

Helpfully, they include a video recording of her saying it.

In context, Silverton is saying — in effect — that schools’ strict disciplinary policies damage students’ mental health.

If she’s right, school leaders should know that!

If we run schools with strict disciplinary policies, we should at least consider changing them. Obviously, we don’t want to cause mental health problems.

But … is she right?

This specific question leads to a broader question:

When someone says “research says you should change the way you run your school!,” what should we do next?

Accept the advice? Reject it? Flip a coin?

Let me suggest three simple steps.

Step 1: Ask for Sources

This advice seems too obvious to say out loud.

OF COURSE someone reading this blog would ask for sources.

However, in my experience, we’re very hesitant to do so. It seems — I don’t know — rude, or pushy, or presumptuous.

Especially when the research comes from psychology or neuroscience, we just don’t want to seem stubborn.

But, trust me, it’s always appropriate to ask for the research.

In this case, happily, lots (and lots) of people did ask Silverton for research.

This small niche of edutwitter lit up with people asking — quite simply — “what research suggests that strict school discipline damages mental health?” (To be clear, it also lit up with people praising Silverton for speaking out.)

Even more happily, she responded by citing 11 research studies.

Her transparency allows us to ask a second question:

Step 2: Does the Research, in fact, Support the Claim?

Here again, the question seems too obvious to raise. Who would cite research that doesn’t support the claim they make?

I’m here to tell you: it happens all the time. (I wrote about a recent example here.)

In this case, teacher/researcher/blogger Greg Ashman looked at those sources. (You can read the article he wrote here, although you might have to subscribe to his substack to do so.)

So, does the research support the claim?

Amazingly, most of the cited studies don’t focus on students’ mental health.

That’s right. To support the claim that “strict discipline harms mental health,” Silverton cites very little research about mental health. (Ashman has the details.)

Yes, we might make some guesses based on these studies. But, guesses aren’t research.

As Ashman writes:

it’s easy to accept that suspension and [expulsion] are associated with higher rates of depression without assuming suspension and [expulsion] are the cause.

So, DOES strict school discipline cause mental health problems? I don’t (yet) know of direct research on the subject.

This specific example about school discipline, I hope, emphasizes the broader point:

Simply by a) asking for research and b) giving it a quick skim, we can make better decisions about accepting or rejecting “research-based” teaching advice.

Step 3: Actively Seek Out Contradictory Information

Because humans are so complicated, psychology and neuroscience research ALWAYS produces a range of findings.

Even with something as well-supported as retrieval practice, we can find a few studies suggesting limitations — even (very rare) negative effects.

I thought of this truth when I saw a New York Times headline: Cash Aid to Poor Mothers Increases Brain Activity in Babies, Study Finds.

This blog is about brain research, not politics. At the same time, this brain research might be cited to support a policy proposal.

So: what should we do when we see brain research used this way?

Step 1: “Ask for sources.” Good news! The sources are quoted in the article.

Step 2: “Does the research, in fact, support the claim?”

Sure enough, the researchers conclude

“we provide evidence that giving monthly unconditional cash transfers to mothers experiencing poverty in the first year of their children’s lives may change infant brain activity.”

Step 3: “Actively seek out contradictory information.”

Because this claim made the front page of the Times, I kept an eye out for responses, both pro and con.

Just a few days later, I found this tweet thread. In it, Dr. Stuart Ritchie points out some real concerns with the study.

For instance: the authors “pre-registered” their study. That is, they said “we’re going to measure variables X, Y, and Z to see if we find significant results.”

As it turns out, they found (small-ish) significant results in P, Q, and R, but not X, Y, and Z.

As Ritchie notes, P, Q, and R are certainly interesting. But:

This is a clear example of hype; taking results that were mainly null and making them into a huge, policy-relevant story. [The research] is a lot more uncertain than this [Times article implies].

To be very clear: I’m not arguing for or against a policy proposal.

I am arguing that when someone says “brain science shows!,” we should ask questions before we make big changes.

TL;DR

When people cite brain research to encourage you to teach differently…

… ask for sources,

… confirm they support the recommendation,

… seek out contradictory points of view.

Our students benefit when we follow those three simple steps.

A “Noisy” Problem: What If Research Contradicts Students’ Beliefs?
Andrew Watson

The invaluable Peps Mccrea recently wrote about a vexing problem in education: the “noisy relationship between teaching and learning.”

In other words: I can’t really discern EXACTLY what parts of my teaching helped my students learn.

Was it my content knowledge?

The quality of my rapport with them?

The retrieval practice I require?

The fact that they slept and ate well in the days before class?

Some combination of all these variables?

Because I don’t know EXACTLY which teaching variable helped (or hurt) learning, I struggle to focus on the good stuff and eliminate the bad stuff.

I thought about Mccrea’s wisdom when I read a recent study about interleaving.

Here’s the story…

Interleaving 101

Frequent blog readers know all about interleaving, a way of organizing students’ practice.

Let’s say I teach my students about parts of speech.

Once they have a basic understanding of each one, I could have them practice each part of speech on its own.

That is: they identify nouns on Monday, adverbs on Tuesday, prepositions on Wednesday, and so forth.

Researchers call that structure “blocking” — as in “blocks of homework focusing on individual topics.”

Or, I could have my students jumble several topics together every night.

That is: Monday night they practice nouns, adverbs, and prepositions. Tuesday they practice verbs, prepositions, and conjunctions. Wednesday: nouns, verbs, and adjectives.

The total number of practice problems would remain the same, but they’d practice several parts of speech all together.

Researchers call this system “interleaving” — as in “weaving together several different topics.”
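
To make the contrast concrete, here is a minimal sketch (my illustration, not anything from the research) of how the same practice items can be ordered in a blocked or an interleaved way. The topic names and item counts are placeholders:

```python
# A minimal sketch of blocked vs. interleaved practice schedules.
# Topics and items are illustrative placeholders, not study materials.
topics = {
    "nouns":        ["noun Q1", "noun Q2", "noun Q3"],
    "adverbs":      ["adverb Q1", "adverb Q2", "adverb Q3"],
    "prepositions": ["prep Q1", "prep Q2", "prep Q3"],
}

def blocked_schedule(topics):
    """Blocking: finish all of one topic before starting the next."""
    return [item for items in topics.values() for item in items]

def interleaved_schedule(topics):
    """Interleaving: rotate through the topics so consecutive items differ.

    The total number of practice problems is identical to the blocked
    schedule; only the ordering changes.
    """
    return [item for round_ in zip(*topics.values()) for item in round_]

print(blocked_schedule(topics))      # noun, noun, noun, adverb, adverb, ...
print(interleaved_schedule(topics))  # noun, adverb, prep, noun, adverb, ...
```

Either function returns the same nine questions; the only difference between the two conditions is the order in which students encounter them.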

Measuring Success

Of course, teachers want to know: does interleaving work? Do students who interleave their practice learn more than students who block?

Let’s imagine two ways of answering that question.

Strategy #1: ask the students.

Obviously.

Who knows more about the students’ learning than the students themselves?

Strategy #2: measure their learning.

Obviously.

If students who block consistently remember more than students who interleave (or vice versa), then we have a winner.

So, what’s the answer?

Answers, and Vexing Questions

According to Samani and Pan’s 2021 study, strategy #1 yields a clear answer: students say that interleaving is harder and results in LESS learning.

Of course, that means they think that blocking is easier and results in MORE learning.

Alas, strategy #2 arrives at a contradictory result.

When we measure students’ actual learning, they remember more after interleaving than blocking.

Samani and Pan’s study gets this result. And, LOTS AND LOTS of research gets to the same result. (See Agarwal and Bain’s book for a great review of the research.)

In other words, this study points to an especially “noisy” part of the relationship between teaching and learning.

Students genuinely think and believe that interleaving interferes with learning.

However, interleaving in fact promotes learning.

How do we handle this quandary?

Tentative Solutions

In my high-school classroom, we do A LOT of retrieval practice.

Almost every day, I fire off questions and ask students to attempt an answer.

Sometimes I call on raised hands; or cold call; or have students write answers in their notebooks (I circle the room to check their responses). They might write on the board; they might consult in pairs.

I’m entirely comfortable using retrieval practice — and so are my students — because on the second day of class I showed them research about retrieval practice.

I told them:

This might feel hard at first.

But, trust me. It feels hard because your brain is working harder. And that means you’re learning more.

It’s like going to the gym. You don’t gain muscle by picking up paper clips. You gain muscle by picking up heavy things. Hard work leads to better fitness.

The same rule applies here. Retrieval practice is harder, so you’ll learn more.

Since that day, I stop every now and then at the end of an RP session and say: “Do you feel how much you’ve learned? Do you see how much retrieval practice is helping?”

In fact (I swear I am not making this up), one of my sophomores once said: “Thank you Mr. Watson for making us do retrieval practice every day.”

I tell this story because it applies to interleaving as well.

I’ve been interleaving all year, but I haven’t (yet) explained it to my students. I plan to do so this upcoming week (or next).

My hope is: they’ll see why we’ve been bouncing back and forth from topic to topic in ways that might seem random or disorganized.

We’ve been interleaving all along.

I offer this solution as “tentative” because my context might not match yours.

For instance, if you teach younger or older students, they might not respond as mine do.

If you teach students with diagnosed learning differences, interleaving might not benefit them as much.

And so forth.

As always: consider the research findings, consider my experience, and then use your own best judgment to fit them into your classroom practice.

TL;DR

If students’ beliefs contradict research, I myself tell them about the research — graphs and all. And then I ask them to trust me.

Retrieval practice and interleaving really do work. My students know about this research pool. So far, they’re on board.

If you try this strategy, or another one, I hope you’ll let me know about your own experience.


Samani, J., & Pan, S. C. (2021). Interleaved practice enhances memory and problem-solving ability in undergraduate physics. NPJ Science of Learning, 6(1), 1–11.

Teaching with Images: Worth the Effort?
Andrew Watson

According to Richard Mayer’s “multimedia principle,”

People learn better from words and pictures than from words alone.

If that’s true, then we should — obviously — be sure to include pictures in our teaching.

However…

Whenever we see a broad principle like that, we should always look for specific limitations.

That is…

… does this principle apply to kindergarteners as well as 5th graders and adult learners?

… does it apply for students with an ADHD diagnosis?

… is it true when teaching Civil War history, theorems about similar triangles, and bunting strategies?

And so forth.

Researchers call such limits “boundary conditions,” and we should ALWAYS look for boundary conditions.

So, let’s look at that broad principle (“pictures + words” > “words”) and ask this boundary question:

Does the content of the picture matter?

Possibilities and Perils

Happily, one of the people asking that question is…Richard Mayer himself.

In his career, he’s come up with a whole suite of useful principles. And, he spends lots of time looking for boundary conditions.

Specifically, in a usefully straightforward study, he and Eunmo Sung study several different kinds of images:

Instructive images: “directly relevant to the instructional goal.”

I’m teaching Macbeth right now, and focusing on the play’s tension between order and chaos. So, I might show students a picture of Scotland’s craggy wildernesses (chaos) and one of a highly structured royal ceremony (order).

Seductive images: “highly interesting but not directly relevant to the instructional goal.”

A movie version of Macbeth — starring Denzel Washington and Frances McDormand — just came out. I could show my students a picture of these two movie stars on the Red Carpet at an Oscar ceremony.

Decorative images: “neutral but not directly relevant to the instructional goal.”

Macbeth can be a grim play: so much beheading, so much unseaming. So: I could include pictures of waterfalls and sunrises on my handouts to raise my students’ spirits a bit.

Once we start exploring these potential boundary conditions — perhaps not all images benefit learning equally — we might get even more useful guidance about combining words and images.

Predictions and Results

Sung and Mayer measured the effects of such images on students’ learning AND on their enjoyment of a lesson.

Take a moment to make some predictions on your own.

Which, if any, of those graphics will help students learn more?

Which, if any, will help students enjoy the lesson more?

[I’ll pause while you think about those questions.]

 

 

Perhaps you, like Sung and Mayer, predicted that ALL the images would increase students’ enjoyment.

And perhaps you predicted that the INSTRUCTIVE images would help students learn, but not the others.

Sure enough, you and they were right. Students LIKE images, but LEARN FROM images that focus their attention on the learning goal. (If you’re interested in the specific numbers, look at the 6th page of the study.)

We should, I think, focus on this key finding: students do not always learn more when they enjoy a lesson more.

We shouldn’t deliberately make our lessons dull.

But: we shouldn’t assume that an enjoyable lesson necessarily results in more learning. In this case, those photos of Macbeth movie stars piqued my students’ curiosity and interest, but didn’t help them learn anything about the play.

Three Final Points

First: the benefits of dual coding have gotten lots of attention in recent years.

To get those benefits, we should remember these boundary conditions. Dual coding helps if — and only if — the images highlight the learning goal.

Second: a recent meta-analysis about “seductive details” nicely complements this study.

Third: Like many teachers, I see the good and the vile in Twitter.

Yes (YES!!), it can be a sink of repulsive yuckiness.

And (surprise!!), it can also be supportive and helpful.

I bring up this point because: a wise soul on Twitter mentioned this Sung & Mayer study recently, and reminded me of its importance.

I can’t remember who brought it up (I would credit that tweep if I did), but I’m grateful for the nudge.

Such useful research! Such helpful guidance!


Sung, E., & Mayer, R. E. (2012). When graphics improve liking but not learning from online lessons. Computers in Human Behavior, 28(5), 1618–1625.

Let’s Get Practical: How Fast Should Videos Be?
Andrew Watson

Research often operates at a highly abstract level.

Psychologists and neuroscientists study cognitive “tasks” that stand in for school work. If we’re being honest, however, we often struggle to see the connection between the research task and actual classroom learning.

HOWEVER…

Every now and then, a study comes along that asks a very practical question, and offers some very practical answers.

Even better: it explores the limits of its own answers.

I’ve recently found a study looking at this (incredibly practical) question:

Because students can easily play videos at different speeds, we need to know: which video speed benefits learning the most?

So: what advice should we give our students about learning from videos?

Exploring The Question

Let’s start with a specific example:

If a student watches a video at double speed, she (obviously) spends only half as much time mentally interacting with its information.

Does that reduction in time lead to an equal reduction in learning? Will she learn half as much as if she had watched it at regular speed?

Dr. Dillon Murphy starts with that question, and then quickly gets interested in crucial related questions:

What about other video speeds? That is: what about watching the video at 1.5x speed? What about 3x speed?

Does the topic of the video matter?

And, here’s a biggie: what should students do with the time they save?

Even before we look at the results of this study, I think we can admire its design.

Murphy’s team ran multiple versions of this study looking at all these different variables (and several others).

They did not, in other words, test one hypothesis and then — based on that one test — tell teachers what to do. (“Best practices require…”)

Instead, they invited us into a complex set of questions and possibilities.

Maybe 1.5x is the most efficient speed for learning.

Maybe 3x is the best speed if students use the time they saved to rewatch the video.

Maybe regular speed is best after all.

Because Murphy’s team explores so many possibilities with such open-minded curiosity, we have a MUCH better chance of figuring out which results apply to us. *

The Envelope Please

Rather than walk you through each of the studies, I’ll start with the study’s overall conclusions.

First: watching videos at higher speeds does reduce learning, but not as much as you might think.

That is: spending half as much time with the video (because a student watched it at double speed) does NOT result in half as much learning.

To be specific: students watched ~14-minute videos (about real-estate appraisals, or about Roman history).

A week later, those who watched them at regular speed scored a 59% on a quiz. Those who watched at 2x speed scored a 53%.

59% is higher than 53%, but it’s not twice as high. **
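
If you like to see the arithmetic spelled out, here is a quick back-of-envelope sketch. This is my framing, using only the numbers reported above; “points per minute of viewing” is a rough efficiency measure, not a metric from the study:

```python
# Rough efficiency arithmetic for the ~14-minute videos described above.
# "Points per minute of viewing" is my framing, not the study's metric.
conditions = {
    "1x speed": {"minutes": 14, "score": 59},
    "2x speed": {"minutes": 7,  "score": 53},
}

for label, c in conditions.items():
    per_min = c["score"] / c["minutes"]
    print(f"{label}: {c['score']}% after {c['minutes']} min "
          f"-> {per_min:.1f} points per minute of viewing")

# 1x speed: 59% after 14 min -> 4.2 points per minute of viewing
# 2x speed: 53% after 7 min -> 7.6 points per minute of viewing
```

Half the viewing time cost only six percentage points, which is why the “saved” seven minutes matter so much in what follows.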

Second: students can use that “saved” time productively.

What should a student do with the 7 minutes she saved? She’s got two helpful choices.

Choice 1: rewatch the video right away.

Students who used their “saved” time to rewatch the video right away recaptured those “lost” points. That is: they had the same score as students who watched the video once at regular speed.

Choice 2: bank the time and rewatch the video later.

In another version of the study, students who watched the 1x video once scored a 55% on a quiz one week later.

Other students watched the 2x video once, and then once again a week later. They scored a 63% on that quiz. (For stats types, the d value is 0.55 — a number that gets my attention.)

In other words: rewatching at double speed a week later leads to MORE LEARNING in THE SAME AMOUNT OF TIME (14 minutes).
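
For readers who want that d value unpacked: Cohen’s d expresses the difference between two group means in units of their pooled standard deviation, so d = 0.55 means the rewatch group scored roughly half a standard deviation higher. Here is a minimal sketch of one common form of the formula (assuming equal group sizes). The two means come from the study; the standard deviations below are hypothetical placeholders, chosen only to illustrate the calculation:

```python
import math

def cohens_d(mean1, mean2, sd1, sd2):
    """Cohen's d (simple equal-n form): mean difference / pooled SD."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean1 - mean2) / pooled_sd

# Means (63 vs. 55) are from the study; the SDs are hypothetical
# placeholders, since the post doesn't report them.
print(round(cohens_d(63, 55, 15, 14), 2))  # -> 0.55 with these SDs
```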

Practical + Practical

Murphy takes great care to look at specific combinations.

His example encourages us to take care as well. For instance:

His team worked with college students. Will this result hold for 8th graders, or 2nd graders?

You can look to your teacherly experience and judgment to answer that question.

Will this effect hold for longer videos: 30 minutes, or one hour?

We don’t know yet.

These videos included a talking head and slides with words — but not closed captions. Will some other combination (no talking head? closed captions on?) lead to different results?

We don’t know yet.

In other words: Murphy’s study gives us practical guidance. We should use our judgment and experience to apply it to our specific teaching circumstances.


* I should note: This study is unusually easy to read. If the topic interests you, you might look it over yourself.

** Important note: I’ve seen news reports about this study saying that watching once at double speed results in the same amount of learning as watching once at regular speed. That claim is untrue. And: Murphy’s study does not make that claim.

Murphy, D. H., Hoover, K. M., Agadzhanyan, K., Kuehn, J. C., & Castel, A. D. (2021). Learning in double time: The effect of lecture video speed on immediate and delayed comprehension. Applied Cognitive Psychology.