Getting the Timing Right: Critical Thinking Online
Andrew Watson

If we want students to remember what we teach–and what teacher doesn’t?–we’ve got a vital strategy: spread practice out over time.

We’ve got scads of research showing that the same number of practice problems results in a lot more learning if those problems are spread out over days and weeks, compared to being done all at once.

We call this the spacing effect, and it’s as solid a finding as we’ve got in the field of educational psychology.

As teachers interested in psychology research, we should always be asking: “yes, but does that work in my specific context?”

For instance: if research shows that college students learn stoichiometry better in a flipped-classroom model, that doesn’t necessarily mean that my 3rd graders will learn spelling better that way.

In the language of psychology research, we’re looking for “boundary conditions.” What are the limits of any particular technique?

The Spacing Effect Meets Critical Thinking

Researchers in Canada wanted to know: does the spacing effect apply to the teaching of critical thinking?

Of course, we want our students to be effective critical thinkers. But, there’s heated debate about the best way to teach this skill.

Lots of people doubt that critical thinking can be taught as a free-standing skill. Instead, they believe it should be nested in a specific curriculum.

That is: we can be critical thinkers about sonnets, or about football play-calling strategy, or about the design of bridges. But, we can’t learn to think critically in an abstract way.

The Canadian researchers start with that perspective, and so they teach critical thinking about a specific topic: the reliability of websites. And, they go further to ask: will the spacing effect help students be better critical thinkers?

In other words: if we spread out practice in critical thinking, will students ultimately practice their critical craft more effectively?

The Research; The Results

To answer this question, researchers used a 3-lesson curriculum exploring the credibility of websites. This curriculum asked 17 questions within 4 categories: the authority of the website’s authors, the quality of the content, the professionalism of the design, and so forth.

Half of the 4th-6th graders in this study learned this curriculum over 3 days. The other half learned it over 3 weeks.

Did this spacing matter? Were those who spread their practice out more proficient critical website thinkers than those who bunched their practice together?

In a word: yup.

When tested a month later, students who spread practice out were much likelier to use all four categories when analyzing websites’ reliability. And, they used more of the 17 questions to explore those four categories.

To Sum Up

This research leads us to two encouraging, and practical, conclusions.

First: we can help our students be better critical thinkers when they analyze websites. (Heaven knows that will be a useful skill throughout their lives.)

Second: we can improve their ability by relying on the spacing effect. As with so many kinds of learning, we get better at critical thinking when we practice over relatively long periods of time.

Fostering Curiosity in the Classroom: “What Percentage of Animals are Insects?”
Andrew Watson

As teachers, we know that learning works better when students are curious about the subject they’re studying.

Obviously.

So, what can we do to encourage curiosity?

We could choose a topic that (most) students find intrinsically interesting. Dinosaurs, anyone?

But, we can’t always work on that macro level. After all, many of us work within a set curriculum.

What strategies work on a smaller, more day-to-day level? In other words: is there anything we can do in the moment to ramp up students’ curiosity?

Before you read on, pause a moment to ask yourself that question. What do you predict might work?

Predictions, Please

According to a recent study, the very fact that I asked you to make a prediction increases your curiosity about the answer.

Here’s the story.

Researchers in Germany asked college students to look at a question, such as “X out of 10 animals are insects.”

Sometimes the students made a prediction: “4 out of 10 are insects.”

Sometimes they thought about an example of an insect: “mosquitoes.”

Sure enough, students rated their curiosity higher after they made a prediction than after they provided an example.

And…drum roll please…they also remembered those facts better when their curiosity levels were elevated.

Don’t Take My Word For It

By the way: how did the researchers know how curious the students were to find the answer?

First, they asked them to rate their curiosity levels. That’s a fairly standard procedure in a study like this.

But, they also went a step further. They also measured the dilation of the students’ pupils. (You may know that our pupils dilate when we’re curious or surprised.)

And, indeed, by both measures, making predictions led to curiosity. And, curiosity led to better memory of these facts.

What To Do Next?

On the one hand, this study included relatively few students: 33, to be precise.

On the other hand, we’ve got LOTS of research pointing this direction. Some studies show that pretesting helps students learn better, even if the students can’t possibly know the answer to the question on the test.

So, some kind of early attempt to answer a question (like, say, making a prediction) does seem to help learning.

At the same time, I think it would be quite easy to overuse this technique. If students always take a pretest, they’ll quickly learn that they aren’t expected to know the answers and (reasonably enough) won’t bother to try.

If students always make predictions, I suspect they’ll quickly pick up on this trick and their curiosity will wear down.

As teachers, therefore, we should know that this approach can help from time to time. If you’ve got a list of important facts you want students to learn, you might build predictions into your lesson plan.

I myself wouldn’t do it every time. But, I think it can be a useful tool–especially if you need to know how many animals are insects. (In case you’re wondering: the answer is, “7 out of 10.” Amazing!)

Tea and Macbeth: Autobiographical vs. Semantic Memory
Andrew Watson

A few years ago, a former student named Jeremy invited me out for coffee. (I haven’t changed his name, because I can’t think of any reason to do so.)

We were reminiscing about the good old days–in particular, the very fun group of students in the sophomore class with him.

At one point he said: “You know what I remember most vividly about your class?”

I waited.

“Instead of using a spoon, you’d wrap your teabag string around your pen to wring it out into the mug. That always AMAZED me.”

In my early days as a teacher, I would have been horrified by this comment.

We had done such good work in this class! We analyzed the heck out of Macbeth. Jeremy had become a splendid writer–he could subordinate a quotation in an appositive like a pro. We had inspiring conversations about Their Eyes Were Watching God.

And all he remembered was the way I wrung out a tea bag?

The Hidden Compliment

Jeremy’s comment might seem like terrible news, but I think it’s good news. Here’s why:

The goal of sophomore English is for Jeremy to learn particular skills, facts, and habits of mind.

That is: he should remember–say–how to write a topic sentence with parallel abstract nouns.

However, he need not remember the specific tasks he undertook to learn that skill.

For example, when he wrote his essay about Grapes of Wrath, he got better at writing essays. Whether or not he remembers the argument he made in that paper, he honed his analytical habits and writing skills. (How do I know? His next paper was better. And the next.)

He doesn’t remember the day he learned how to do those things. But, he definitely learned how to do them.

Many Memories

When psychologists first began studying memory, they quickly realized that “memory” isn’t one thing. We’ve got lots of different kinds of memory.

Those distinct memory systems remember different kinds of things. They store those memories in different places.

For instance: I’ve written a lot about working memory. That essential cognitive system works in a very particular way, with very important strengths and limitations.

But procedural memory works very differently. Procedural memory helps us remember how to do things: like, say, ride a bike, or form the past tense of an irregular verb.

These distinctions help me understand Jeremy’s memories of my class.

Jeremy had a strong autobiographical memory: my wringing out a teabag with my pen.

As the name suggests, autobiographical memories are rich with details about the events and people and circumstances.

You have countless such memories:

The time you poured coffee on your boss’s desk;

The first time you met your current partner;

The time you forgot your lines on stage.

You can call up vivid specifics with delicious–or agonizing–precision.

At the same time, Jeremy has lots of semantic memories from our class. As Clare Sealy describes them, semantic memories are “context free.” They “have been liberated from the emotional and spatiotemporal context in which they were first acquired.”

For instance:

Jeremy knows the difference between a direct object and a subject complement.

Having read The Ballad of the Sad Cafe, he knows how to analyze love triangles in literature.

Knowing how we define the word “romance” in English, he can explain the (many) bizarrenesses of The Scarlet Letter.

However, those semantic memories have an entirely different feel from autobiographical memories. They lack the vivid specifics.

Jeremy knows that a subject complement “renames or describes” the subject. But he can’t tell you the tie I was wearing when I first explained that. He can’t tell you the (probably gruesome) example I used to make the distinction clear.

If he could, they would be autobiographical memories as well as semantic memories.

Why The Distinction Matters

As teachers, we’re tempted–often encouraged–to make our classes dramatically memorable. We want our students to remember the time that we…

Surprisingly, that approach has a hidden downside.

As Clare Sealy explains in a recent essay, we can easily use information in semantic memory in a variety of circumstances. That is: transfer is relatively easy with semantic memory.

However, that’s not true for autobiographical memory. Because autobiographical memory is bound up with the vivid specifics of that very moment on that very day (in that very room with those very people), students can struggle to shift the underlying insight to new circumstances.

In other words: the vivid freshness of autobiographical memory impedes transfer.

Sealy explains this so nimbly that I want to quote her at length:

Emotional and sensory cues are triggered when we try to retrieve an autobiographical memory. The problem is that sometimes they remember the contextual tags but not the actual learning.

Autobiographical memory is so tied up with context, it is no good for remembering things once that context is no longer present.

This means that it has serious limitations in terms of its usefulness as the main strategy for educating children, since whatever is remembered is so bound up with the context in which it was taught. This does not make for flexible, transferable learning that can be brought to bear in different contexts and circumstances.

By the way, in the preceding passage, I’ve used the phrase “autobiographical memory” when Sealy wrote “episodic memory.” The two terms mean the same thing; I think that “autobiographical memory” is a more intuitive label.

To Sum Up

Of course we want our students to remember us and our class: the fun events, the dramatic personalities, the meaningful milestones.

And, we also want them to remember the topics and ideas and processes they learned.

Crucially, the word “remember” means something different in those two sentences; the first is autobiographical memory, the second is semantic.

Teaching strategies that emphasize remembering events might (sadly) make it harder to remember ideas and processes.

So, we should use teaching strategies that foster the creation of semantic memories.

Happily, the autobiographical memories will take care of themselves.


Clare Sealy’s essay appears in The researchED Guide to Education Myths: An Evidence-Informed Guide for Teachers. The (ironic) title is “Memorable Experiences Are the Best Way to Help Children Remember Things.”

Inquiry- and Problem-Based Pedagogy: Dramatic Results in South America (?)
Andrew Watson

A recent study, published by the Center for Effective Global Action, sees big benefits from teaching built around student collaboration, inquiry, and problem-solving.

Working in four countries (!), in ten different schools (!!), with over 17,000 students (!!!), researchers find that K-4 students made more progress in math and science when they explored questions, compared with students who listened to lectures.

They report these results in stats-y language that doesn’t translate well: after 7 months, students averaged 0.18 standard deviations higher in math, and 0.14 in science. After four years, those differences bloomed to 0.39 and 0.23.

That’s not as sexy-sounding as, say, “they scored X% higher on a standardized test.” But, however you look at it, those are eye-catching numbers.
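One rough way to translate those standard-deviation effect sizes into something more intuitive: assuming roughly normal score distributions (my assumption, not the study’s), an effect of d standard deviations moves the average student in the inquiry group to the Φ(d) percentile of the lecture group. A quick sketch:

```python
from statistics import NormalDist

# The study's reported effect sizes, in standard deviations.
effects = {
    "math, 7 months": 0.18,
    "science, 7 months": 0.14,
    "math, 4 years": 0.39,
    "science, 4 years": 0.23,
}

# Assuming roughly normal score distributions, an effect of d standard
# deviations places the average treated student at the Phi(d) percentile
# of the comparison group.
for label, d in effects.items():
    pct = NormalDist().cdf(d) * 100
    print(f"{label}: about the {pct:.0f}th percentile")
```

So “0.39 standard deviations” means the average inquiry-group student in math scored around the 65th percentile of the lecture group: a substantial shift.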

Inquiry Learning vs. What?

Despite these dramatic numbers and claims, I don’t think the study supports the strong claims made by these researchers.

Here’s why.

First, the research purports to study the difference between “inquiry and problem based pedagogy” and “traditional instruction.”

If you look over the description of the classes, however, I think you’ll quickly see that it studies the difference between “good” teaching and “bad” teaching.

So, for instance, in a “traditional” unit on the skeletal system in Argentina:

[S]tudents copy facts about bone tissues and the names of 206 bones of the human skeleton that teachers have written on the blackboard into notebooks.

That’s not traditional. That’s absurd. They copy the names of two hundred and six bones? The mind boggles.

And, by the way, the “inquiry and problem based pedagogy” [IPP] is full of good, old-fashioned direct instruction:

When done well, IPP includes elements of explicit instruction and scaffolding.

Teachers facilitate learning by guiding students through a series of steps and explicitly relating learning to students’ prior knowledge and experiences.

Teachers guide learners through complex tasks with explicit instructions that are relevant to the problems at hand.

They provide structure and scaffolding that help students not only carry out specific activities, but also comprehend why they are doing those activities and how they are related to the set of core concepts they are exploring.

So, yes, these students are inquiring and problem solving. And, they’re getting lots of explicit teacherly guidance.

So, again, the labels used in this study don’t fully align with what we typically mean by them.

Compared to Whom?

A second question jumps out here as well.

The teachers who used IPP methods got impressive training and support. For instance:

They got 20 hours of professional training in these methods. (When was the last time your school provided twenty hours of training on one topic?)

They got lesson plans. They got teaching materials.

They got “continuous in-school teacher support.”

What did the teachers in the control-group schools get? The study doesn’t say.

That silence leads to the possibility that they got…nothin’.

Which is to say: the study compares teachers who got lots and lots (and lots) of support, with teachers who didn’t get any support.

So, the difference might have come from the specifics of the teaching method: in this case, “IPP.”

Or, it might have come from the energizing effects of working at a school getting so much researcher support and attention.

We simply don’t know. And, if I’m right that this was a “business as usual” control group, then the study design doesn’t let us know.

Strong Conclusions

Based on this study, I think we can conclude that…

4th graders should not have to copy the names of 206 bones into their notebooks. (I’ll go out on a limb and say NO ONE should have to do that.)

Some level of explicit teacherly support and guidance is essential.

Once foundational knowledge has been established, an appropriate level of independent questing can solidify and extend knowledge.

Most of us, I suspect, would have agreed with all of those statements before we read the study.

I don’t think, however, we can conclude from this study that “Inquiry and Problem Based Pedagogy” (as we typically use those words in the US) is the best approach. Because: that’s not what this study tested and measured.

Prior Knowledge: Building the Right Floor
Andrew Watson

Take a gander at this passage from Michael Jones’s recent biography of The Black Prince:

“In the fourteenth century England used a silver standard of currency. The unit of account was the pound sterling (£) which was equal to one and a half marks of silver. The pound was divided into twenty shillings (s), each of twelve pence (d). There was also, from 1344, a gold coinage based on the noble, which was conventionally worth 6s 8d, but was rarely used. It would, however, be significant in the calculation of the ransom of King John II and also in the introduction of gold coinage into Gascony and then the principality of Aquitaine by the Black Prince.”

Many readers, I suspect, felt tempted to give up relatively quickly. (Don’t blame yourself if you did.) Unless you’re really up to speed on 14th century English currency–both silver and gold!–the paragraph quickly becomes overwhelming.

The vocabulary in this passage probably doesn’t strain our cognition. Except for the phrase “marks of silver,” I know what all those words mean. (And, I can guess from context that a “mark” is some unit of measurement.)

However, the passage does place several mental demands on the reader.

First, it invites you to undertake several rapid mathematical calculations. (Quick: how many shillings in a mark?)

Second, it requires you to learn abbreviations as you go. To understand the fourth sentence, you need to remember the (wildly counter-intuitive) abbreviation of “pence” as “d” from the third sentence.

Third, it assumes you recall several events and places unfamiliar–I suspect–to most Americans. Who was King John II? Why was he ransomed…was he kidnapped? Where are Gascony and Aquitaine? They don’t sound very English — why did an English prince introduce coinage to them? Actually: why is a prince empowered to introduce new currency?
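To see how much arithmetic the passage quietly demands, here’s the “shillings in a mark” question worked out from the units the passage itself supplies (£1 = 1.5 marks = 20 shillings; 1 shilling = 12 pence):

```python
from fractions import Fraction

# Units stated in the passage: £1 = 1.5 marks = 20 shillings; 1s = 12d.
marks_per_pound = Fraction(3, 2)
shillings_per_pound = 20
pence_per_shilling = 12

shillings_per_mark = shillings_per_pound / marks_per_pound
print(shillings_per_mark)  # -> 40/3: a mark is 13 shillings 4 pence

# The noble's "conventional" value of 6s 8d turns out to be exactly
# half a mark (80 pence vs. 160 pence).
noble_in_pence = 6 * pence_per_shilling + 8
mark_in_pence = shillings_per_mark * pence_per_shilling
print(noble_in_pence, mark_in_pence)  # -> 80 160
```

Holding “40/3 shillings” in working memory while also decoding “s” and “d” is a real cognitive load; no wonder the paragraph overwhelms.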

Essential Background Knowledge

I thought of this paragraph when I read a recent article by Robert Pondiscio. In it, Pondiscio summarizes a study trying to determine how much background knowledge is necessary for comprehension.

In this study, students who scored higher than a 59% on a background knowledge test understood a reading passage substantially better than those who scored below 59%.

As summarized by Pondiscio, the study’s authors see some clear teaching implications here.

First, we can meaningfully measure our students’ relevant background knowledge.

Second, students who fall short on that measure will benefit A LOT if we provide them with the essentials.

For instance, students who understood that “habitat,” “species,” and “ecosystems” were relevant vocabulary for the study of ecology understood the reading passage more deeply. (The study included 3500 students, so I believe they controlled for various confounds. I haven’t read the study itself–it’s behind a paywall.)

I think those conclusions point to another:

Third: models of teaching that focus on “pure discovery” will create substantial challenges for students who lack background knowledge. Students who don’t know the basics of a topic simply can’t understand the field of inquiry within which they’re meant to discover.

And, they won’t feel motivated by curiosity to find out. They’ll feel discouraged by their confusion. (Few readers, I suspect, were motivated by the paragraph above to learn more about medieval English currency.)

A Final Thought

This study finds that 59% was the essential tipping point. Students who scored lower than 59% on the prior knowledge test found themselves in a different cognitive category than those who scored above.

However, that percentage does not necessarily apply to all circumstances.

In other words: we shouldn’t give our students prior-knowledge tests, and focus only on those who score 58% and below.

Instead, we should plan our lessons and units knowing that some floor-level of knowledge will be crucial for learning most things.

In every case–as you hear me say so often–we’ll have to rely on the teacher’s judgment to discover that level.

Researchers can remind us that the floor exists. But they can’t identify it for every teacher in every classroom. Ultimately, with that research guidance in mind, we’ll find the right place for the floor. And, we’ll build it.

Advice: It Is Better to Give than Receive
Andrew Watson

When students struggle, we typically offer them advice. It seems obvious to think that receiving advice might help them learn.

What if we tried a different approach? What would happen if we thought that giving advice might help students learn?

Several researchers–including Angela Duckworth–recently tried this approach in a large high-school study. Almost 2000 students participated.

Working at a computer, students offered advice to “an anonymous younger student who was hoping to do better in school.” Specifically, they answered 14 questions on how and where to study.

They also wrote a brief motivational letter.

The Theory Behind The Practice

Duckworth & Co. hypothesized that this brief advice session might help advice-giving students for three reasons:

First: they might actually believe the advice they offer. (Psychologists call this the “saying is believing” effect.)

Second: when they offer this advice, they might come up with specific plans to apply it to their own studying.

Third: “giving advice, unlike receiving advice, can increase confidence.”

So, what happened?

When Small Effects Aren’t Small

The researchers kept track of grades in two courses: a) math, and b) a course that students themselves identified as one in which they particularly wanted to improve.

The students completed the advice exercise at the beginning of the 3rd quarter. Would that make a difference, compared to the control group, at the end of the 3rd quarter?

The short answer: yes, a little bit.

On the graphs, the 3rd quarter grades in the advice group look about 1 point higher than those in the control groups. In stats terminology, Cohen’s d was 0.12 for the class the students chose, and 0.10 in math class.
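For readers unfamiliar with the terminology: Cohen’s d is simply the difference between two group means divided by their pooled standard deviation. A minimal sketch with made-up grades (not the study’s data), just to show why a roughly one-point difference can land near d = 0.1:

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    pooled_sd = sqrt(((na - 1) * stdev(group_a) ** 2 +
                      (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2))
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical quarter grades (0-100), NOT the study's data: the advice
# group averages one point higher than the control group.
advice = [71, 96, 81, 91, 76, 89]
control = [70, 95, 80, 90, 75, 88]
print(round(cohens_d(advice, control), 2))  # -> 0.1
```

When grades spread out over a roughly ten-point range, a one-point average difference is a “small” effect in exactly the sense the study reports.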

Did those effects last? Not really. By the end of the 4th quarter, the differences were no longer statistically significant.

At first, these data seem quite discouraging. The intervention didn’t make much of a difference, and didn’t make a lasting difference.

Duckworth’s team, however, feels much more optimistic.

First, most interventions have no effect at all. A small effect is better than none.

And, second, most interventions cost a lot. This one cost … [does quick calculation on back of envelope] … practically nothing. Even the opportunity cost is small: the whole exercise lasted eight minutes!

What’s Next?

I suspect that other researchers will pick up on this approach, and we’ll see other studies exploring it. (Joshua Aronson tried a similar strategy to combat stereotype threat back in 2002, and had similarly good results.)

In the meantime, what should teachers do?

First, I think we can adapt this approach to our own work. If our schools have a mentoring program, or a buddy system–or, heck, if our students have younger siblings, we’ve got a natural opportunity for this confidence-building approach.

Second, I think we ought to offer students some guidance about the advice they give. If the “saying is believing” effect consolidates beliefs about learning styles, for example, that would be counter-productive. A small menu of suggestions might be good for everyone involved.

Third: if an eight-minute intervention had an effect that lasted a few months, surely we could create more than one opportunity to give advice. Repeated doses of this educational medicine might be lots more helpful than just one.

If you try this approach in your classrooms, I hope you’ll let me know about your results.

Study Advice for Students: Getting the Specifics Just Right
Andrew Watson

If you follow research-based teaching advice, you’ve heard a lot about retrieval practice in recent months.

The headline: if students want to remember information, they shouldn’t review it. That is: they shouldn’t just look it over. (“Ah, yes, the book says here that the Ideal Gas law is PV=nRT.”)

Instead, they should try to remember it first. That is: they should try a mini mental quiz. (“Hmm. What is the Ideal Gas law again? PV = something…let me think for a moment.”)

One great benefit of this research finding: students can do it themselves. All those online testing programs (most famously, Quizlet) can help students self-test rather than simply review.

Timing is Everything

Two days ago, I presented this research to (quite splendid) teachers in Fukuoka, Japan. As they pondered this guidance, one teacher asked a question I’d never heard before. Here’s a paraphrase:

I understand that retrieval practice might promote learning. But, it also might be really discouraging.

If students keep testing themselves, and keep getting the answers wrong, they’ll feel helpless and frustrated.

So: this strategy might increase learning for some students, but paradoxically for other students it might decrease motivation to study.

At the time, my response was: that’s an entirely plausible hypothesis, but I haven’t seen any research into that question. If you the teacher see that retrieval practice is demotivating, you’ll know best when (and how) to switch to something else.

Entirely by coincidence, I found research that addresses that question the very next day.

Kalif Vaughn and Nate Kornell wondered: how does retrieval practice influence motivation? Specifically, does a student’s fear of getting the answer wrong discourage her from relying on retrieval practice?

If yes, can we redirect those motivational processes? And, crucially, can we redirect motivation without sacrificing the benefits of retrieval practice?

The Power of Hints

Vaughn and Kornell started researching the effect of hints. Here’s their thought process:

If I’m nervous about getting a retrieval-practice answer wrong, I might choose simply to review the material instead. (Rather that struggling to remember that PV=something something something, I’ll just look in the book.)

But if I know I’ll get a hint, then I might be willing to try retrieval practice. That is: the hint makes retrieval practice less scary, and so increases my motivation to try it out.

Sure enough, people who had to choose between straight-up retrieval practice and simple review strongly preferred the review. Something like 80% of the time, they reviewed the correct answer. Only 20% of the time did they dare retrieval practice.

However, when they could get a hint, they reviewed only 30% of the time. The other 70%, they tried some form of hint-informed retrieval practice.

That is: by including the hint option, teachers can more than triple the likelihood that students will try retrieval practice. Hints reduce the likelihood of failure, and thereby increase motivation.

The Danger of Hints?

But wait just a minute here.

Past research shows that pure retrieval practice helps students learn and remember. We should admit that hints just might undermine that effect.

In other words, hints could entice students to try self-quizzing, but could reduce the effectiveness of the technique. Ugh.

Happily, Vaughn and Kornell spotted that potential problem, and investigated it.

Their findings: hints didn’t hurt.

In other words: students who did pure retrieval practice, and those who got small hints, and those who got big hints all remembered new information better than students who simply reviewed information.

Based on these findings, the researchers write:

We recommend giving students the option to get hints when they are testing themselves. It will make them choose [retrieval practice] more often, which should increase their learning, and it will also make learning more fun, which might increase their motivation to study. We envision instructors making more use of hints in worksheets, questions at the end of textbook chapters, flashcards, and a variety of digital study aides that resemble Quizlet. The students themselves might also benefit by finding ways to give themselves hints as they test themselves.

Vaughn and Kornell also suggest that the hint option will be more beneficial early in the review process. After a while, students shouldn’t need them anymore to feel confident enough to try retrieval practice.

A final note: the word “hint” here should be interpreted quite broadly. Vaughn & Kornell let students see a few letters of the correct answer; that was their version of “hint.” As teachers, we’ll adapt that general concept to the specifics of our classroom work.

As I say so often: teachers needn’t do what researchers do. Instead, we should think the way they think. That thought process will bring us to our own version of the right answer in our classrooms.

The Best Teaching Method? Depends on the Student…
Andrew Watson

Should teachers show students how to solve a problem? Should we model the right way to do a task?

Or, should we let students figure solutions out on their own?

This set of questions has gotten LOTS of attention over the years. Sadly, as can happen all too often, the answers have become polarized.

You’ll read (emphatic) teaching advice that we must let students discover answers and processes on their own.

You’ll read (passionate) teaching advice that we have to explain and guide them every step of the way.

How can we escape from this all-or-nothing debate?

Asking a Better Question

Here’s one escape hatch: ask a more helpfully precise question.

In other words: the answer to the question “what’s the best way to teach my students X” is “it depends on your students.”

More specifically, it depends on your students’ level of expertise.

Once we rethink our teaching from this perspective, a common-sensical framework quickly comes into focus.

“Beginners”–that is, students with little-to-no expertise–need lots of explicit instruction and guidance.

If we’re not there to shepherd them through the early stages, they’re likely to experience working-memory overload. (If you followed our series on working memory this summer, you know working memory overload is baaaaad.)

However, “experts”–that is, students who have gone beyond the foundations of the topic–can explore, invent, and discover on their own. In fact, they’re likely to be distracted by too much explanation.

That last sentence sounds very odd. Why would an “expert” be distracted by explanation?

Here’s why. If you understand a topic, and then listen to me explain it, you have to realign your understanding of it to match my explanation.

That realignment process takes up…you guessed it…working memory.

By the way: this sub-field of cognitive science has its own lingo to describe working memory in action. Right now I’m describing the expertise reversal effect: that is, teaching practices that benefit novices actually impede learning for experts.

An Example. Or Two.

In this study, researchers in Australia had students learn new procedures in geometry and algebra.

Beginners–those who didn’t yet understand much in these areas–benefited from examples showing how to solve the problems. That is: they did better than their beginner peers who didn’t get those example solutions.

However, experts–who understood much more in these areas–did not benefit from those examples. In fact, they might even have learned less.

Other researchers have found similar results for students studying Shakespeare.

One Final Point

If I’ve persuaded you that beginners need explicit instruction, whereas experts benefit from greater freedom to explore and discover, you’re likely to have this question:

How can I distinguish novices from experts?

That question deserves a post of its own. For the time being, I think the simplest answer is the most obvious: the teacher will know.

That is: if your teaching expertise says “these students are ready to struggle at this higher level,” then go for it. If your teaching expertise says “they really need more guided practice, more time with the scaffolds up,” then go that route instead.

We can get some guidance from psychology research in making these decisions. But, ultimately, we have to use our best judgment.

In Defense of Other-Than-Passionate Teaching
Andrew Watson

I’m reading Tom Sherrington’s The Learning Rainforest: Great Teaching in Real Classrooms as I travel. Like many of his readers, I’m spending most of my time thinking a) that’s splendidly put, and b) why did it take me so long to start reading this book? It’s been on my “must read” shelf forever…

In brief, I heartily recommend it.

Sherrington opens the second section of Learning Rainforest with a plea for passionate teaching:

“Teach the things that get you excited about your subject. Read that special poem that gets you fired up, show that fascinating maths puzzle with the neat solution, enthuse about the extraordinary story, or talk about that cool exploding watermelon video.” (Yes: Sherrington is British, so he writes “maths” not “math.”)

Much of me wants to agree with this advice. Certainly I try to follow this guidance in my own teaching.

In the classroom, I regularly taught “difficult” texts—from Woolf to Morrison to Hopkins—because they move me so much. (Hopkins’s line “the just man justices” still makes me shiver. Who knew “justice” could be a verb?)

And now that I do PD work with teachers, I’m always grateful to get feedback about my enthusiasm and verve.

In brief, I try to practice what Sherrington is preaching.

And Yet…

As I think about this advice, though, I find that I can practice it but can’t quite endorse it.

Here’s why:

I think most teachers do our best work when we enter the classroom as our authentic selves.

That is: some teachers are indeed funny. They enliven their classes and their subject matter with puckish wit.

However, many people just aren’t funny. If I try to make my teaching funny because funny works for you, the falsity of that performance may well have dreadful results.

Other teachers have, say, a den-mothery warmth. They can soothe and comfort, and bathe their classrooms with gentle balm.

But: those of us who aren’t naturally soothing might not be able to pull off that act. The pretense would be more disconcerting than calming.

Still other teachers, as Sherrington suggests, are passionate, enthusiastic, and entertaining. Like Robin Williams in Dead Poets Society, they leap about on desks and declaim in Laurence Olivier voices.

Like Sherrington (I imagine), they love showing videos of exploding watermelons. They “get fired up.” They “enthuse.”

And yet, again: some teachers just aren’t like that. Arm waving and zealous emotion simply don’t come naturally. As before, faking a teaching style that isn’t my own could backfire disastrously. The only thing worse than fake-funny is fake-enthusiastic.

An Example

In graduate school, one of my best professors taught with an almost studied blandness.

He sat at his desk, looking up occasionally from his notes. While he didn’t read directly from them, he was clearly tracking his outline closely. (We could tell, because his text-only PowerPoint slides often matched what he said, word-for-word.)

He rarely modulated his voice, and never (that I recall) cracked a joke.

And yet, he was fascinating.

Here’s why. First, he had a knack for explaining complex ideas with clarity and rigor. Even the most opaque topics seemed conspicuously clear once he’d explained them.

Second, he had a technique for answering questions that I’d never seen before.

A student might ask: “What do we know about the impact of music lessons on very young children?”

He’d think for a minute, and then say:

“So, you’re asking if anyone has done a study where one group of three-year-old children had music lessons, and another group spent the same amount of time on an equally active task—maybe dance lessons.

And then, when we tested them on—let’s say—verbal fluency six months later, did those music lessons make any difference?

That’s an interesting question, and as far as I know, no one has done that study…”

In other words: he didn’t so much answer the question as describe how it might be answered by psychology research. (Of course, if such a study had been done, he’d tell us about it.)

After about a month, the questions in class started changing.

My classmates would raise their hands and ask, “Has anyone ever done a study where one group of six-year-olds told stories they made up, while another group read someone else’s story aloud…”

That is: we learned from this professor not only about various psychology topics, but also how to investigate psychology in the first place.

And, to repeat: there was nothing remotely enthusiastic about this class. And yet, this method was remarkably effective, and surprisingly compelling. I always looked forward to his lectures.

In truth, I can think of many excellent teachers whom you’d never describe as “passionate.”

Two Theories

So, if I can’t quite champion excitement as an essential teaching strategy, what would I offer in its stead?

As noted above, I think the first key is authenticity.

If you’re a funny teacher, be funny. If you’re awe-struck and enthusiastic, own that. But if you’re not, don’t try to fake it. Be yourself in the classroom, not a pretend version of another teacher.

The second key: aligning that authenticity with the deep purposes of education.

Here’s what I mean.

I think I’d be a terrible lawyer because, at my core, I hate conflict. My ethical obligation to advocate zealously on my client’s behalf would run smack into my deep desire for everyone to get along.

That is: my authentic self doesn’t really align with the deep purpose of lawyering.

However: teacherly enthusiasm certainly can align with our teacherly goals. We want students to love what they learn, and enthusiasm can go a long way to help them do so.

So too a sense of humor.

A den-mother’s warmth, likewise, might help students face academic rigors that would otherwise stress them out.

And, my professor’s deepest interest—his fascination with the design of psychology studies—lined up beautifully with his teaching goals. He wasn’t enthusiastic. But his authentic self absolutely helped us learn.

In Sum

Should you be worried if your teaching isn’t passionate? Not necessarily.

Should you worry if you’re not classroom-funny? Nope.

Do you need to answer all questions with hypothetical research designs? Heck no.

Should you worry if your authentic self doesn’t foster student growth and learning?

Absolutely.

Exploring the Nuances of Peer Feedback
Andrew Watson

Over at the Learning Scientists, Katie Marquardt digs into peer feedback.

On the one hand, we can see many reasons that peer feedback would be beneficial.

It means that students are doing more of the work than we are–and, as we know, “the one who does the work does the learning.”

And, the opportunity to give peer feedback provides students with the responsibility and autonomy we want to be teaching.

On the other hand, those benefits don’t always materialize.

As Marquardt writes:

my colleagues express skepticism about peer review, because of the poor quality of feedback students sometimes give each other, and the challenges of managing peer review activities in the lessons.

This is valid criticism, and I have seen these shortcomings in my own lessons, particularly when working with English language learners who may lack the writing skills to give their classmates good feedback.

If we can imagine good and bad sides to peer feedback, what does the research say?

What The Research Says…

If you read this blog often, you can predict what I’m about to say: we need a narrower question.

Surely the effects of peer feedback depend substantially on the peers, and the feedback.

Marquardt’s post does a great job exploring lots of specific research examples. For that reason, I encourage you to read it. You should be asking: which of the studies she describes best matches your students, and your methodology for fostering peer feedback?

To take a compelling example: one study found that students who gave feedback improved their own second drafts of an assignment more than those who received feedback.

Crucially, this finding held true for the students who “commented more on the strength of macro-meaning and the weakness of micro-meaning” of the drafts they reviewed.

To decide whether or not this study applies to you, you’ll need to know what “micro-meaning” and “macro-meaning” actually mean.

And, you’ll have to decide if research done with college physics students writing up lab reports might reasonably apply to your students.

In other words: this topic is a great example of a broader principle. When we look for research to guide our teaching, we should be sure that the people and the specific methods in the research helpfully match our teaching work and our teaching world.