L&B Blog – Page 47 – Education & Teacher Conferences
10,000 People Talk About Sleep and Cognition
Andrew Watson

Most of the research studies I read include a few tens of people. Sixty or eighty is good; more than 100 is rare. I’ve seen published studies with an even dozen.


So when I hear about a study with over 10,000 participants, I sit up and take notice.

In this case, researchers in Canada asked people to fill out online surveys about sleep, and to take cognitive tests. Given their astonishing data pool, they can reach firm conclusions about the questions they’ve asked.

Sleep and Cognition: Firm Conclusions

Some of these conclusions will sound quite predictable. Others will surprise you. They certainly surprised me.

First, if you want optimal cognitive function, roughly 7-8 hours of sleep gives you the best results. (Assuming that “you” are an average person. Of course, not everyone is average.)

Second, that number doesn’t change with age. (See below for an important caveat.) That is: 30-year-olds and 80-year-olds think best with the same amount of sleep.

Third, too much sleep muddles cognition as much as too little sleep. As someone who likes sleeping, I’m sorry to say this but: the graphs don’t lie.

Fourth, non-optimal sleep doesn’t harm short-term memory. Researchers tested short-term memory with the “spatial span task.” Participants had to remember which boxes flashed green, and press them in the same order. Here’s an example:

https://www.youtube.com/watch?v=zWO_w3m4NQs
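For the curious, the task's core logic fits in a few lines of Python. This is a simplified sketch, not the researchers' actual test: the box count, the sequence lengths, and the stop-after-one-miss rule are all my assumptions.

```python
import random

def generate_sequence(length, n_boxes=9):
    """Choose `length` distinct boxes (numbered 0 to n_boxes - 1) to flash, in order."""
    return random.sample(range(n_boxes), length)

def correct(target, response):
    """A trial counts only if every box is pressed in the flashed order."""
    return target == response

def spatial_span(trials):
    """Span = longest sequence length reproduced correctly.

    `trials` maps sequence length -> (flashed sequence, participant response);
    in this sketch, testing stops at the first failed length.
    """
    span = 0
    for length in sorted(trials):
        target, response = trials[length]
        if not correct(target, response):
            break
        span = length
    return span
```

A participant who echoes back the two- and three-box sequences but scrambles a four-box one would score a span of 3.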

Instead, non-optimal sleep fuddles reasoning skills (like executive function and deductive reasoning) and verbal skills (like verbal working memory).

Of course, school requires A LOT of reasoning and verbal skill. No wonder sleep-deprived (or sleep-surfeited) students struggle.

(By the way, fifth, 48.9% of the participants didn’t get enough sleep.)

And, sixth: a good night of sleep really does help. That is: people who got even one good night’s sleep before the test saw a measurable uptick in their cognitive performance.

Caveats

From a researcher’s standpoint, it’s important to note that this team didn’t draw on a random sample. These participants volunteered by coming to a particular website.

And, all of the data here come from self-report. People could be deceiving the researchers. They could also be deceiving themselves.

From a teacher’s standpoint, we should note the age cut-off for this study: 18 years. K-12 students might see similar patterns. That is: their short-term memory might be fine after low-sleep nights, while their reasoning and verbal skills suffer.

Or, entirely plausibly, younger people might see different effects. We just don’t know.

A Final Note

During my years as a high-school teacher, my colleagues and I experienced sleep deprivation as much as our students did.

We should, of course, encourage our students to get enough sleep. (We should also schedule the class day to fit our students’ sleep cycles.)

Now that we’ve seen this research into the connection between sleep and cognition, we should also take better care of ourselves.

Choosing a Knowledge-Rich Curriculum: Pros and Cons
Andrew Watson

Should our curriculum focus on knowledge or skills?

Jon Brunskill debates this question with himself in this thoughtful post.

Brunskill does offer a strong conclusion in this debate. But just as important: the way he frames the discussion.

Following Rapoport’s Rules to Promote Civil Discourse (which I hadn’t heard of before), Brunskill sets himself several tasks.

First, he summarizes the opposite belief as accurately and fairly as he can. (The goal, according to Daniel Dennett, is that the other person say “Thanks, I wish I’d thought of putting it that way.”)

Second, he notes his points of agreement with that position, and (third) what he has learned while thinking about it.

Only then, fourth, does he get to express his disagreement, and advocate for a distinct point of view.

(By the way: you haven’t accidentally skipped a paragraph. I’ve deliberately not said what his conclusion is, because I want to focus on his methodology.)

The Takeaway

You might agree with Brunskill’s conclusion. Or, you might emphatically disagree with it.

If the latter, great news! You have an opportunity to follow his example.

How might you summarize his position as fairly as possible?

What do you agree with?

What did you learn?

Once you’ve answered those questions, then your rebuttal will be more persuasive, and more beneficial to the debate. I suspect it will also be more beneficial to you.

Surprise: The Adolescent Brain Isn’t Broken
Andrew Watson

Chapter 2 of Inventing Ourselves: The Secret Life of the Teenage Brain kicks off with a teenager’s diary entry from April of 1969:

I went to arts centre (by myself!) in yellow cords and blouse. Ian was there but he didn’t speak to me. Got rhyme put in my handbag from someone who’s apparently got a crush on me. It’s Nicholas I think. UGH.

Man landed on moon.

This anecdote marvelously captures common perceptions of adolescence.


Self-absorbed. Dotty about crushes and boys/girls and clothes. Too addled by hormones to focus on epochal events — like, say, Neil Armstrong’s small step onto the moon.

In Defense of the Adolescent Brain

Researcher Sarah-Jayne Blakemore would like to change your mind about all of these perceptions.

Drawing on decades of research, she focuses on one essential claim. Teenagers’ brains aren’t incomplete versions of adult brains. They’re not hyper-hormonal versions of children’s brains.

Instead, adolescence results from distinct, meaningful neural developments. Teenagers do the developmental work that their life stage calls upon them to do. Their brains help them along with exactly this task.

The Stories that Science Tells

More than most researchers, Blakemore manages to describe scientific studies precisely and readably.

You get a very clear picture of what researchers did, and why they designed their experiments as they did. And: what they learned from doing so.

And yet, you’re never bored or baffled. Blakemore’s descriptions just make sense.

(I try to do exactly this almost every day on this blog, so I can tell you: that’s REALLY hard to do well.)

As a result, you’ll come away with a clearer understanding of the cognitive developments that take place during the teenage years.

You’ll also learn about some surprising deficits. (Teenagers are worse than 10-year-olds at recognizing emotional facial expressions!)

By the way: teens also don’t recognize the difference between high- and low-stakes situations as well as we would expect.

Because of Blakemore’s clarity, you’ll also know how we know each of these truths.

Conclusions

Blakemore doesn’t end with a step-by-step program for teaching or parenting teens.

Instead, she offers a way of thinking about this vital stage of development.

She helps us step back from day-to-day adolescent conflicts to see the bigger neuro-biological picture.

For example: it’s not just teenagers who drink more alcohol with their peers. Adolescent MICE drink more alcohol when surrounded by other adolescent mice. No, really. (See page 4.)

She also resists the popular temptation to rage against technology use. Based on her lab’s analysis (undertaken by one-time LatB blogger Kate Mills), we don’t really know enough about technology use to draw firm conclusions about its perils.

In particular, we don’t have good data at all about the influence of adults’ technology use on the children around them.

In brief, we should read Blakemore’s book not for quick solutions but for long-term perspectives.

 

The Limits of Retrieval Practice, Take II…
Andrew Watson

Just two weeks ago, I posted about a study showing potential boundary conditions for retrieval practice: one of the most robustly supported classroom strategies for enhancing long-term memories.

As luck would have it, the authors of that study wrote up their own description of it over at The Learning Scientists blog. Those of you keeping score at home might want to see their description of the study, and their thoughts on its significance.

The short version: boundary conditions always matter.

We should assume they exist, and look for them.

A teaching practice that works with some students — even most students — just might not work with my students.

In that case: I’m happy it helps the others, but I need to find the strategy that will work with mine.

This Is Your Amygdala on a Cliff…
Andrew Watson

If you’ve seen the documentary Free Solo, you know about Alex Honnold’s extraordinary attempt to climb a 3,000-foot sheer rock face.

Without ropes. Without protective gear of any kind.

And without, it seems, a typically functioning amygdala.

https://www.youtube.com/watch?v=nF-7H5Dk26E

Free Solo briefly mentions Honnold’s visit to Jane Joseph’s lab. (You see a quick image in this trailer.)

At the time, Joseph studied high sensations seekers: people who are “drawn to intense experiences and are willing to take risks to have them.” That is, for example, people who habitually scale sheer walls of granite.

(Descriptions of Honnold’s visit appear in J. B. MacKinnon’s excellent essay: “The Strange Brain of the World’s Greatest Solo Climber.”)

The Case of the Quiet Amygdala

Using fMRI scanning, Joseph’s team examined Honnold’s brain. In particular, they focused on the reactivity of his amygdalae.

These small, almond-shaped regions of the brain sit at the tip of the hippocampus. Their function, simply put: to process strong negative emotions, like fear.

(For scrupulous readers, “amygdala” is singular; “amygdalae” is plural.)

Jane Joseph — like many others — wanted to know: did Honnold’s amygdalae react differently than those of others?

To test the question, she showed him 200 pictures, many of them gruesome or disgusting: “corpses with their facial features bloodily reorganized; a toilet choked with feces.”

Neurotypical observers — like the control subject Joseph also scanned — show strong reactions to these images.

Honnold’s amygdalae? Nothing. Nada. Bupkis.

Explaining the Inexplicable

MacKinnon describes Honnold’s free climbing this way:

“On the hardest parts of some climbing routes, his fingers will have no more contact with the rock than most people have with the touchscreens of their phones, while his toes press down on edges as thin as sticks of gum.”

Honnold’s quiet amygdalae might explain his fearlessness. But, what explains his quiet amygdalae? How can you stand 2000 feet above the ground on a stick of gum without gut-tormenting terror?


(If you’re like me, your palms start sweating when you see him standing there. Now, imagine being there…)

To be clear, we should note that Honnold does have amygdalae. The MRI scan shows them, looking perfectly normal.

(Very rarely, some people have deformed or absent amygdalae. They don’t typically grow up to be free soloists, but they do demonstrate much less fear than others.)

Two explanations might help us understand Honnold’s remarkable brain. [Edit: to be clear, both these explanations appear in MacKinnon’s article.]

In the first place, genetic variability creates a range for all human functions and characteristics. For example, men average a height of just under 5’10”. The tallest man, however, towers at 8’2″.

In this case, Honnold might have — by the luck of the genetic draw — extremely under-reactive amygdalae.

Beyond Genes

In the second place, he might also have developed techniques for re-evaluating scary/terrifying situations. By mentally “reviewing the tapes” of his climbs, by deliberately re-evaluating them calmly and rationally, he can desensitize himself to the fear that would grip practically anyone else.

In other words: a combination of nature (genetics) and nurture (deliberate re-evaluation) might tame Honnold’s amygdalae, and allow him to face extra-ordinary terrors with extra-ordinary calm.

In just the right conditions, our brains can help our bodies do almost anything. Like: scaling a cliff with preternatural sang-froid.

 

To hear Honnold talk about his experience of fear, click here.

For other strategies to calm the amygdala, click here.

To learn A LOT more about emotions and fear, read Joseph LeDoux’s The Emotional Brain: The Mysterious Underpinnings of Emotional Life. Also, Behave by Robert Sapolsky.

Edited to credit MacKinnon’s article explicitly for the two explanations of Honnold’s unusual neural inactivity.

Ask a Simple Question, Get an Oversimplified Answer
Andrew Watson


If learners were widgets, then educational research would be simple. The same teaching technique would work (or not work) equally well for all students.

It would also help if all teachers were widgets. And, if we all taught the same topic the same way.

We could ask simple research questions, get uncomplicated answers, and be ENTIRELY CERTAIN we were doing it right.

A Sample Case: Handwritten Notes

For example, if all students were identical, then we could know for sure the best way to take notes in class.

(It would help if teachers all taught the same way too.)

Are handwritten notes better than laptop notes? Vice versa? The study design couldn’t be simpler.

Mueller and Oppenheimer famously argue that “the pen is mightier than the keyboard.” (I’ve argued strenuously that their research does not support this claim, and probably contradicts it.)

But what if the question just can’t be answered that simply?

What if students do different things with their notes?

What if the classes in which they take notes are different?

Really, what then?

Mixing It Up

Linlin Luo and colleagues explore these questions in a recent study.

Happily, they start from the assumption that students use notes in different ways. And, that professors’ lectures include important differences.

For example: some students take notes, but don’t review them. (They probably should…but, there are LOTS of things that students probably should do. For instance, attend lectures.)

Other students do review the notes they take.

Some lectures include lots of visuals. Others don’t include many.

Once we start asking more complicated questions … that is, more realistic questions … we start getting more interesting answers.

More Interesting Answers

What did Luo and colleagues find? Unsurprisingly, they found a complex series of answers.

First: students who didn’t review their notes before a quiz did better using a laptop.

Second: students who did review their notes did better taking handwritten notes.

Third: in both cases, the differences weren’t statistically significant. That’s a fancy way of saying: we can’t say for sure that the laptop/handwriting distinction really mattered.

Fourth: unsurprisingly, students who took handwritten notes did better recording visuals than did laptop users. (Students who took laptop notes basically didn’t bother with visuals.)
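What does “not statistically significant” mean in practice? A toy permutation test makes the idea concrete. The quiz scores below are invented for illustration; the question is how often a group gap at least as large as the observed one appears when we shuffle the group labels at random.

```python
import random

def mean_diff(a, b):
    return sum(a) / len(a) - sum(b) / len(b)

def permutation_p(a, b, n_iter=10_000, seed=0):
    """Fraction of random relabelings whose gap matches or beats the observed one."""
    rng = random.Random(seed)
    observed = abs(mean_diff(a, b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if abs(mean_diff(pooled[:len(a)], pooled[len(a):])) >= observed:
            hits += 1
    return hits / n_iter

# Invented quiz scores for two small note-taking groups:
laptop = [72, 75, 70, 78, 74, 71]
handwritten = [74, 77, 73, 79, 76, 72]
p = permutation_p(laptop, handwritten)
# A large p (conventionally > 0.05) means chance alone could
# easily produce a gap this size.
```

With groups this small and a gap this modest, shuffled data beat the observed gap quite often, so we can’t rule out luck.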

Advice to Teachers and Students

What advice can we infer from this study? (And: from its analysis of previous studies?)

A: teachers can give students plausible guidance. “If you really will study these notes later, then you should take them by hand. But, if you really won’t, then use a laptop.”

B: teachers who present a lot of visuals should encourage handwritten notes. Or, make copies of those visuals available.

C: given that the differences weren’t statistically significant, we might encourage students to use the medium in which they’re more comfortable. If they (like me) have dreadful handwriting, then maybe they should use a laptop no matter what.

D: I continue to think — based on the Mueller and Oppenheimer study — that we should train students to take notes in a particular way. If they both use laptops AND reword the teacher’s ideas (rather than copying them verbatim), that combination should yield the most learning.

Most importantly, we should let this study remind us: simple answers are oversimplified answers.

If you’d like to meet two of the researchers who worked on this study, check out this video:

https://www.youtube.com/watch?v=BfCZ0K0HoJE


Evaluating the Best Classroom Practices for Teaching Math
Andrew Watson

What strategies work best for math teaching?


And, crucially, how do we know?

To answer these questions, we might rely on our teacherly instincts. Perhaps we might rely on various educational and scientific theories. Or, we might turn to data. Even big data.

Researchers in Sweden wondered if they could use the TIMSS test to answer this question.

(“TIMSS” stands for “Trends in International Mathematics and Science Study,” given every four years. In 2015, 57 countries and 580,000 students participated. That’s A LOT of students, and a lot of data.)

3 Math Teaching Strategies

When students take these tests, they answer questions about their classroom experience.

In particular, they answer questions about 3 math teaching strategies. They are asked how often they…

Listen to the teacher give a lecture-style presentation.

Relate what they are learning in mathematics to their daily lives.

Memorize formulas and procedures.

Researchers want to know: do any of these teaching practices correlate with higher or lower TIMSS scores? In other words, can all these data help us evaluate the effectiveness of specific teaching practices?

2 Math Teaching Theories

Helpfully, the researchers outline theories about why each of these practices might be good or bad.

As they summarize recent decades of math-teaching debate, they explain that “researchers with their roots in psychology and cognitive science” champion

formal mathematical notions,

explicit instruction where teachers show students how to solve math problems,

practicing and memorizing rules and worked examples.

On the other hand, “researchers with their roots in the reform movement” champion

connecting math to students’ daily lives,

a problem-solving approach,

understanding ideas and connections, rather than memorization.

Doubtless you’ve heard many heated debates championing both positions.

Predictions and Outcomes

These theories lead to clear predictions about TIMSS questions.

A cognitive science perspective predicts that “lecture-style presentations” and “memorizing formulas” should lead to higher TIMSS scores.

A reform-movement perspective predicts that “relating math to daily life” should lead to higher scores.

What did the data analysis show?

In fact, the cognitive science predictions came true, and the reform predictions did not.

In other words: students who listened to presentations of math information, and who memorized formulas did better on the test.

Likewise, students who applied math learning to daily life learned less.

An Essential Caveat

As these researchers repeatedly caution, their data show CORRELATION not causation.

It’s possible, for instance, that teachers whose students struggle with math resort to “daily life” strategies. Or that both variables are caused by a third.
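A toy simulation can make the “third variable” worry concrete. All numbers here are invented: a hidden “struggle” factor pushes daily-life strategy use up and test scores down, producing a strong negative correlation even though neither observed variable affects the other.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(0)
# Hidden confounder: how much each of 1,000 classes struggles with math.
struggle = [rng.gauss(0, 1) for _ in range(1000)]
# Struggling classes use "daily life" strategies more AND score lower;
# the two observed variables never influence each other directly.
daily_life_use = [s + rng.gauss(0, 0.5) for s in struggle]
test_score = [-s + rng.gauss(0, 0.5) for s in struggle]

r = pearson(daily_life_use, test_score)
# r comes out strongly negative, yet the strategy doesn't cause the scores.
```

The correlational data alone can’t distinguish this story from a causal one, which is exactly the researchers’ caution.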

Potential Explanations

“Connecting new math learning to real life situations” seems like such a plausible suggestion. Why doesn’t it help students learn?

These researchers offer two suggestions.

First, every math teaching strategy takes time. If direct instruction is highly effective, then anything that subtracts time from it will be less effective. In other words: perhaps this strategy isn’t harmful; it’s just less effective than the others.

Second, perhaps thinking about real-life examples limits transfer. If I use a formula to calculate the area of a table, I might initially think of it as a formula about tables. This fixed notion might make it harder for me to transfer my new knowledge to — say — rugby fields or floor plans.

At present, we can’t know for sure.

A final point. Although this research suggests that direct instruction helps students learn math, we should remember that bad direct instruction is still bad.

Lectures can be helpful, or they can be deadly tedious.

Students can memorize pertinent and useful information. Or, they can memorize absurd loads of information.

(A student recently told me she’d been required to memorize information about 60 chemical elements. Every science teacher I’ve spoken with since has told me that’s ridiculous.)

And so: if this research persuades you to adopt a direct-instruction approach, don’t stop there. We need to pick the right pedagogical strategy. And, we need to execute it well.

Cognitive science can help us do so.

Does Media Multitasking Really Interfere with Student Thinking?
Andrew Watson

To many teachers, it just seems obvious: all that screen time MUST be bad for student brains.

To many other teachers, it just seems obvious: technology will unleash academic possibilities and revolutionize education.

So, which is it? Does media multitasking damage students’ cognitive capabilities? Or, does it allow them new avenues to creative possibilities?

Here’s What We Know

In a recent analysis, Uncapher and Wagner surveyed research into this topic.

Sure enough, they found some troubling evidence.

In half of the studies they examined, people who often use multiple streams of technology scored lower on working memory tests than those who don’t.

In two studies, heavy multitaskers had a harder time recalling information from long-term memory.

Studies also showed problems with sustained attention.

Here’s a place where media multitasking might help: task switching. Given all the practice that multitaskers get diverting attention from one gizmo to another, they might well get better at doing so.

Alas, most of the research that U&W examined didn’t support that hypothesis.

Here’s What We Don’t Know: A LOT

Although all of the sentences above are true, they don’t answer most questions with any certainty.

For example, if half of the studies showed that high multitaskers do worse on working memory tests, that means that half of the studies DON’T reach that conclusion.

(It’s important to note that NONE of the studies showed that high multitaskers were better at working memory tasks than their counterparts.)

Uncapher and Wagner repeatedly emphasize this point. We don’t have lots of studies — and those we do have don’t all point the same direction.

Another important question: causality. Perhaps multitasking reduces sustained attention. Or, perhaps people who have trouble sustaining attention multitask more often.

Firm Conclusions

At present, we can conclude with confidence that we don’t have enough evidence to conclude anything with confidence.

Overall, the evidence suggests heavy media multitasking might cause (or might result from) relative weaknesses in several cognitive functions.

We certainly don’t have evidence that encourages us to promote multi-gizmo use.

I myself try to stick to one device at a time. Until more evidence comes in, I’ll gently suggest my students do likewise.

(For thoughts on technology and attention, click here.)

Avoiding Extremes: Common Sense in the Middle
Andrew Watson

Teachers feel passionate about our work. As a result, we can advocate exuberantly — occasionally too exuberantly? — for a particular position.

Advocates for (or against) Social-Emotional Learning can make zealous claims for their beliefs. Same for PBL, or direct instruction. Or for flipped classrooms, or traditional ones.

Of course, given the variety of teachers, students, schools, curricula — and the variety of societies in which they all operate — we perhaps should hesitate to make absolute claims.

Today’s Shining Example

I recently rediscovered a marvelous example of comfort with the ambiguous middle ground.

In this EdSurge post, Art Markman explains how mindfulness can help. And: how it might not help.

He explains the benefits of a growth mindset. And: its potential detriments.

When asked “if schools teach the way students learn,” he doesn’t scream “OF COURSE!” Nor does he bellow “NEVER!”

Instead, he offers this answer: “Sometimes, but often not.”

In other words: we’re not all spectacular successes or hideous failures. Contrary to much of the rhetoric you hear, we live somewhere in between.

I hope you enjoy reading this interview. And, that Markman’s sensible example offers guidance on moderation and nuance.

I myself look forward to reading more of his work.

Andrew Watson

Here on the blog, I write A LOT about the benefits of “retrieval practice.” (For example: here and here.)


In brief: our students often review by trying to put information into their brains. That is: they “go over” the material.

However, they learn more if — instead — they review by trying to pull information out of their brains. That is: they fill in blanks on Quizlet, or use flashcards, or outline the chapter from memory.

AT THE SAME TIME…

I also write about the importance of “boundary conditions.”

A particular research finding might be true for this group (say, college students learning chemistry) but not that group (say, 3rd graders learning spelling rules).

(For example: here and here.)

So, I really should ask myself: what are the boundary conditions for retrieval practice?

Retrieval Practice Limitations?

In the first place, retrieval practice has become so popular because it works so well in so many circumstances.

It helps 2nd graders and adult learners.

It helps with declarative knowledge and procedural knowledge.

And, it helps Red Sox fans and Dodgers fans. (I might have made that one up.)

However, I have recently seen research into two retrieval practice limitations, and I think they’re important for teachers to keep in mind.

“Narrow” vs. “Broad” Learning

Researcher Cindy Nebel (née Wooldridge) wanted to know if retrieval practice helps students learn only the information they retrieve. That is, it might have a narrow, focused effect.

Or perhaps it helps students remember ideas related to the information they retrieve. Retrieval of one memory might broadly influence other memory networks.

In my geography class, for instance, students might learn that the capital of Egypt is Cairo, and that its main economic drivers are tourism and agriculture.

I encourage my students to make flashcards to help them remember capitals. When a student looks at her Egypt flashcard, will remembering its capital (“Cairo!”) help her remember its main industries as well? Or, does it help consolidate only that specific memory network?

Alas, according to Nebel’s research, RP has a “narrow,” not a “broad” effect. It helps students remember the specific information they retrieved, but not related concepts.

Practically speaking, this finding suggests that we should be sure to tailor retrieval practice exercises quite precisely to the specific memory we want students to form. A question about Triassic fossils won’t necessarily help them recall specifics about the end of the Cretaceous era.

If we want them to remember asteroid impacts, we should use RP to foster those memories.

Question Difficulty, Difficult Questions

A more recent study has looked at other retrieval practice limitations: fluid intelligence, and question difficulty. This research is still behind a paywall, and so I haven’t looked at the specifics.

The abstract, however, suggests that — especially on difficult items — students with relatively low fluid intelligence might benefit more from review than RP.

This research finding raises several questions: how, precisely, do we measure question difficulty?

And: how much stock do we want to put into measures of fluid intelligence?

Classroom Decisions

As always, the question comes down to this: “what should I, as the classroom teacher, actually do?”

Based on this research, I think we can reach a few clear conclusions:

In many circumstances, retrieval practice helps students remember more than simple review.

As much as possible, we should ensure that we have students retrieve the precise information (or process) we want them to remember. Nearby questions might not help enough.

When working with difficult material, or with students who really struggle in school, we should keep an open mind. Try different learning strategies, and see which ones prove most effective with this student right here.

I’ll keep you posted as I read more about boundary conditions for retrieval practice.