Flipping the Classroom: Asking the Right Question
Andrew Watson

When teachers hear about an intriguing new approach, like–say–“flipping the classroom,” we’re inclined to ask: “But does it work?”

Let me propose a different question: “Under what circumstances does it work?”

After all, we should assume that many teaching techniques work for this teacher instructing these students in this topic. Alas, those same techniques might not work for that teacher teaching those students this other topic.

So, ask not “does flipping the classroom work?” Instead, ask “does flipping the classroom help seventh graders in Germany learn three basic algebraic principles?”

That question might sound obscure. (Okay, I’m sure it sounds obscure.)

But: research can answer that second question. It can answer the first only by answering the second dozens (or hundreds) of different ways.

So, Does It?

Here’s a very particular example. Doctors in Finland have to write very particular kinds of insurance certificates. Therefore, Finnish medical schools have to teach future doctors to write them.

So our question is: “Does flipping the classroom help Finnish medical students learn to write insurance certificates?”

To answer that question, researchers did everything you’d want them to do. They had one professor teach the lecture-only version of that skill. The med students then practiced at home.

For a different group of med students, the professor created a short video for students to watch at home. And, they practiced the skill in class with the professor’s guidance.

Which group learned better?

The Envelope, Please

The flipped classroom group learned better. A LOT BETTER. The Cohen’s d value was 2.85. (I’m typically delighted by a d value of 0.50 or higher. I can’t remember ever seeing another 2.85.)
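For readers who like to see the arithmetic behind an effect size: Cohen’s d is simply the difference between two group means, divided by their pooled standard deviation. Here’s a minimal sketch — the exam scores below are invented for illustration, NOT the Finnish study’s data:

```python
import statistics

def cohens_d(group_a, group_b):
    """Effect size: difference in means divided by the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    # Pooled SD weights each group's variance by its degrees of freedom
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical quiz scores (made up for this example)
flipped = [85, 90, 88, 92, 87]
lecture = [70, 78, 74, 76, 72]
print(round(cohens_d(flipped, lecture), 2))
```

A d of 0.50 means the groups’ averages differ by half a standard deviation; a d of 2.85 means nearly three standard deviations — which is why that number is so startling.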

So, clearly all teachers should start flipping the classroom–right?

NO WE SHOULD NOT.

This study showed that Finnish med students learned certificate writing better this way.

But, this is a decidedly niche topic.

These are fourth year med students. They’re nearing the end of a highly technical education. They’re as good at school as any students on the planet.

Also, they’re learning a discrete skill. I don’t know much about Finnish medical insurance, but I’m guessing it’s quite a distinct genre. The video covering this skill lasted four-and-one-half minutes.

In other words: if you’re teaching very advanced students a very narrow topic, then this study might encourage you to flip the classroom.

But, if you’re teaching beginners, or you’re teaching complex and abstract material, you might want to find other research before trying out this technique.

For instance: this study of students learning epidemiology showed that flipping the classroom made essentially no difference.

Final Thoughts

I have a research-adjacent (although not research-supported) opinion about flipping the classroom.

As always, I think the key variable is working memory. The headline is: teachers should have students do the heavy WM work in the classroom.

So: I guess that the basic principles of insurance certificate writing are easy to understand. But, applying them to specific circumstances can be more challenging.

That is: the application takes more WM. For that reason, watching a video at home and practicing in class with the prof makes sense.

In the case of–say–analysis of literature, those demands are reversed. Students can read stories quite effectively on their own. So, that should be the homework. But, the analysis of that literature requires lots of complex working memory initiative. This sort of discussion should be in-class, with the teacher, and not online.

I’ve never seen research consider flipped classrooms from a WM perspective. But, that framework seems to offer reasonable guidelines–especially if you can’t find research that matches your situation.

 


After I drafted the post above, I found this recent meta-analysis. The headline is that it found modest benefits to flipping the classroom, but that they were subject specific. Alas, the abstract doesn’t say which disciplines do and don’t benefit. I hope it becomes public soon, so we can find out!

Does Banning Classroom Technology Improve Engagement? Learning?
Andrew Watson

We’ve got many reasons to believe that technology — whatever its benefits — can distract from learning.

Heck, according to one study, the mere presence of a cellphone reduces available working memory. YIKES.

Unsurprisingly, we often hear calls for technology-free zones in schools. Laptop bans have ardent champions.

One group of researchers wanted to know: what effect might a technology ban have on the tone of the classroom?

Would such a ban complicate the students’ relationship with the professor?

Would it affect their engagement with the material?

And, of course, would it benefit their learning?

The Study

One professor taught four sections of the same Intro to Psychology course. Cellphones and laptops were forbidden in two sections, and allowed in the other two.

At the end of the course, researchers measured…

Students’ rapport with the professor: for instance, students rated statements like “I want to take other courses from the professor,” or “I dislike my professor’s class.”

Students’ engagement with the class: for instance, “I make sure I study on a regular basis,” or “I stay up on all assigned readings.”

Students’ grades — on 3 exams during the term, and on their overall final grade.

That’s straightforward enough. What did they find?

The Results, Part I: Hang On to your Hat

You might predict that a technology ban would improve class tone. Freed from the distractions of technology, students can directly engage with each other, with their professor, and with the material.

You might instead predict that a ban would dampen class tone. When teachers forbid things, after all, students feel less powerful.

Hutcheon, Lian, and Richard found that the tech ban had no effect on the students’ rapport with the professor.

They also found that the ban resulted in lower engagement with the class. That is, on average, students in a tech-free class said they did class readings less often, and put forth less effort.

This finding held true even for students who preferred to take notes by hand: that is, students who wouldn’t be inclined to use laptops in class anyway.

The Results, Part II: Hang On Tighter

The researchers hypothesized that students in the technology-ban sections would learn more. That is: they’d have higher grades.

That’s an easy hypothesis to offer. Other researchers have found this result consistently (famously, here).

However, Hutcheon and Co. didn’t get that result. There was no statistically significant difference between the two groups.

But, they got a result that did approach significance: the technology-ban sections learned less. On the final exam, for instance, the tech-ban sections averaged an 84.30, while the tech-permitted sections averaged an 88.04.

The difference between a B and a B+ might not be statistically significant…but it sure might feel significant to those who got the B.

What On Earth Is Going On?

The researchers wonder if the tone of their tech ban led to these results. To be honest, when I read the policy on “Technology Use in the Classroom,” I thought it sounded rather harsh. (For example: “Repeated infractions will result in points lost on your final grade.”)

So, perhaps a more genially worded ban would impede class engagement less, and allow for more learning.

But, that’s just a guess.

For me, the crucial message appears in the authors’ abstract:

“[T]hese results suggest that instructors should consider the composition of students in their course prior to implementing a technology ban in the classroom.”

In other words, technology policies can’t be the same everywhere. We teach different content to different students in different schools. And, we are different kinds of teachers. No one policy will fit everywhere.

To be crystal clear: I’m NOT saying “This study shows that a tech ban produced bad results, and so teachers should never ban technology.”

I AM saying: “This study arrived at helpfully puzzling results that contradict prior research. It therefore highlights the importance of tailoring tech policies to the narrow specifics of each situation.”

As I’ve said before, teachers should follow relevant research. And, we should draw on our best experience and judgment to apply that research to our specific context.

Beyond the Mouse: Pointing in Online Lectures
Andrew Watson

You know, of course, that the right kind of movement can help students learn. The nascent field of “embodied cognition” works to explore the strategies that work most effectively.

Here’s a collection of resources.

And, here’s a recent blog post about kindergarteners moving to learn the number line.

You also know that online learners easily get distracted, often because they multitask. (I say “they” because you and I would never do such things.)

This recent post shows that even folding laundry — a harmless-seeming activity — reduces online learning.

What happens when we put these two research pools together?

Specifically: can movement reduce distraction, and increase learning, for online learners?

Benefits of Online Pointing?

Several researchers — including the estimable Richard Mayer — wanted to answer that question.

Specifically, they wanted to know: do pointing gestures made by the teacher help online students learn?

They had students watch an online lecture (about “neural transmission,” naturally).

For the first group of students, the teacher pointed at specific places on relevant diagrams.

For the second group, the teacher pointed generally toward the diagrams (but not at specific parts of them).

For the third, the teacher moved his hands about, without pointing specifically.

For the fourth, the teacher didn’t move his hands.

Do different pointing strategies help or hurt?

Benefits Indeed

Sure enough, pointing matters.

Students in the first group spent more time looking at the relevant parts of the diagrams.

They did better on a test that day.

And — most important — they did better than the other groups on a test a week later.

Now: a week isn’t exactly learning. We want our students to remember facts and concepts for months. (Preferably, forever.)

But, the fact that the memories had lasted a week suggests it’s MUCH likelier they’ll last longer still.

Practical Implications

If your classroom life includes online teaching, or teaching with videos, try to include specific pointing gestures to focus students on relevant information. At least with this student population, such gestures really helped.

By the way, this study doesn’t answer an interesting and important question: “does student movement as they watch online lectures help or hurt their learning?”

We know from the study cited above that irrelevant movement (like folding laundry) doesn’t help. But: should students mirror your gestures as they watch videos? Should you give them particular gestures to emulate?

We don’t know yet…but I hope future research helps us find an answer.

Does Smartphone Addiction Boost Anxiety and Depression?
Andrew Watson

We frequently hear about the dangers of “smartphone addiction.” If you search those words in Google, you’ll find this juicy quotation in the second link:

The brain on “smartphone” is the same as the brain on cocaine: we get an instant high every time our screen lights up with a new notification.

“An instant high.” Like cocaine? Hmmmm.

You might even have heard that we’ve got research about the perils of such addictions. But: can we rely on it?

A recent study asked a simple question, and got an alarming answer.

How Do We Know What We Know About Phone Usage?

Studies about smartphones typically ask participants to rate their cell phone usage — number of minutes, number of texts, and so forth. They then correlate those data with some other harmful condition: perhaps depression.

Researchers in Britain wanted to know: how accurately do people rate their cellphone usage?

When they looked at Apple’s “Screen Time” application, they found that participants simply don’t do a good job of reporting their own usage.

In other words: depression might correlate with people’s reported screen time. But it doesn’t necessarily correlate with (and certainly doesn’t result from) their actual screen time.

In the modest language of research:

We conclude that existing self-report instruments are unlikely to be sensitive enough to accurately predict basic technology use related behaviors. As a result, conclusions regarding the psychological impact of technology are unreliable when relying solely on these measures to quantify typical usage.

So much for that “instant high.” Like cocaine.

What Should Teachers Do?

As I’ve written before, I think research into technology use is often too muddled and contradictory to give us good guidance right now.

Here’s what I wrote back in May:

For the time being, to preserve sanity, I’d keep these main points in mind:

First: don’t panic. The media LOVE to hype stories about this and that terrible result of technology. Most research I see doesn’t bear that out.

Second: don’t focus on averages. Focus on the child, or the children, in front of you.

Is your teen not getting enough sleep? Try fixing that problem by limiting screen time. If she is getting enough sleep, no need to worry!

Is your student body managing their iPhones well? If yes, it’s all good! If no, then you can develop a policy to make things better.

Until we get clearer and more consistent research findings, I think we should respond — calmly — to the children right in front of us.

I still think that advice holds. If your child’s attachment to the cellphone seems unhealthy, then do something about it.

But if not, we shouldn’t let scary headlines drive us to extremes.

Overcoming Potential Perils of Online Learning
Andrew Watson

Online learning offers many tempting — almost irresistible — possibilities. Almost anyone can study almost anything from almost anywhere.

What’s not to love?

A tough-minded response to that optimistic question might be:

“Yes, anyone can study anything, but will they learn it?”

More precisely: “will they learn it roughly as well as they do in person?”

If the answer to that question is “no,” then it doesn’t really matter that they undertook all that study.

Rachael Blasiman and her team wanted to know if common at-home distractions interfere with online learning.

So: can I learn online while…

…watching a nature documentary?

…texting a friend?

…folding laundry?

…playing a video game?

…watching The Princess Bride?

Helpful Study, Helpful Answers

To answer this important and practical question, Blasiman’s team first had students watch an online lecture undistracted. They took a test on that lecture, to see how much they typically learn online with undivided attention.

Team Blasiman then had students watch 2 more online lectures, each one with a distractor present.

Some students had a casual conversation while watching. Others played a simple video game. And, yes, others watched a fencing scene from Princess Bride.

Did these distractions influence their ability to learn?

On average, these distractions lowered test scores by 25 percentage points.

That is: undistracted students averaged an 87% on post-video quizzes. Distracted students averaged a 62%.

Conversation and The Princess Bride were most distracting (they lowered scores by roughly 30 points). The nature video was least distracting — but still lowered scores by 15 points.

In case you’re wondering: men and women were equally muddled by these distractions.

Teaching Implications

In this case, knowledge may well help us win the battle.

Blasiman & Co. sensibly recommend that teachers share this study with their students, to emphasize the importance of working in a distraction-free environment.

And, they encourage students to make concrete plans to create — and to work in — those environments.

(This post, on “implementation intentions,” offers highly effective ways to encourage students to do so.)

I also think it’s helpful to think about this study in reverse. The BAD news is that distractions clearly hinder learning.

The GOOD news: in a distraction-free environment, students can indeed start to learn a good deal of information.

(Researchers didn’t measure how much they remembered a week or a month later, so we don’t know for sure. But: we’ve got confidence they had some initial success in encoding information.)

In other words: online classes might not be a panacea. But, under the right conditions, they might indeed benefit students who would not otherwise have an opportunity to learn.


I’ve just learned that both of Dr. Blasiman’s co-authors on this study were undergraduates at the time they did the work. That’s quite unusual in research world, and very admirable! [6-11-19]

More Contradictions in the Adolescent Sleep/Technology Debate
Andrew Watson

A month ago, I described an impressively large study (17,000+ adolescents) looking at the effects of technology on adolescent sleep and well being.

As I summed it up in the title of that post: “Surprise! Screen time (even before bed) doesn’t harm adolescents.”

Today, I’m linking to another large study (6600+ adolescents) showing … just the opposite.

The main finding of this study was that late-night technology use — especially once the room lights were off — predicted a lower “health-related quality of life” for adolescents.

At this point, I’m frankly flummoxed. I just don’t know how to sort out the contradictory research findings in this field.

For the time being, to preserve sanity, I’d keep these main points in mind:

First: don’t panic. The media LOVE to hype stories about this and that terrible result of technology. Most research I see doesn’t bear that out.

Second: don’t focus on averages. Focus on the child, or the children, in front of you.

Is your teen not getting enough sleep? Try fixing that problem by limiting screen time. If she is getting enough sleep, no need to worry!

Is your student body managing their iPhones well? If yes, it’s all good! If no, then you can develop a policy to make things better.

Until we get clearer and more consistent research findings, I think we should respond — calmly — to the children right in front of us.

Pointing Out Online Mistakes Like a “Jerk”: More Misuses of Psychology Research
Andrew Watson

Headline writers face a difficult task, I suspect.

On the one hand, they want to capture the gist of the article. On the other hand, they really want you to click the link.

I thought about this puzzle when I read this recent headline:

People Who Point Out Grammar Errors Online Are Pretty Much Jerks, Study Finds

That’s an arresting claim. After all, the word “jerks” doesn’t often appear in psychology research papers…

Digging Deeper

So, what does this particular study say? Are people who “point out” online grammar errors “jerks”?

Researchers Boland and Queen asked themselves this question: does someone’s personality profile influence their response to written mistakes — such as typos or grammar errors?

(By the way: it would seem odd if the answer were “no.” If there is such a thing as a personality profile, shouldn’t it capture — among other things — the way people respond to one another’s errors?

But, in the field of psychology, we don’t just assume things. We research them. That’s what Boland and Queen do here.)

To answer their question, B&Q had 80+ people read short paragraphs: people’s responses to a “housemate wanted” ad.

Some of the responses were error free. Some included typos: “maybe we would mkae good housemates.” Some included grammatical errors: “If your someone who likes to play tennis…”

Participants then evaluated the authors of each paragraph. They also filled out a personality survey measuring “the big five” personality traits: openness to experience, conscientiousness, extraversion, agreeableness, and neuroticism.

So, here’s the research question: did their personality traits predict their responses to grammatical errors and typos?

The Results

The answer is: a smidge.

For instance, people with higher ratings of agreeableness didn’t much care about grammatical errors. People with lower agreeableness ratings cared a bit.

How much?

Well, on average, people with lower agreeableness rated an error-free message as a ~4.2. But, they rated a message with two grammar errors as a ~4.0.

On a 7-point scale, does that 0.2 difference really matter? It was statistically significant. But, the researchers’ methodology makes it hard to evaluate the difference.

Here’s a hypothetical. When my students study using method A, they average an 80 on the unit test. When they study using method B, they average an 80.5.

Method B might be “better” in a way that’s statistically significant. But, it’s honestly not significant in the way that you and I use that word. If, for instance, method B takes 3 times as long as method A, that extra 0.5 point almost certainly wasn’t worth it.
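This gap between “statistically significant” and “practically meaningful” comes down to sample size: with enough participants, even a tiny difference clears the significance bar. A quick sketch (all numbers invented: a 0.5-point mean difference on tests with a standard deviation of 10):

```python
import math

def t_statistic(mean_diff, sd, n):
    """Two-sample t for equal-sized groups with equal SDs: diff / (sd * sqrt(2/n))."""
    return mean_diff / (sd * math.sqrt(2.0 / n))

# The same 0.5-point difference, tested with ever-larger classes
for n in (25, 100, 2500, 10000):
    t = t_statistic(0.5, 10, n)
    verdict = "significant (t > 1.96)" if t > 1.96 else "not significant"
    print(f"n = {n:5d}: t = {t:.2f} -> {verdict}")
```

The difference itself never changes; only the sample size does. That’s why a statistically significant result can still be too small to matter in a classroom.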

So too in this case. The less agreeable folks might, on average, give lower ratings. But, 0.2 points hardly seems like a big enough deal to worry about.

So, Are People Who Point Out Online Grammar Errors Jerks?

First: NO ONE POINTED OUT ANY ONLINE GRAMMAR ERRORS. It just didn’t happen.

Second: The study shows that people with a relatively low agreeableness rating feel more judgey about online grammar mistakes.

It does not show that people who comment on grammar mistakes have lower agreeableness scores.

And it certainly does not show that this particular person who just commented on a post has a low agreeableness score.

Those questions are related, but different. And, the differences really matter. Especially if you’re going to call someone a jerk.

Teaching Implications

When you see a headline like “Science Shows Group X Are Jerks,” have confidence it’s a wild overstatement.

So, when “science says” that …

“Teaching method X makes kids brilliant.”

“Cell phones make the world dumb and cruel.” (Or, “Cell phones will transform education and make classrooms perfect.”)

“This one habit will change your classroom forever.”

…follow up with the underlying research. See what the research says specifically. Decide whether or not it works for you and your students.

A Final Note

I’m honestly hoping that this article includes either a typo or a grammatical mistake. If it does, please point it out to me. I promise I won’t think you’re a jerk…

Surprise: Screen Time (Even Before Bed) Doesn’t Harm Adolescents
Andrew Watson

We’ve got lots of research on the complexity of adolescent life. And: lots of research on the importance of sleep.

We’ve also got some research showing that technology can clutter our cognitive processes. (To be clear: technology might also be fantastically useful.)

So, what happens when you put all that together and ask about technology and adolescent well-being?

Predictions

I myself would have made two predictions:

One: except at the very extreme end of screen use, I would have doubted technology time matters much for adolescent well-being. Over the years, I’ve seen plenty of studies suggesting that teens do just fine — even socially — when they’re often online.

In brief: I’ve heard lots of exaggerated concerns, but little persuasive data behind them.

Two: sleep is, of course, essential for human well-being. We can’t think or learn well without it. Heck, we can’t function very well without it.

And, we’ve got research showing that the light from screens delays melatonin onset — and therefore makes it hard to fall asleep.

For those reasons, I would have predicted that screen time before bed — especially LOTS of screen time before bed — would make life hard for adolescents.

The Findings

According to this review, I’m half right. And: not the half I was confident about.

A study that looked at more than 17,000 adolescents in the US, England, and Ireland found that technology use generally didn’t affect adolescent well-being.

(More precisely, they found that screen time accounted for less than 1% of the difference in adolescent well-being.)
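“Accounted for less than 1% of the difference” is a statement about variance explained, which is the square of the correlation coefficient. A tiny sketch of that relationship (the r values here are illustrative, not the study’s):

```python
def variance_explained(r):
    """Share of variance in one variable predicted by another: r squared."""
    return r ** 2

# A correlation of 0.10 sounds non-trivial, but explains only 1% of variance
print(f"r = 0.10 -> {variance_explained(0.10):.1%} of variance explained")
print(f"r = 0.50 -> {variance_explained(0.50):.1%} of variance explained")
```

In other words, a screen-time/well-being correlation would have to reach roughly 0.1 before it explained even 1% of the differences among teens.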

And — SURPRISE — they found that technology use before bed had no statistically significant effect.

Amazingly, even one hour of screen time produced no ill effects in this study.

What Teachers and Parents Should (and Should Not) Do

This study reconfirms the point that screen time — except extreme amounts — probably isn’t hurting teens. Even pre-bedtime screens aren’t such a big deal.

(If you’re still having trouble wrapping your head around that second point, don’t worry. I am too.)

So, what should we do?

Well, if we want to improve adolescent well-being, we should NOT focus our efforts on changing their technology habits. After all, if this study is correct, even an optimal change would improve their well-being by less than 1%.

That is: we should NOT be alarmed by the alarmists.

Instead, we should find out what really is stressing them out and focus on those problems instead.

As I find persuasive, research-based evidence to answer that question, this blog will let you know.

Strategies that Backfire: Monitoring Screen Time
Andrew Watson

Teachers and parents, reasonably enough, worry about the time that children spend looking at screens. Given the phones, tablets, phablets, laptops, and televisions that surround them, it seems normal to worry about the long-term effects of screens, screens, screens.


Monitoring screen time seems the obvious parenting strategy, and obvious teacher recommendation.

Not So Fast…

Recent research out of Canada throws doubt on this seemingly sensible approach.

Researchers surveyed parents of young children (ages 1.5-5), asking about their technology habits and parenting approaches.

Sure enough, they found that monitoring screen time correlates with an increase in the child’s technology use.

That is: when parents reward children with extra screen time, those children use more screens. Ditto parents who punish with reduced screen time. Ditto parents who simply keep track of their child’s screen time.

YIKES.

What’s a Parent to Do?

As is so often true, our behavior points the way. Parents who use screens less often in front of their children model the behavior they want to see. Result: less screen time.

This finding holds true especially for screens at mealtimes.

The best advice we’ve got so far: if you don’t want your children to obsess over their tablets, avoid monitoring screen time.

Several Caveats

First, given the survey methodology, the study can find correlation, but can’t conclude causation.

Second, the nitty-gritty gets complicated. The research team kept track of multiple variables: mothers’ behavior vs. fathers’ behavior; screen time on week days vs. screen time on weekends. To understand the specific connections, click the link above.

Third, this study focused on short-term correlations with very young children. We simply don’t know about older children. Who knows: teens forbidden from playing Minecraft more than 3 hours a day might just play less Minecraft.

Finally, I think research about bright screens before sleep is well-established enough to be worth a reminder here. Blue light from computer screens can muddle melatonin onset, and thereby interfere with sleep. In this case in particular, we should model healthy screen behavior.

Ask a Simple Question, Get an Oversimplified Answer
Andrew Watson


If learners were widgets, then educational research would be simple. The same teaching technique would work (or not work) equally well for all students.

It would also help if all teachers were widgets. And, if we all taught the same topic the same way.

We could ask simple research questions, get uncomplicated answers, and be ENTIRELY CERTAIN we were doing it right.

A Sample Case: Handwritten Notes

For example, if all students were identical, then we could know for sure the best way to take notes in class.

(It would help if teachers all taught the same way too.)

Are handwritten notes better than laptop notes? Vice versa? The study design couldn’t be simpler.

Mueller and Oppenheimer famously argue that “the pen is mightier than the keyboard.” (I’ve argued strenuously that their research does not support this claim, and probably contradicts it.)

But what if the question just can’t be answered that simply?

What if students do different things with their notes?

What if the classes in which they take notes are different?

Really, what then?

Mixing It Up

Linlin Luo and colleagues explore these questions in a recent study.

Happily, they start from the assumption that students use notes in different ways. And, that professors’ lectures include important differences.

For example: some students take notes, but don’t review them. (They probably should…but, there are LOTS of things that students probably should do. For instance, attend lectures.)

Other students do review the notes they take.

Some lectures include lots of visuals. Others don’t include many.

Once we start asking more complicated questions … that is, more realistic questions … we start getting more interesting answers.

More Interesting Answers

What did Luo and colleagues find? Unsurprisingly, they found a complex series of answers.

First: students who didn’t review their notes before a quiz did better using a laptop.

Second: students who did review their notes did better taking handwritten notes.

Third: in both cases, the differences weren’t statistically significant. That’s a fancy way of saying: we can’t say for sure that the laptop/handwriting distinction really mattered.

Fourth: unsurprisingly, students who took handwritten notes did better recording visuals than did laptop users. (Students who took laptop notes basically didn’t bother with visuals.)

Advice to Teachers and Students

What advice can we infer from this study? (And: from its analysis of previous studies?)

A: teachers can give students plausible guidance. “If you really will study these notes later, then you should take them by hand. But, if you really won’t, then use a laptop.”

B: teachers who present a lot of visuals should encourage handwritten notes. Or, make copies of those visuals available.

C: given that the differences weren’t statistically significant, we might encourage students to use the medium in which they’re more comfortable. If they (like me) have dreadful handwriting, then maybe they should use a laptop no matter what.

D: I continue to think — based on the Mueller and Oppenheimer study — that we should train students to take notes in a particular way. If they both use laptops AND reword the teacher’s ideas (rather than copying them verbatim), that combination should yield the most learning.

Most importantly, we should let this study remind us: simple answers are oversimplified answers.

If you’d like to meet two of the researchers who worked on this study, check out this video:

https://www.youtube.com/watch?v=BfCZ0K0HoJE