The Distracted Mind: Ancient Brains in a High-Tech World by Adam Gazzaley and Larry D. Rosen
Rebecca Gotlieb

People are inherently information seekers. In today’s high-tech world this tendency can draw us to distraction and keep us from accomplishing our goals. Adam Gazzaley, a neuroscientist at the University of California, San Francisco, and Larry Rosen, a psychologist at California State University, Dominguez Hills, put forward these ideas in their 2016 book The Distracted Mind: Ancient Brains in a High-Tech World. This book will help people who have wondered why they are so susceptible to distractions and interruptions, and how they can limit the adverse impact of distraction on achieving their goals.

The human ability to plan and set long-term goals distinguishes us from other species and has allowed us to achieve great things. Yet our cognitive control abilities are limited, and the gap between the goals we set and our capacity to pursue them can leave people dissatisfied. Cognitive control consists of three components: 1) attention, which directs our focus; 2) working memory, the ability to maintain and manipulate information in the short term; and 3) goal management, which allows us to pursue more than one goal at a time.

Because of our limited cognitive control abilities, people are poor multi-taskers. In fact, we cannot actually do two tasks at once; rather, both neuroscientific and psychological evidence demonstrate that people rapidly switch back and forth between tasks. Even though we are not skilled at task-switching, we are often drawn to do so because we are inherently hungry for information, and task-switching helps prevent boredom and anxiety while seeking information. Additionally, ignoring distractions—whether they arise from internal mental events or external environmental ones—is very challenging, and even when people want to disengage from a distractor it can take a long time. Indeed, even though people are generally not happy while doing it, they spend nearly 50% of their waking life mind-wandering.

Several factors impact cognitive control. These abilities peak when individuals are in their early twenties. Older adults are just as good as people in their twenties at bringing information to mind, but they are slower at suppressing irrelevant information. The quantity of information people can store in their working memory, and the accuracy with which they store it, decrease with age in adulthood. Other factors such as genetics, sleep deprivation, and drug or alcohol consumption can also affect cognitive control. There are also some clinical populations—e.g., people with ADHD, Alzheimer’s disease, and post-traumatic stress disorder—that are known to be more distractible.

Modern technologies, such as the internet, smartphones, and social media, have led to more task-switching, taxed our cognitive control abilities, and exacerbated our distracted minds. Teenagers report spending over 30% of the day multitasking. Both teens and older adults struggle to be alone with their thoughts without checking email or a phone application.

Frequent task-switching and excessive media use have adverse impacts on our lives in big and small ways. They have been associated with lower college GPAs, more alcohol and drug consumption, and even a rise in hospitalizations due to accidents. They can also hurt our relationships: the mere presence of a phone during a conversation undermines trust and empathy between conversational partners. Technology use is a major contributor to Americans’ substantial sleep deficit, and the amount of daily technology use even predicts the severity of one’s anxious, depressive, and narcissistic symptoms. Of particular concern, people are extremely poor judges of how successfully they can multi-task.

Fortunately, Gazzaley and Rosen offer several strategies for changing our brains and behaviors to reduce distractibility and increase cognitive control. Traditionally, schools have not attempted to directly improve cognitive control. Rather than asking students to memorize content, we should assess and support them in developing cognitive control abilities. Meditation, video game play, time in nature, and dedicated break times may all be ways to enhance cognitive control. There is mixed evidence about so-called “brain games” improving cognitive control. Increasingly, students are using prescription drugs, such as ADHD medications, which are unlikely to be useful for students without a clinical need. Neuroscientists are testing new ways to improve cognitive control, such as transcranial alternating current stimulation and neurofeedback. Gazzaley and Rosen state that the best way to reduce distractibility may be one of the oldest recommendations of all—getting physical exercise.

The authors argue that to improve our habits we need to recognize the costs of multitasking, design our environments so as to decrease the accessibility of technology, and accept that decreasing interference from technology may take time. Especially if a task is urgent, important, risky, or requires substantial thought, we need to resist the urge to multitask. Changing our media use habits can lead people to be more productive, healthier, happier, and more fulfilled.

 

Gazzaley, A., & Rosen, L. D. (2016). The Distracted Mind: Ancient Brains in a High-Tech World. Cambridge, MA: MIT Press.

Learning from (gulp) Video Games
Andrew Watson


Many teachers I know are baffled by the attraction of video games; some are heartily disgusted by them. (A few play them on the sly, but…ahem…no identities revealed here.)

Even if you don’t have much patience with video games yourself, you can still ask yourself this question: could they help us understand how our students learn?

After all, the many hours (and hours) that people devote to online gaming create vast quantities of data. Researchers can use those data to understand the habits that lead to the greatest improvement for the greatest number of people.

Well: researchers at Brown University have done just that. By studying two online games–Halo Reach and StarCraft 2–Jeff Huang and his intrepid crew have reached two quite helpful conclusions about this particular kind of learning.

It’s All in the Timing

If we want our students to learn a complex process, clearly practicing helps. And, presumably, more practice is better than less. No?

No. (Or, not exactly…)

Huang’s team found that the people who played the most Halo weren’t the people who improved the fastest. Instead, the players who took some time off — playing roughly once every other day, rather than every day or multiple times a day — raised their score most quickly.

If you’ve spent any time in Learning and the Brain world, you have heard about the spacing effect: practice spread out over time produces greater learning than lots of practice done all at once. (For just one example, see this article.)

Huang’s research in video games falls nicely into this pattern, but gives it an extra twist.

The spacing effect suggests that, if you’re going to play Halo 20 hours this week, you’ll improve faster if you spread those hours out than if you play them all in a row.

Huang’s research suggests that, if you want to improve quickly, you’re better off playing fewer hours with breaks in between sessions than more hours all at once.
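Huang’s finding can be illustrated with a toy forgetting-curve model. To be clear, this is my own back-of-the-envelope sketch, not anything from Huang’s paper or from the spacing-effect literature: the functional form and every number in it are made-up illustrative assumptions. The sketch just captures the qualitative pattern that sessions separated by gaps build more durable memory than the same number of sessions crammed together.

```python
import math

def retention(schedule, stability_gain=2.0, test_day=7.0):
    """Toy model of practice and forgetting (illustrative only).

    Each session multiplies the memory's 'stability'; sessions that
    follow a longer gap (i.e., after some forgetting has occurred)
    strengthen it more -- a crude stand-in for the spacing effect.
    Retention on the test day then decays exponentially with the time
    elapsed since the last session.
    """
    stability = 1.0   # arbitrary starting durability (time constant, in days)
    last = 0.0        # day of the most recent session
    for day in schedule:
        elapsed = day - last
        # Larger gaps (relative to current stability) yield a bigger boost.
        stability *= 1.0 + stability_gain * (1.0 - math.exp(-elapsed / stability))
        last = day
    return math.exp(-(test_day - last) / stability)

# Four sessions crammed into one sitting vs. four spread over a week:
massed = retention([0.0, 0.1, 0.2, 0.3])
spaced = retention([0.0, 2.0, 4.0, 6.0])
print(f"massed: {massed:.2f}, spaced: {spaced:.2f}")
```

With these made-up parameters, the spaced schedule retains far more by the day-7 test even though total practice is identical; the point is the shape of the comparison, not the particular numbers.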

In the classroom, this finding suggests that my students are better off practicing problems using the inscribed angle theorem with fewer, well-spaced problems than with more, rapid-fire problems.

It’s Also in the Warm Up

When the researchers turned their attention to StarCraft 2, they asked different questions and got usefully different answers.

In StarCraft (I’ve never played, so I’m taking the authors’ word for this), a player must control many units at the same time–sometimes issuing up to 200 commands per minute to execute effective strategy.

To simplify these demands, players can assign ‘hotkeys’ and thereby command many units with one button.

Huang’s team found that the best players used hotkeys more than others. And, even more interesting, they “warmed up” using hotkeys at the beginning of the game when they didn’t yet have many units to command.

In other words: even when they didn’t have complex cognitive work right in front of them, they were already stretching the necessary cognitive musculature to have it ready when it was needed.

This “cognitive warm up” behavior strikes me as potentially very useful. When students do very simple problems–like the early StarCraft game without many units–they can already push themselves to think about these problems in complex ways.

If it’s easy to spell the word “meet,” you might encourage your students to think of other words that have a similar sound but are spelled differently: “heat,” “wheat,” “cheat.”

If it’s easy to find the verb in a sentence (“The porcupine painted the tuba a fetching shade of puce”), students might ask themselves if that sentence has an indirect object.

In each of these cases, students can use a relatively simple cognitive task as an opportunity to warm up more complex mental operations that will be coming soon.

The Bigger Picture

While I hope these specific teaching strategies might be useful to you, I also think there’s a broader point to make:

Teaching is fantastically complicated because learning is fantastically complicated–at least, much of school learning is. For that reason, teachers can use all the wise guidance we can get–from psychologists, from neuroscientists, and…yes…from video-game players.

The Potential Perils of Google
Andrew Watson


You have heard before, and will doubtless hear again, that students don’t need to memorize facts because everything we know is available on the interwebs.

Mirjam Neelen and Paul A. Kirschner explain all the ways in which this claim is not just wrong, not just foolishly wrong, but dangerously wrong.

(The danger, of course, is that if we believe it, we’ll fail to teach our students all sorts of things they need to know.)

Students can do critical thinking if and only if they already know lots (and lots) of factual material. We don’t stifle creativity or deep thinking by teaching facts: we make creativity and deep thinking possible.

 

Classroom Note Taking: A Solution to the Technology Conundrum?
Guest Blogger


[Editor’s note: this guest blogger piece is by Cindy Gadziala, Chairperson of Theology at Fontbonne Academy in Milton, MA.]

I am a veteran teacher, and yet sometimes I feel overwhelmed by all that I am supposed to be doing in my 21st century classroom.

The “wave of the future,” instructional technology—with its one-to-one initiatives and Google platforms—offers many benefits: for example, individualized instruction, or applications that promote problem-solving skills. I have had students demonstrate their learning by creating electronic posters and comic strips. I have even sent them on a virtual archaeological dig!

But, there are days where classroom 102 becomes a battleground; and my enemy appears to be technology. As a Theology teacher I am supposed to love my enemy, but I need the best help I can get.

Enter — brain science!

Technology Problems: Working Memory and Attention

Psychology researchers are working diligently to understand how we get information “in and out” of our brains, and working memory is now understood as an essential gateway for learning.  We also know that working memory is both precious and limited. [1]

Part of our challenge in the classroom is to avoid overloading a student’s working memory, thereby causing a catastrophic failure…those glazed looks and blank stares that send a chill through the fiber of any teacher’s being!

So, teachers can employ proactive strategies to reduce the strain on working memory and facilitate learning: for example, limiting how much new information, or how many instructions, students must hold in mind at once.

And yet, paradoxically, classroom technology can sometimes require students to master new material, and to follow all sorts of instructions.

Just as it might overwhelm working memory, technology can also distract students’ attention.

For example: I often project images from my iPad to help my students focus. And yet, when the projector times out and kicks over to a screen saver, the swirling colors and images can disorient the very students whom I was helping focus.

These kinds of problems intensify all my questions about use of technology in the classroom:

  • Should I be allowing students to take notes on their laptops and tablets?
  • What happens to working memory when a student clicks a tab to go someplace else?
  • How does this affect the working memory of the student seated next to the web surfer?

While I hope that I am creating brilliantly engaging lessons to minimize such distractions, I have my limits.

Enter — “the conundrum!”

Technology Possibilities

One of the boasts of technology in the classroom has been that students can use their devices for efficient note taking. Yet the well-known Mueller and Oppenheimer study [2] suggests that laptops make note-taking too easy. Counter-intuitively, this ease reduces cognitive processing, and thereby reduces learning. Between the risk of distraction and the reduction in learning, I hear the cry go forth from teachers everywhere: Victory! Ban technological devices in the classroom!

While tempting, this is not the best response. (Remember, I am trying to love my enemy!)

I have seen kids take amazing notes on a laptop. Often, they work quite thoughtfully with information, creating their own visual representations and mind maps as they go. I do not want to take this beneficial tool away from them.  

So, my task is to teach appropriate use of technological devices, build note-taking skills and…oh, by the way…teach content: all without overwhelming my students’ working memory.  

I wanted to know: how can I make technology my ally in the classroom to accomplish all these objectives? I have found an option that may help teachers to reduce strain on working memory in class, and facilitate cognitive processing both in class and at home.  

Enter — the Rocketbook.  

Paper, Improved?

The Rocketbook is a notebook, made from acid-free, fine-grain paper with a dot-grid pattern, that combines the benefits of handwriting and technology.

Because the Rocketbook has QR codes built into its pages, students can take handwritten notes in class, and then use a cell phone app to upload notes into the cloud. (Rocketbook supports Google Drive and Evernote, for example.)

Symbols on each page can be assigned to different destination folders, and so students can upload work for multiple disciplines to distinct places in the cloud.  Once their notes are uploaded, students can re-work them into a mind map or graphic organizer.  

From a teacher’s perspective, Rocketbook’s combination of paper and technology provides many benefits:

  • I reduce the strain on working memory in class because no devices should be open when students are engaged in note taking. In this way, I also make my classroom management easier.
  • I increase their cognitive processing, because they are writing by hand.
  • I increase their touches with content, because they re-organize their notes after uploading them to the cloud.
  • I can use my LMS and Google Drive in concert to make this process part of their homework. When students provide me with a link to their uploaded notes, I can see their work from class, provide feedback on their note taking, see how they are processing and reorganizing the information, and create the opportunity to correct misinformation or redirect them to concepts they missed.

Of course, all innovations include some downsides; in this case, I sacrifice teaching my students about appropriate use of their devices in the classroom.

(A unique feature of the Rocketbook is that when the notebook is full, you can zap it in the microwave; the ink disappears and you start all over!)

Choices, Choices

While I have used the Rocketbook myself and find it both functional and cost effective (under $40.00 for pens and notebook!), there are some other interesting options available that teachers and students could use in a similar fashion. (My thanks to Learning and the Brain tech guru Scott MacClintic for these suggestions.)

First, there is the LiveScribe Echo Pen by Anoto. There are several versions of this product, and the functions increase with the price tag. (Average setup cost comes in around $200.00.) The premise here is that as you write your notes, the pen records what is being said in class. This recording allows students to sync their notes with the audio, review what was said, and expand, revise, and reorganize material from class.

While the Echo Pen’s marketing is often directed to LD students, its tagline “write less, listen more” speaks to all learners. If students are coached on how best to use the tool, hearing class again, combined with re-working the material, could reap cognitive processing benefits.

Equil’s Smartpen 2 (coming in around $160.00) does not offer the audio feature, but it does not require special ink or paper either. When students take notes with a special Bluetooth-enabled pen, those notes appear both on the paper where they write and on a Bluetooth-linked tablet. Like the Rocketbook, in other words, it converts pen-and-paper notes into a laptop version—eliminating potential distractions from websites, advertisements, and Facebook.

In Sum…

While technology offers both challenges and benefits to students and teachers, it is clear to me that there are no magic bullet solutions with technology alone.  Teachers cannot abdicate their role to technology. To use it effectively, we need to know how it affects learning and the brain.  We must be all the more deliberate in our lesson planning, classroom management, and relationship building with our students.  

We equally must inform the art of teaching with the science of the brain. When we start integrating instructional technology, brain science and good pedagogical practice, as teachers we provide truly great opportunities for student learning!  

  1. Willingham, D. (2009). Why don’t students like school? A cognitive scientist answers questions about how the mind works and what it means for the classroom. San Francisco: Jossey-Bass.
  2. Mueller, P. A., & Oppenheimer, D. M. (2014). The pen is mightier than the keyboard: Advantages of longhand over laptop note taking. Psychological Science, 25(6), 1159–1168. doi:10.1177/0956797614524581

Debate: E-Readers and Reading Comprehension
Scott MacClintic


[Editor’s note: Scott’s post is in response to this earlier article.]

Most times when I am asked about the e-reader debate, the question is not a sincere one; the asker usually already holds a strong opinion on the matter. In these moments I am reminded of the expression “when you find yourself in a hole, stop digging!”

No matter how many studies I mention or which side of the issue I am trying to argue on behalf of, as soon as I provide a brief pause, I am confronted with “yeah, but…” and then the person proceeds to tell me why his/her long-held belief is the final word on the subject.

As for where I come down on the issue, I tend to defer to people who are way smarter than me on the subject —  such as Daniel Willingham.

As Willingham concludes in his review of some of the literature on the subject, if the choice is between reading on a device and reading on paper, I believe paper is still slightly in the lead for straight-up comprehension. The problem I have is that this shift to digital is really only a lateral move, a substitution situation, and perhaps not a wise one if you want improved student comprehension!

As a teacher, I choose to incorporate technology in the design of my lessons if I believe it is going to result in noticeable and definable modification or redefinition of the learning tasks and outcomes (SAMR model). The question I ask is “what will the use of this technology allow me or my students to do that previously could not have been accomplished?” If the answer is a “not much” then I do not bother to use the technology. The technology itself should not be the focus of the lesson; student learning must be front and center.

So…”to e-reader or not to e-reader” is actually not the question that we should be asking; rather, we should be asking “does this technology add transformative value to the learning experience for my students?” If we want to go even further, we should ask “How might I measure this value and know that my students are benefiting?”

E-Readers and Reading Comprehension
Andrew Watson


The invaluable Daniel Willingham briefly reviews the literature, and concludes that — for the time being — students understand more when they read on paper than when they use e-readers.

Willingham acknowledges that his review isn’t comprehensive. However, he’s recently written a book about reading instruction, and so I suspect he’s more up-to-date than most in this field.

If he’s right, this conclusion should give pause to the many (MANY) schools that are switching to e-textbooks. I know they have advantages: they’re less expensive, more portable, and easier to modify to suit a specific teacher’s or student’s needs.

And yet, if students learn less when reading them, none of those advantages matters!

Willingham is hopeful that the quality of e-readers will improve enough to eliminate this discrepancy. Until that happens, and until we have good research showing that students can learn well from e-readers, old-fashioned books seem like the best technology we have.

(Scott MacClintic, this blog’s tech guru, will have some thoughts on this topic soon…)

To Ban or Not to Ban: A Usefully Provocative Answer
Andrew Watson


For every enthusiastic voice championing the use of laptops in classrooms, we hear equally skeptical claims. College professors, in particular, have been increasingly vocal about banning such devices to ensure that students stay focused.

James M. Lang–a professor of English who also directs the Center for Teaching Excellence at Assumption College–pushes back against such bans.

In a striking comparison, he views problems with distracted laptop users the same way he views problems with cheating.

If lots of students are cheating on a particular assignment, Lang argues, then it’s time for us to change that assignment.

So too with laptop distractions. If lots of students are browsing Facebook posts, their distraction lets us know that the current teaching method just isn’t working.

Lang’s argument implies that even if we take away the laptop, our teaching method hasn’t gotten any better.

Provocatively, this argument shifts an important responsibility from students to teachers; Lang, after all, tells us that students’ attention is as much our job as theirs.

Wisely, Lang offers specific classroom approaches to ensure that students use their laptops for good, not for ill.

“Screen Time”: Content and Context Matter
Andrew Watson


This open letter–signed by many psychologists and neuroscientists well-known to LaTB audiences–argues that current panic about “screen time” isn’t based on evidence.

The authors argue that guidelines ought to be based on clearer thinking and deeper research.

Laptops in the Classroom: The Debate Continues…
Andrew Watson


In at least this one college classroom, non-academic laptop use is inversely related to performance on the final exam.

Of course, school teachers may be better able than college professors to supervise and control students’ activities while they use computers. In other words: this study is interesting to us, but shouldn’t be the final word in the debate.

[Hat tip: Daniel Willingham]

Does Internet Use “Rewire Adolescent Brains”?
Andrew Watson


Our very own Kathryn Mills says: we’ve got a lot of anecdotes, but not a lot of evidence, suggesting that internet use is meaningfully changing — much less damaging — adolescent brains.

For example: one study that Mills cites tracked 908 adolescents for two years, and found no meaningful correlation between an increase in World Wide Web surfing and a reduction in free-time physical activity. In other words: the couch potato stereotype might exist more in TV drama than in reality.

In brief: although our teacherly instincts might warn us that the Web has drastically changed adolescent cognitive or social abilities, researchers haven’t yet found much evidence to confirm these fears.

To see Kathryn’s earlier articles for this blog, click here.