“Without data, you’re just another person with an opinion.”― W. Edwards Deming
Scott MacClintic


Data Informed Instruction

Early Steps

There are a few key steps to effectively incorporating MBE (Mind, Brain & Education) ideas and concepts into one’s daily teaching routine. The first is the low-hanging fruit: educating oneself on the research about learning and the brain and on what that research suggests are effective pedagogies. If you are reading this blog, you are probably already familiar with one fantastic resource for such information (shameless plug warning!) – www.learningandthebrain.com

There are certainly plenty of resources out there, and I strongly encourage you to seek them out. This first step is critical, and it has become easier in the last few years as more of the actual research is available online and more has been written with teachers as the target audience.

The second key step involves actually trying something new in your classroom, whether it is using more retrieval practice exercises [1], incorporating movement [2] or perhaps shifting to a more student-centered model for class discussions [3].

Quantum Leaps

But wait, your work is not done! Trying something new based on the conclusions of a research paper you read is certainly a big step, but how will you know that the change you made was effective? What is your evidence that the change actually improved student learning? THIS is the difficult part.

Analyzing the impact or effect of a new pedagogy is quite complex and requires the collection and analysis of data. While you may not be able to perform a double-blind controlled experiment–the gold standard in scientific research–you CAN analyze the impact of your intervention and use data to inform your teaching practice going forward.

So how do you collect data that can help you improve your teaching practice?

I have found that one of the most useful tools for collecting data is one of the easiest to set up and use, but is frequently one of the least likely to be used by teachers – videotaping your class.

Watching a videotape of your class and objectively analyzing it for evidence of improved learning can be extremely illuminating and humbling.

  • Did I really only give Hermione 2 seconds of wait time before I moved on to Draco?
  • Were the students really trying to take notes, listen to me deliver content and participate in the conversation simultaneously?
  • Did I really shake my head in disapproval as Luke responded to my question with an answer that was way off base?

I have yet to find a teacher who enjoys watching himself or herself on video, but I have found that most teachers who actually go through with it find the experience incredibly informative. Watching your video with a trusted colleague or a Critical Friends group can be even more thought-provoking and lead to fruitful conversations about teaching and learning.

Data 2.0

I have been playing around with an exciting new tool for data collection lately that has the potential to make the time-consuming analysis of videotape seem like a thing of the past. The app does a deep dive into an audio recording from class and provides me with nearly immediate data to analyze.

Here’s how it works: at the beginning of class I start an audio recording of the class on my phone, and I hit stop when the class is over. In the current iteration of the app, I upload the audio file to be analyzed, and within an hour or so I receive a report back on the class. Right now, the report includes data in five-minute increments on the following (a rough sketch of this kind of analysis appears after the list):

  • My talking speed (words per minute)
  • How many questions I ask
  • The types of questions I ask – How? vs. Why? vs. What?
  • Percentage of the time that I was talking vs. the students were talking
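
Purely as an illustration (and not the app itself, which works on raw audio), here is a minimal Python sketch of how a report like this could be computed if you already had a speaker-labeled transcript with timestamps. The transcript format, speaker labels, and sample utterances are all hypothetical.

from collections import Counter

# Hypothetical diarized transcript: (speaker, start_sec, end_sec, text).
utterances = [
    ("teacher", 0.0, 8.0, "Why do you think the cell needs a membrane?"),
    ("student", 8.0, 20.0, "Maybe to control what gets in and out."),
    ("teacher", 20.0, 26.0, "What is that process called?"),
]

BLOCK_SEC = 300  # report in five-minute increments

def classroom_report(utterances):
    blocks = {}
    for speaker, start, end, text in utterances:
        stats = blocks.setdefault(int(start // BLOCK_SEC),
                                  {"teacher_sec": 0.0, "student_sec": 0.0,
                                   "teacher_words": 0, "questions": Counter()})
        words = len(text.split())
        if speaker == "teacher":
            stats["teacher_sec"] += end - start
            stats["teacher_words"] += words
            if text.rstrip().endswith("?"):
                first_word = text.split()[0].strip("?,.").lower()
                kind = first_word if first_word in ("how", "why", "what") else "other"
                stats["questions"][kind] += 1
        else:
            stats["student_sec"] += end - start
    for block, s in sorted(blocks.items()):
        total = s["teacher_sec"] + s["student_sec"]
        wpm = s["teacher_words"] / (s["teacher_sec"] / 60) if s["teacher_sec"] else 0
        teacher_share = 100 * s["teacher_sec"] / total if total else 0
        print(f"Minutes {block * 5}-{block * 5 + 5}: "
              f"teacher talking {teacher_share:.0f}% of the time, "
              f"{wpm:.0f} words per minute, "
              f"questions asked: {dict(s['questions'])}")

classroom_report(utterances)

Run over a full class transcript, this would produce one line of summary statistics per five-minute block.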

Questions that I have been able to think more critically about with this data include:

  • Was my student-centered class discussion really as student-centered as I thought?
  • Am I asking questions that require surface level knowledge (“what”) or ones that will lead to more critical thinking on the part of my students (“why,” “how,” “if”)?
  • Am I speaking too fast when giving instructions as I set up an activity?

The app is still in its development phase, and there are bugs to be worked out before it will be available to a wider audience; but if you are interested in participating in the pilot, you can sign up here. Of all the data collection tools out there, I think this app has the potential to be an incredibly valuable one for teachers as they attempt to evaluate the impact of changes in their practice.

For all of its potential uses, I do realize that there are potential dangers with the collection of this type of data. Who initiates the collection of the data? What if an evaluator or administrator wants to use the data? What are the privacy concerns around collecting this type of data? Who has access to the files and data?

All of these questions are important ones that need to be fleshed out, to be sure; however, I believe that, if properly used, this app has the potential to be a powerful tool for teachers who want to use data to inform their teaching as they incorporate new strategies and pedagogies.

  1. http://www.retrievalpractice.org/
  2. Donna Wilson, Move your body, grow your brain, March 12, 2014 [link]
  3. Goldschmidt, M., Scharfenberg, F. J., & Bogner, F. X. (2016). Instructional efficiency of different discussion approaches in an outreach laboratory: Teacher-guided versus student-centered. The Journal of Educational Research, 109(1), 27-36. [link]

 

Interactive Teaching at Harvard
Andrew Watson


Harvard’s Initiative for Teaching and Learning has posted videos of their most recent conference. The topic: interactivity.

As you listen to these Harvard professors, you might find yourself thinking: their students, and their teaching problems, sound a lot like my students and my teaching problems.

Pro tip: each video begins with a very generous introduction. If you skip ahead 3-5 minutes, you’ll get to the good stuff much more quickly…

It Ain’t What You Know, It’s…Oh, No, Sorry, It IS What You Know
Ian Kelleher


I sense that the tide is beginning to turn in the knowledge-versus-skills debate, ‘21st Century’ or otherwise. There is an increasingly confident voice shouting a phrase that teachers have shouted for the few thousand years that there have been teachers: knowledge is really important.

Yes, even in this Googleable world, knowledge is important. We could patiently wait for the “importance of knowledge” pendulum to swing back, or we could, as evidence-informed professionals, boldly provide an epistemic nudge [1].

This post is a concise argument for the importance of knowledge, and it offers some research-informed ideas for teachers on how to build it.

I recently heard Robert Pondiscio, senior fellow at the Thomas B. Fordham Institute, give a wonderful talk at ResearchED DC on the importance of a recommitment to teaching knowledge. During his talk, Pondiscio eloquently painted the picture of President Obama during his first Inaugural Address, glancing down the length of the Mall to the Lincoln Memorial, where Martin Luther King Jr. said those famous words not that long ago.

And then Obama delivered the words in this clip. It was a powerful moment in American history. And Pondiscio posed the question: what knowledge would children need in order to understand the significance of these words at this moment? Would they have this knowledge? How would they have gotten it? Who might have it and who might not? How does this fit into the existing inequality gap? Pondiscio’s questions offer a fascinating thought experiment on the importance of knowledge.

Acknowledge the limits of active working memory

Active working memory can hold fewer things for less time than most people realize. Though it is hard to measure, seven things for 30 seconds is a well-agreed-upon estimate for adults [2]. For children, the numbers are lower. And there is a trade-off, too – we can hold more things, but for progressively less time.

How do these limitations fit my argument?

Having knowledge stored in long-term memory frees up active working memory to help more effectively with higher-order thinking tasks. In other words, having stored knowledge helps us think.

Even project-based learning needs knowledge

What about approaches like project-based learning (PBL), the antithesis of the “lecture, lecture, test” mode of teaching? How important is it to be very purposeful in teaching knowledge when we want students to be on a voyage of independent exploration? It turns out that explicitly teaching knowledge in very deliberate ways is extremely important for PBL: make-or-break important, in fact.

I will dig deeply into the deficiencies of PBL at a later date. But the crux of the research-supported argument is that for project-based learning to have any measure of success, independent inquiry needs to be balanced with didactic instruction. Without foundational information, students lack sufficient knowledge and skills to engage with the task.

In fact, failing to provide adequate support for knowledge and skills may actually contribute to the achievement gap, as students from disadvantaged backgrounds often enter school with deficiencies in knowledge and skills that are necessary for success in the project [3, 4, 5].

Part of pedagogical content knowledge, that highly interlinked combination of subject knowledge and knowledge of how to teach it, is knowing exactly what knowledge scaffolding students need in order to successfully launch into a project. So if we want to create great projects, which we do, we also need to be great at teaching knowledge – and great at discerning what that knowledge needs to be.

Teaching for stickiness

No matter where on the spectrum from direct-instruction-focused to project-focused we happen to be teaching, we need to get content knowledge to stick reliably in long-term memory. Fortunately, there is robust research to guide us here. It suggests both things we should do and things we should not do.

Things Not To Do

(1) rereading notes

A trip down the aisles of Staples in August confirms what we already know – students love highlighters. But research suggests that the staple of studying, rereading notes or the textbook, is a terrible way to study. It tends to lead to what Brown, Roediger and McDaniel call “the illusion of fluency” [6], where students become so familiar with the text that they believe they know it before they actually do.


(2) misusing flashcards

Similarly, students tend to use flashcards in entirely the wrong way – which is hard to do with such a simple device. They tend to turn the card over too quickly to see the answer. The key part is the moment before you turn it over: lingering in that moment of not knowing. Flashcards work best when students ponder difficult questions, even when the answers prove elusive.

Things To Do

(1) retrieval practice

Retrieval practice is the act of trying to recall knowledge from memory. Even if a student cannot recall it, research suggests that the act of trying helps memory storage and later recall. Retrieval practice can take many forms: self-testing, proper use of flashcards or online tools such as Quizlet, or taking a sheet of paper and writing out everything you know about a subject.

But I am sure you can be creative and add to this list. The key is having students try deeply to recall, then having them check this against their notes or model answers.
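
If your students study on a laptop or phone anyway, a tiny script can enforce that moment of lingering before the answer appears. Here is a minimal sketch of the idea; the cards themselves are placeholders, and the “did you recall it?” self-report is only as honest as the student.

# Minimal flashcard-style retrieval practice: the answer stays hidden until the
# student has attempted a recall. The questions and answers are placeholders.
cards = [
    ("What is retrieval practice?",
     "Trying to recall knowledge from memory rather than rereading it."),
    ("Why does spacing out study sessions help?",
     "A little forgetting between sessions makes recall harder, which strengthens memory."),
]

def self_test(cards):
    recalled = 0
    for question, answer in cards:
        print("\nQ:", question)
        input("Say or write your answer first, then press Enter to reveal it... ")
        print("A:", answer)
        if input("Did you recall it correctly? (y/n) ").strip().lower().startswith("y"):
            recalled += 1
    print(f"\nRecalled {recalled} of {len(cards)} cards. Revisit the misses soon.")

if __name__ == "__main__":
    self_test(cards)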

(2) spacing

There is great research around the spacing effect. That is, students should study, leave a gap, then study again. We can, for example, coach students to space their studying rather than use massed studying. Massed studying does not lead to durable learning.

Instead, allowing your memory to get a bit rusty between study sessions makes the next study session more challenging. In doing so, it helps create knowledge that is both more durable and more flexible. This is a concept that Clark and Bjork call “desirable difficulty” [7].

But what is the optimal spacing gap for your students, your subject, and the content you are teaching? This is a great idea for you to play with and do your own disciplined inquiry. (You might check out Scott MacClintic’s forthcoming article on gathering classroom data for suggestions.)
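
As one concrete way to run such an inquiry, here is a minimal sketch that turns whichever gaps you decide to test into calendar dates for revisiting a topic. The default gaps of 2, 7, 21, and 60 days are arbitrary starting points for experimentation, not research-prescribed values.

from datetime import date, timedelta

def review_schedule(first_studied, gaps_in_days=(2, 7, 21, 60)):
    """Return the dates on which a topic should be revisited, given spacing gaps."""
    schedule, current = [], first_studied
    for gap in gaps_in_days:
        current += timedelta(days=gap)
        schedule.append(current)
    return schedule

# Example: a topic first taught on September 5 gets four spaced reviews.
for review_date in review_schedule(date(2017, 9, 5)):
    print(review_date.isoformat())

Comparing quiz performance across topics scheduled with different gap patterns would give you your own spacing data.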

(3) formative assessments

Replace pop quizzes with no- or low-stakes formative assessments. As you give these quizzes, say something along the lines of, “this is for you to figure out where you are, for me to figure out where you are, and for us both to adjust what we do accordingly.” This technique is retrieval practice plus. A further benefit is that more of the brain restructuring associated with learning occurs when we struggle and when we get things wrong [8, 9]. Getting things wrong is an important part of learning, and we need to craft no- or low-stress opportunities for this to happen.

(4) interleaving

Interleaving is a way to deliberately build the spacing effect into how you design your courses. Instead of starting the year with unit one, followed, perhaps, by unit two then unit three, there is an alternative way to organize things that will promote learning. After moving on to a new unit, plan on revisiting the core knowledge at least a couple more times at spaced intervals later on [10].
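
To make the idea concrete, here is a toy sketch of what such an interleaved outline might look like; the unit names and the choice to revisit the two most recent earlier units are hypothetical.

# Toy interleaved course outline: after each new unit, briefly revisit the core
# knowledge of recent earlier units. Unit names and the lookback are placeholders.
units = ["Unit 1", "Unit 2", "Unit 3", "Unit 4"]

def interleaved_outline(units, lookback=2):
    outline = []
    for i, unit in enumerate(units):
        outline.append(f"Teach {unit}")
        for earlier in units[max(0, i - lookback):i]:
            outline.append(f"  Brief spaced review of {earlier}")
    return outline

print("\n".join(interleaved_outline(units)))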

(5) pre-testing

“Research suggests that starting a unit of study with a pre-test helps create more enduring learning. It appears to give students something on which to hang subsequent information. This test should, of course, not be graded, or if it is, it should be graded for effort rather than correctness.

The other point of this pre-test is to give the teacher an idea of where the level of the class generally is, and what knowledge each individual student brings with them already, so that the teacher can tailor subsequent classes to best match the needs of the class. It is important to avoid seeding boredom, and to avoid the potential skipping of foundational knowledge that could prevent future learning. These are two common toxic effects on learning” [11].

A thought on how these suggestions link to assessment

Ever since I was a little kid, I have enjoyed words. Some are more fun to play with than others, of course, but one of the best is ‘facile.’ We often use it to refer to someone who appears so good at something that they do it with effortless ease. But its more nuanced meaning refers to a demonstration of thinking that at first glance seems neat, concise, and elegant, but which on closer inspection is neat, concise, and elegant only because it is oversimplistic, itself lacking in nuanced detail.

So this article, I believe, leads us to a future one that needs to be written: how do we avoid facile demonstrations of knowledge by our students? How do we craft assessments that steer students away from this? Or, as Rob Coe and David Didau put it, where will students think hard in this lesson? But in the time before this second article is written, I encourage you to explore this idea yourself. And if you have ideas as to what should go in such an article, please let us know.

 

  1. Thank you, Troy Dahlke, for this playful term
  2. Cowan, N. (2008). What are the differences between long-term, short-term, and working memory? Progress in Brain Research, 169, 323-338. [link]
  3. Education Endowment Foundation Analysis [link]
  4. Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86. [link]
  5. Kirschner, P. A., & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational psychologist, 48(3), 169-183. [link]
  6. Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make it stick: The science of successful learning. Cambridge: The Belknap Press of Harvard University Press.
  7. Clark, C. M., & Bjork, R. A. (2014). When and why introducing difficulties and errors can enhance instruction. In V. A. Benassi, C. E. Overson, & C. M. Hakala (Eds.), Applying the science of learning in education: Infusing psychological science into the curriculum. [link]
  8. See this accessible research summary from Robert Bjork at UCLA
  9. Moser, J. S., Schroder, H. S., Heeter, C., Moran, T. P., & Lee, Y. H. (2011). Mind your errors: Evidence for a neural mechanism linking growth mind-set to adaptive posterror adjustments. Psychological Science, 22(12), 1484-1489. [link]
  10. Blasiman, R. N. (2016). Distributed concept reviews improve exam performance. Teaching of Psychology, 44(1), 46-50. [link]
  11. Whitman, G. and Kelleher, I. (2016). Neuroteach: Brain science and the future of education. Lanham: Rowman & Littlefield.

To Ban or Not to Ban: A Usefully Provocative Answer
Andrew Watson


For every enthusiastic voice championing the use of laptops in classrooms, we hear equally skeptical claims. College professors, in particular, have been increasingly vocal about banning laptops to eliminate distractions and ensure that students stay focused.

James M. Lang–a professor of English who also directs the Center for Teaching Excellence at Assumption College–pushes back against such bans.

In a striking comparison, he views problems with distracted laptop users the same way he views problems with cheating.

If lots of students are cheating on a particular assignment, Lang argues, then it’s time for us to change that assignment.

So too with laptop distractions. If lots of students are browsing Facebook posts, their distraction lets us know that the current teaching method just isn’t working.

Lang’s argument implies that even if we take away the laptop, our teaching method hasn’t gotten any better.

Provocatively, this argument shifts an important responsibility from students to teachers; Lang, after all, tells us that students’ attention is as much our job as theirs.

Wisely, Lang offers specific classroom approaches to ensure that students use their laptops for good, not for ill.

Laptops in the Classroom: The Debate Continues…
Andrew Watson


In at least this one college classroom, non-academic laptop use is inversely related to performance on the final exam.

Of course, schoolteachers may be better able than college professors to supervise and guide our students’ activities while they use computers. In other words: this study is interesting to us, but it shouldn’t be the final word in the debate.

[Hat tip: Daniel Willingham]

17 Ways to Fold Sheep
Andrew Watson


Here’s a mental puzzle to start off your day:

Imagine you’ve got 17 sheep and four pens to put them in. Just for fun, you decide to put an odd number of sheep in each pen. How would you proceed?

As it turns out, this is quite a difficult problem. You might be inclined to tell me it’s impossible. The secret is…well, I won’t tell you the secret just yet. (Don’t look now, but there are some solutions down below.)

Your ability to solve this problem might depend on internal, mental characteristics. For example: more creative people typically find a solution more rapidly than less creative people.

At the same time, your ability – and, crucially, your students’ ability – might well depend on the external, physical actions used to solve the problem.

If you give your students a tablet on which they can write, draw, and erase, the chance that they’ll find a solution remains low. However, if you give them pipe-cleaner pens and little plastic sheep, the odds get a lot better.

In one study by Frédéric Vallée-Tourangeau [1], 0% of college students who used the tablet figured out the solution, whereas 43% of those who used the pipe-cleaners and sheeplets did so. (In a slightly different research paradigm, 17% of tablet users found solutions, vs. 54% of model builders who did.)

That is: manipulating meaningful objects increased the likelihood of success.

*          *          *          *          *

In recent years, researchers have increasingly focused on the topic of embodied cognition: the influence that our bodies (not just our brains) have on our thinking.

Susan Goldin-Meadow and Sian Beilock, for example, have studied the role that gestures play in cognition [2]. In one of their studies, a particular set of gestures helped some students learn math problems more effectively. (Intriguingly, students who said the wrong words but made the right gestures tended to learn more quickly than other students.)

Beilock’s recent book How the Body Knows its Mind: The Surprising Power of the Physical Environment to Influence How You Think and Feel offers a substantial introduction to this fascinating topic.

Vallée-Tourangeau’s just-published research – both the “17 Sheep” problem, and another study into mental math [3] – fits nicely under the heading of embodied cognition. After all, students who use their bodies a particular way think more effectively than students who use their bodies a different way.

*          *          *          *          *

What practical teaching advice flows from these insights?

First, we should recognize that this research is in very early stages, and specific teaching strategies haven’t yet been tested. At this point, we’re making plausible extrapolations, not relying on well-tested hypotheses. (Unless, that is, you’re teaching students how to fold sheep creatively.)

Second, this research pool encourages teachers to translate problems into objects both for step-by-step routines and for problems that require new insight.

Step-by-step routines: Vallée-Tourangeau’s mental math study shows that students who could move tiles around as they added digits in their head accomplished this task much more effectively than those who were forbidden from moving their hands.

Mental addition is – for most college students – quite a routine cognitive task. And yet, by combining bodily movement with cognitive efforts, students noticeably improved their performance.

Problems that require new insight: The solution to the “17 Sheep” problem requires a sudden AHA!, a flash of insight: the sheep pens might overlap with each other.

[Image: “17 Animals” solution]

When Vallée-Tourangeau’s students thought about the “17 Sheep” problem in two dimensions, they had very little luck. When they thought about that same problem in three dimensions, however, that extra dimension prompted new – and successful – thought patterns. That is: physical objects made new insights easier to uncover.

This study suggests that we can help our students leap to surprising new ways of thinking by inviting them to move physical objects around.

Of course, the specifics of this suggestion have yet to be researched. They will doubtless depend on the subject you’re teaching, the students you’re teaching, and your own comfort with this kind of inventive extrapolation.

Despite these uncertainties, these researchers offer us exciting new approaches for teaching both basic procedures and complex insights.

Our students may well benefit from such strategies, and from our own classroom experiments.

 

  1. Vallée-Tourangeau, F., Steffensen, S. V., Vallée-Tourangeau, G., & Sirota, M. (2016). Insight with hands and things. Acta Psychologica, 170, 195-205. [Link]
  2. Goldin-Meadow, S., & Beilock, S. L. (2010). Action’s influence on thought: The case of gesture. Perspectives on Psychological Science, 5(6), 664-674. [Link]
  3. Vallée-Tourangeau, F., Sirota, M., & Vallée-Tourangeau, G. (2016). Interactivity mitigates the impact of working memory depletion on mental arithmetic performance. Cognitive Research: Principles and Implications, 1(1), 26. [Link]

Click Here: The Technology of Retrieval Practice in the Classroom
Scott MacClintic


Back in the dark ages, when I was just cutting my teaching teeth, we teachers might have asked our students to review for an upcoming test by rereading the chapter and their notes from class. With the benefit of psychology research, we now know that another strategy will be more effective.

Rather than have students reread the chapter or their notes, we might instead encourage them to outline the content from memory. This approach–called “the testing effect,” “active recall,” or even “blank page review”–leads to substantial increases in long-term memory formation. (That’s psychologist-speak for “learning.”)

The efficacy of this form of retrieval practice is supported by a wealth of research and has been shown to be a powerful strategy for long-term learning.(1) The benefits have been shown in a variety of environments, over a wide range of student ages, and across many disciplines.(2, 3, 4)

One of the nice things about the testing effect is that it can easily be integrated into a study routine or a class lesson plan. Students can employ the strategy on their own without anything more sophisticated than a blank piece of paper. Teachers can incorporate blank-page review or frequent low-stakes or no-stakes quizzes into their courses as a way to leverage the power of retrieval practice.

Are there other effective ways to incorporate retrieval practice into the classroom using current technology, not only to enhance long-term learning but also to provide formative assessment data for both the student and the teacher? The simple answer is YES!

“High-tech” version

Student response systems–commonly known as “clickers”–are a fantastic way to engage students in the process of retrieval practice; they also provide both teacher and student with valuable formative assessment data. Several strategies for effective use of clickers will enhance students’ learning.

  1. Make sure that the questions are not too easy.
  2. Be sure to include the most common wrong answers as options.
  3. After the initial polling is complete, take advantage of the different student answers to generate discussion and debate about the topic. Insist that students make a convincing argument as to why their choice is the best answer.

After initial polling on a question, I often project the results for my students to see. Depending on the spread of answers, I follow up with one of these questions:

“Can somebody make a case for why their answer is the best choice?”

“Can somebody make a case for why their answer is a better choice than the one that was just proposed?”

“What do you need to know/remember in order to answer this particular question?”

When there is no clear consensus and a wide range of answers is selected, I usually go in a different direction.

“Take a minute at your table (typically, 3-4 students) or with the person next to you to discuss your initial answer and come to a consensus. In a minute, we will re-poll on the same question.”

After a brief period of discussion and re-polling, fewer answers tend to be chosen. I can then solicit an argument for one answer or another.
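
If you use one of the web-based polling tools mentioned below and can export the responses, a few lines of code can summarize the spread and suggest whether a re-poll is warranted. This is only a sketch: the sample responses and the 60% consensus threshold are hypothetical choices, not features of any particular clicker system.

from collections import Counter

def summarize_poll(responses, consensus_threshold=0.6):
    """Print the spread of answers and suggest a next step."""
    counts = Counter(responses)
    total = len(responses)
    for option, n in counts.most_common():
        print(f"  {option}: {n}/{total} ({100 * n / total:.0f}%)")
    top_share = counts.most_common(1)[0][1] / total
    if top_share < consensus_threshold:
        return "no clear consensus: discuss in small groups, then re-poll"
    return "clear favorite: ask someone to make the case for it"

first_poll = ["A", "B", "B", "C", "D", "A", "B", "C"]  # hypothetical class of eight
print("First poll:")
print("Next step:", summarize_poll(first_poll))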

Don’t be afraid to include some vague wording, or to have more than one answer be correct depending on how the question is interpreted. A little intentional confusion and healthy debate/discussion can be a powerful way to incorporate an additional desirable difficulty into the mix.(5)

The feedback that occurs during the post-polling discussion and analysis is not only beneficial for correcting erroneous answers; it also helps with long-term retention of correct answers in which students were not initially confident.(6) Both of these factors lead to greater long-term retention, as well as strengthened metacognitive skills for the students. A win-win!

If you do not have clickers at your disposal, you have several web-based alternatives for collecting student responses. Poll Everywhere, Socrative, Google Forms, and Kahoot! are just a few of the options available to teachers.

As a word of warning, there are some potential downsides and caveats that you need to consider when using student response systems. First and foremost is that no matter how much you plan ahead, you can count on the technology not working flawlessly every time. Who among us has not experienced the joy of having the projector bulb blow out just as you are about to project something on the board?

Another factor to consider is the time required. It takes longer to cover the same ground using this retrieval practice strategy. I would argue that the time is well worth it for the students, but the reality is that it will take more of your valuable class time.

“Low-tech” version

If you do not have a set of clickers or enough electronic devices in your classroom, you can still take advantage of this technique. Personal whiteboards, paddles, or even different-colored note cards let individual students or groups of students vote for various possible answers. Any method that allows you to canvass different student responses and then generate discussion and debate about those answers will work just as well.

Regardless of the technique used, the power of retrieval practice and feedback for long-term learning is undeniable and should be an arrow in your pedagogical quiver.

References:

  1. Roediger, H. L., & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210. Link
  2. Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331(6018), 772-775. Link
  3. Karpicke, J. D., & Roediger, H. L. (2008). The critical importance of retrieval for learning. Science, 319(5865), 966-968. Link
  4. Agarwal, P. K., Bain, P. M., & Chamberlain, R. W. (2012). The value of applied research: Retrieval practice improves classroom learning and recommendations from a teacher, a principal, and a scientist. Educational Psychology Review, 24(3), 437-448. Link
  5. Overoye, A. L., & Storm, B. C. (2015). Harnessing the power of uncertainty to enhance learning. Translational Issues in Psychological Science, 1(2), 140-148. Link
  6. Butler, A. C., Karpicke, J. D., & Roediger, H. L. (2008). Correcting a metacognitive error: Feedback increases retention of low-confidence correct responses. Journal of Experimental Psychology: Learning, Memory, and Cognition, 34(4), 918-928. Link