Does Music Promote Students’ Creativity?
Andrew Watson

If we want our students to think creatively, should they listen to music? If yes, does the timing matter?

Intuition might lead us either to a “yes” or to a “no.”

Yes: music might get students’ creative juices flowing. Especially if it’s upbeat, energetic, and particularly creative in itself, music might spark parallel creativity in our students’ thought processes.

No: on the other hand, music just might be a serious distraction. Students might focus so keenly on the music — or on trying to ignore the music — that they can’t focus on the creative work before them.

Do You Smell a CRAT?

Researcher Emma Threadgold used a common creativity test – with the unlikely acronym of CRAT – to answer this question.

Here’s how a CRAT works:

I give you three words: “dress,” “dial,” and “flower.”

You have to think of another word that – when combined with each of those words – produces a real word or phrase.

To solve a CRAT, you have to rifle through your word bank and try all sorts of combinations before – AHA! – you pull the correct answer up from the depths of your brain.

In this case, the correct answer is “sun”: as in, sundress, sundial, and sunflower.
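If you like seeing the mechanics spelled out, here's a minimal sketch of the brute-force search a CRAT demands, written in Python, with tiny made-up word lists standing in for your mental lexicon and dictionary.

```python
# A toy version of the CRAT search: try each candidate word against all
# three cue words, and keep candidates that form a real compound each time.
# The candidate list and "dictionary" here are made up for illustration.

cues = ["dress", "dial", "flower"]
candidates = ["rain", "moon", "water", "sun"]
real_compounds = {"sundress", "sundial", "sunflower", "raincoat", "moonflower"}

def solves_crat(candidate, cues, dictionary):
    """True only if candidate + every cue appears in the dictionary."""
    return all(candidate + cue in dictionary for cue in cues)

print([c for c in candidates if solves_crat(c, cues, real_compounds)])  # ['sun']
```

Your brain, of course, runs this search fuzzily and in parallel rather than word by word; the point is simply how much trial and error the puzzle demands.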

The Results Are In

Threadgold and her team tested this creativity question several times, in order to explore several variables.

They played music with English lyrics, with foreign lyrics, and with no lyrics. They played upbeat, happy music.

They even played library noise – with the sound of a photocopier thrown in for good measure.

In every case, music made it harder to solve CRAT problems.

To put that in stark terms: music interfered with listeners’ creative thinking.

(For those of you interested in statistics, the Cohen’s d values here are astonishing. In one of the three studies, the difference between music and no music clocked in at d = 2.86. That’s easily the highest d value I’ve seen in a psychology study. We’re typically impressed by a value above 0.67.)
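If you’d like to see where a number like that comes from, here’s a minimal sketch of the standard Cohen’s d calculation. The means, standard deviations, and group sizes below are invented for illustration; they are not Threadgold’s actual numbers.

```python
from math import sqrt

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Difference between group means, divided by the pooled standard deviation."""
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Invented numbers: a quiet group solves 7 CRAT items on average, a music
# group solves 3, both with SD = 1.4 and 30 participants per group.
print(round(cohens_d(7, 1.4, 30, 3, 1.4, 30), 2))  # 2.86
```

The bigger the gap between groups relative to the spread within them, the bigger d gets; a gap nearly three times the typical spread, as in this toy example, is enormous by psychology’s standards.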

Case Closed?

Having done such an admirably thorough study, has Threadgold’s team answered this question for good?

Nope.

As always, teachers should look not for one definitive study, but for several findings that point in the same direction.

And, we should also look for boundary conditions. This research might hold up for these particular circumstances. But: what other circumstances might apply?

For me, one obvious answer stands out: timing.

Other researchers have studied creativity by playing music before the creative task, not during it.

For instance, this study by Schellenberg found that upbeat music produces higher degrees of creativity in Canadian undergraduates AND in Japanese five-year-olds. (Unsurprisingly, the five-year-olds were especially creative after they sang songs themselves.)

In this study, crucially, they listened to the music before, not during, the task.

Threadgold’s study, in fact, cites other work where pre-test music enhanced creativity as well.

More Questions

Doubtless you can think of other related questions worth exploring.

Do people who learn to play music evince higher degrees of creativity in other tasks?

How about courses in music composition?

Music improvisation training?

Does this effect vary by age, by culture, by the kind of music being played?

For the time being, based on what I know about human attention systems, this study persuades me that playing music during the creative task is likely to be distracting.

Depending on what you want your students to do, you might investigate other essential variables.

__________________

On a related topic: for Dan Willingham’s thoughts on listening to music while studying, click here.

Taking Notes with Graphic Organizers
Andrew Watson

We’ve blogged (quite energetically) about the difference between handwritten and laptop notes.

Of course, other note-taking differences merit investigation as well.

For example: if students take handwritten notes, is it better to give them:

1. a complete lecture outline,

2. a partial lecture outline,

3. a bare-bones lecture outline,

4. a complete graphic organizer,

5. a partial graphic organizer, or

6. an empty one?

Over at the Learning Scientists, Carolina Kuepper-Tetzel explores this question, and adds some thoughts of her own.

One Man’s Experience

This article particularly caught my eye because it applies so directly to my own work.

When I talk with teachers, students, or parents about brains, I always provide them with option #5 above: an incomplete graphic organizer.

My goal: reduce working memory load. (I’m always focused on reducing extraneous working memory load.)

The informal feedback I get is strongly positive. Many teachers, in fact, tell me that they’ve started using the same form with their own students.

When you read Dr. Kuepper-Tetzel’s post, you’ll see how well (if at all) my practice accords with the research we have.

Do Collaborative Projects Reduce or Increase Working Memory Stress?
Andrew Watson

Should teachers ask students to work on projects in teams?

This question generates a great deal of heat.

Many education thinkers advocate for the benefits of teamwork. Others insist that learning happens one brain at a time, and so should not be cluttered with interference from other brains.

Working Memory: Blessings and Curses

Working memory allows humans to hold and reorganize facts and ideas in temporary mental storage.

When you do a word problem, you must decide which parts should be translated into an equation. (Those decisions take WM.) You have to recall the appropriate equation to use. (Ditto.) And, you must plug the correct data into the correct formula before you can arrive at an answer. (Re-ditto.)

Composing a new sentence in a foreign language? Lots of working memory demands.

Comparing Paul Laurence Dunbar’s poetry with that of Countee Cullen? Yup.

Learning how to tell time? Once again – lots of working memory involved.

In other words, WM allows students to do practically everything that we want them to do in school.

And yet, this working memory blessing co-exists with a powerful curse: we just don’t have very much of it.

You probably can alphabetize five days of the work week. You probably can’t alphabetize 10 months of the year. The first task lies within WM limits; alas, the second goes way beyond them.

Collaboration’s WM Dangers

In a recent article, Paul Kirschner and others consider the WM benefits and perils of group work.

(These scholars, especially John Sweller, have elaborated “cognitive load theory” to explain the relationship between long-term memory, WM, and the external world of perception and experience. See here for a review.)

One important peril: the working memory demands created by collaboration. When students work together, they have to negotiate roles. They must create joint mental models. They have to schedule and prioritize and debate.

All these “musts” take up precious working memory space. The result might be that students get better at negotiating, modeling, and prioritizing. But, the WM devoted to those tasks might make it harder for them to learn the content at the heart of the project.

Of course: you might reasonably want your students to focus on the social-emotional skills. But, if you wanted them to focus on Shakespeare or Boyle’s law, then the project might not produce the results you hoped for.

Collaboration’s WM Benefits

At the same time, Kirschner & Co. also see working memory upsides to collaboration.

A particular cognitive task might include quite stiff WM demands. If the group includes members with the right kinds of background knowledge, then the WM chores can be divided up and managed more effectively.

Student A carries this part of the WM load.

Student B carries that part.

Student C takes care of the tricky last bit.

In this way, the WM whole can be greater than the sum of the parts.

In other words: if teachers can organize group projects so that a) the WM difficulties of collaboration remain low, and b) the benefits of sharing WM burdens remain high, then such collaboration can truly help students learn.

Putting It Together

Kirschner’s article concludes with a list of key variables for teachers to track: task complexity, domain expertise, team size, and so forth.

Be aware that cognitive load theory gets a little jargony, and you’ll need some time to learn the lingo before the article makes sense.

However, if you can devote that time, I think you’ll benefit from its practical suggestions and its helpful frameworks for planning students’ collaborative learning.

Praising Researchers, Despite Our Disagreements
Andrew Watson

This blog often critiques the hype around “brain training.” Whether it’s Lumosity or Tom Brady’s “brain speed” promises, we’ve seen time and again that these claims just don’t hold water.

Although I stand behind these critiques, I do want to pause and praise the determined researchers working in this field.

Although, as far as I can see, we just don’t have good research suggesting that brain training works*, it will be an AWESOME accomplishment if it someday comes to pass.

A Case In Point

I’ve just read a study that pursues this hypothesis: perhaps brain training doesn’t succeed because the training paradigms we’ve studied do only one thing.

So, a program to improve working memory might include cognitively demanding exercises, but nothing else. Or, brain stimulation, but nothing else. Or, physical exercise, but nothing else.

What would happen if you combined all three?

To test this question, Ward & Co. ran a remarkably complex study including 518 participants in 5 different research conditions. Some did cognitive exercises. Some also did physical exercises. And some also added neural stimulation.

The study even included TWO control groups.

And, each group participated in dozens of sessions of these trainings.

No matter the results, you have to be impressed with the determination (and organization) that goes into such a complex project.

Okay, but What Were The Results?

Sadly, not much. This study didn’t find that training results transferred to new tasks — which is the main reason we’d care about positive findings in the first place.

We might be inclined to think that the study “didn’t succeed.” That conclusion, however, misses the bigger point. The researchers pursued an entirely plausible hypothesis…and found that their evidence didn’t support it.

That is: they learned something highly useful, that other researchers might draw on in their own work.

Someday — we fervently hope — researchers will find the right combination to succeed in this task. Those who do so will have relied heavily on all the seemingly unsuccessful attempts that preceded them.

__________

* To be clear: the phrase “brain training” means “training core cognitive capacities, like working memory.”

From a different perspective, teaching itself is a form of brain training. When we teach our students, brains that once could not do something now can do that something.

Brains change all the time. “Brain training” aims for something grander. And, we haven’t yet figured out how to do it.

Andrew Watson

In a blog post, David Didau raises concerns about “the problem with teachers’ judgment.”

Here goes:

If a brain expert offers me a teaching suggestion, I might respond: “Well, I know my students, and that technique just wouldn’t work with them.”

Alas, this rebuttal simply removes me from the realm of scientific discussion.

Scientific research functions only when a claim can be disproven. Yet the claim “I know my students better than you do” can’t be disproven.

Safe in this “I know my students” fortress, I can resist all outside guidance.

As Didau writes:

If, in the face of contradictory evidence, we [teachers] make the claim that a particular practice ‘works for me and my students’, then we are in danger of adopting an unfalsifiable position. We are free to define ‘works’ however we please.

It’s important to note: Didau isn’t arguing with a straw man. He’s responding to a tweet in which a former teacher proudly announces: “I taught 20 years without evidence or research…I chose to listen to my students.”

(Didau’s original post is a few years old; he recently linked to it to rebut this teacher’s bluff boast.)

Beware Teachers’ Judgment, Part 2

In their excellent book Understanding How We Learn, the Learning Scientists Yana Weinstein and Megan Sumeracki make a related pair of arguments.

They perceive in teachers “a huge distrust of any information that comes ‘from above’” … and “a preference for relying on [teachers’] intuitions” (p. 22).

And yet, as they note,

There are two major problems that arise from a reliance on intuition.

The first is that our intuitions can lead us to pick the wrong learning strategies.

Second, once we land on a learning strategy, we tend to seek out “evidence” that favors the strategy we have picked. (p. 23)

Weinstein and Sumeracki cite lots of data supporting these concerns.

For instance, college students believe that rereading a textbook leads to more learning than does retrieval practice — even when their own experience shows the opposite.

The Problems with the Problem

I myself certainly agree that teachers should listen to guidance from psychology and neuroscience. Heck: I’ve spent more than 10 years making such research a part of my own teaching, and helping others do so too.

And yet, I worry that this perspective overstates its case.

Why? Because as I see it, we absolutely must rely on teachers’ judgment — and even intuition. Quite literally, we have no other choice. (I’m an English teacher. When I write “literally,” I mean literally.)

At a minimum, I see three ways that teachers’ judgments must be a cornerstone in teacher-researcher conversations.

Judgment #1: Context Always Matters

Researchers arrive at specific findings. And yet, the context in which we teach a) always matters, and b) almost never matches the context in which the research was done.

And therefore, we must rely on teachers’ judgments to translate the specific finding to our specific context.

For example: the estimable Nate Kornell has shown that the spacing effect applies to study with flashcards. In his research, students learned more by studying 1 pile of 20 flashcards than 4 piles of 5 flashcards. The bigger pile spaced out practice of specific flashcards, and thus yielded more learning.
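To see concretely why the bigger pile spaces practice further apart, here’s a minimal sketch. It assumes a student simply cycles through each pile front to back, which is a simplification of mine, not a description of Kornell’s exact procedure.

```python
# Toy comparison: study 20 flashcards as one pile of 20, or as four piles
# of 5 (each pile cycled separately). With the same total number of views,
# how many other views separate repetitions of any single card?

def one_big_pile(cards, passes):
    """Cycle through the whole deck, front to back, several times."""
    return cards * passes

def several_small_piles(cards, pile_size, passes):
    """Split the deck into piles and cycle each pile separately."""
    sequence = []
    for start in range(0, len(cards), pile_size):
        sequence.extend(cards[start:start + pile_size] * passes)
    return sequence

def gap_between_repetitions(sequence, card):
    positions = [i for i, c in enumerate(sequence) if c == card]
    return positions[1] - positions[0]

cards = list(range(20))  # 20 flashcards, labeled 0-19
big = one_big_pile(cards, passes=4)
small = several_small_piles(cards, pile_size=5, passes=4)

print(gap_between_repetitions(big, card=0))    # 20 views between repetitions
print(gap_between_repetitions(small, card=0))  # only 5 views between repetitions
```

Same cards, same total number of views; the bigger pile simply builds in four times as much spacing for every card.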

So, clearly, we should always tell our students to study with decks of 20 flashcards.

No, we should not.

Kornell’s study showed that college students reviewing pairs of words learned more from 20-flashcard piles than 5-flashcard piles. But, I don’t teach college students. And: my students simply NEVER learn word pairs.

So: I think Kornell’s research gives us useful general guidance. Relatively large flashcard decks will probably result in more learning than relatively small ones. But, “relatively large” and “relatively small” will vary.

Doubtless, 2nd graders will want smaller decks than 9th graders.

Complex definitions will benefit from smaller decks than simple ones.

Flashcards with important historical dates can be studied in larger piles than flashcards with lengthy descriptions.

In every case, we have to rely on … yes … teachers’ judgments to translate a broad research principle to the specific classroom context.

Judgment #2: Combining Variables

Research works by isolating variables. Classrooms work by combining variables.

Who can best combine findings from various fields? Teachers.

So: we know from psychology research that interleaving improves learning.

We also know from psychology research that working memory overload impedes learning.

Let’s put those findings together and ask: at what point does too much interleaving lead to working memory overload?

It will be simply impossible for researchers to explore all possible combinations of interleaving within all levels of working memory challenge.

The best we can do: tell teachers about the benefits of interleaving, warn them about the dangers of WM overload – and let them use their judgment to find the right combination.

Judgment #3: Resolving Disputes

Some research findings point consistently in one direction. But, many research fields leave plenty of room for doubt, confusion, and contradiction.

For example: the field of retrieval practice is (seemingly) rock solid. We’ve got all sorts of research showing its effectiveness. I tell teachers and students about its benefits all the time.

And yet, we still don’t understand its boundary conditions well.

As I wrote last week, we do know that RP improves memory of specifically tested facts and processes. But we don’t know if it improves memory of facts and processes adjacent to the ones that got tested.

This study says it does. This one says it doesn’t.

So: what should the teacher do right now, before we get a consistent research answer? We should hear about the current research, and then use our best judgment.

One Final Point

People who don’t want to rely on teacherly judgment might respond thus: “well, teachers have to be willing to listen to research, and to make changes to their practice based upon it.”

For example, that teacher who boasted about ignoring research is no model for our work.

I heartily – EMPHATICALLY – agree with that point of view.

At the same time, I ask this question: “why would teachers listen to research-based guidance if those offering it routinely belittle our judgment in the first place?”

If we start by telling teachers that their judgment is not to be trusted, we can’t be surprised that they respond with “a huge distrust of any information that comes ‘from above’.”

So, here’s my suggestion: the field of Mind, Brain, Education should emphasize equal partnership.

Teachers: listen respectfully to relevant psychology and neuroscience research. Be willing to make changes to your practice based upon it.

Psychology and neuroscience researchers: listen respectfully to teachers’ experience. Be up front about the limits of your knowledge and its applicability.

Made wiser by these many points of view, we can all trust each other to do our best within our fields of expertise.

Andrew Watson

My friend Cindy Nebel has a thoughtful post about a recent article at TES.

Here’s the backstory: a world-famous geneticist has dismissed research into Mindset as “bullshit” and “gimmicks.”

Now, reasonable people have their doubts about Mindset Theory. We’ve written about such doubts before.

But, as Nebel emphasizes in her post, wholesale rejection of the theory simply doesn’t make sense. For instance:

Disciplines Matter…

Geneticists know a lot about genetics. And, genes matter for teaching and learning.

(How much do they matter? A BIG controversy…)

But: most geneticists remember that psychology research is complicated. Knowledge and skill in one field don’t automatically translate to knowledge and skill in another.

In other words: psychologists will — most likely — have better insights into the strengths and weaknesses of psychology debates than will rocket scientists, or nuclear submariners, or even geneticists.

This point, of course, extends to other kinds of cross-disciplinary critiques. Here’s Nebel on the intersection of neuroscience and education:

A common misconception that we hear is that education and neuroscience are related disciplines and that those who study the brain must know how we learn.

While one can inform the other, I promise that training in neuroscience does NOT include an understanding of how those brain processes translate into classroom practices.

We often talk about a very necessary dialogue between educators and researchers, because very few individuals have extensive experience in both domains.

For all these reasons, neuroscientists (and psychologists) can provide teachers with useful perspectives. But, only teachers can decide what makes the most sense in the classroom.

…but Cost Doesn’t Matter

One of the stranger parts of the TES interview: Plomin’s insistence that only expensive changes benefit education.

“To think there is some simple cheap little thing that is going to make everybody fine, it is crazy,” he said in exclusive remarks published today.

“Good interventions are the most expensive and intensive.”

Pish posh.

If you’ve spent any time at a Learning and the Brain conference, you know that teachers can make all sorts of highly effective changes to their teaching at no cost whatsoever.

Using retrieval practice instead of simple review: no cost.

Managing students’ working memory load by…say…spreading instructions out over time: no cost.

Moderating students’ alertness levels by having them move: no cost.

Anyone who says we can’t improve teaching and learning without spending lots of money simply doesn’t understand teaching, learning, or the promise of educational psychology.

Good News! Contradictory Research on Desirable Difficulties…
Andrew Watson

As we regularly emphasize here on the blog, attempts to recall information benefit learning.

That is: students might study by reviewing material. Or, they might study with practice tests. (Or flashcards. Perhaps Quizlet.)

Researchers call this technique “retrieval practice,” and we’ve got piles of research showing its effectiveness.

Learning Untested Material?

How far do the benefits of this technique go?

For instance, let’s imagine my students read a passage on famous author friendships during the Harlem Renaissance. Then they take a test on its key names, dates, and concepts.

We know that retrieval practice helps with facts (names and dates) and concepts.

But: does retrieval practice help with the names, dates, and concepts that didn’t appear on the practice test?

For instance, my practice test on Harlem Renaissance literature might include this question: “Zora Neale Hurston befriended which famous Harlem poet?”

That practice question will (probably) help my students do well on this test question: “Langston Hughes often corresponded with which well-known Harlem Renaissance novelist?”

After all, the friendship between Hurston and Hughes was retrieved on the practice test, and therefore specifically recalled.

But: will that question help students remember that…say…Carl van Vechten took famous photos of poet and novelist Countee Cullen?

After all, that relationship was in the unit, but NOT specifically tested.

So, what are the limits of retrieval practice benefits?

Everything You Wanted to Know about Acid Reflux

Kevin Eva and Co. have explored this question, and found encouraging results.

In his study, Eva asked pharmacology students to study a PowerPoint deck about acid reflux and peptic ulcers: just the sort of information pharmacists need to know. In fact, this PPT deck would be taught later in the course – so students were getting a useful head start.

Half of them spent 30 minutes reviewing the deck. The other half spent 20 minutes reviewing, and 10 minutes taking a practice test.

Who remembered the information better 2 weeks later?

Sure enough: the students who took the practice test. And, crucially, they remembered more information on which they had been tested AND other information from the PPT that hadn’t been specifically tested.

That is, they would be likelier to remember information about Zora Neale Hurston and Langston Hughes (the tested info) AND about van Vechten and Cullen (the UNtested info).

However, Eva’s tested students did NOT remember more general pharmacology info than their untested peers. In other words: retrieval practice helped with locally related information, but not with the entire discipline.

But Wait! There’s More! (Or, Less…)

About two months ago, I posted on the same topic – looking at a study by Cindy Nebel (née Wooldridge) and Co.

You may recall that they reached the opposite finding. That is, in their research paradigm, retrieval practice helped students remember the information they retrieved, and only the information they retrieved.

Whereas retrieval practice helped students on later tests if the questions were basically the same, it didn’t have that effect if the questions were merely “topically related.”

For instance, a biology quiz question about “the fossil record” didn’t help students learn about “genetic differences,” even though both questions focus on the topic of “evolution.”

What Went Wrong?

If two psychology studies looked at (basically) the same question and got (basically) opposite answers, what went wrong?

Here’s a potentially surprising answer: nothing.

In science research, we often find contradictory results when we first start looking at questions.

We make progress in this field NOT by doing one study and concluding we know the answer, but by doing multiple (slightly different) studies and seeing what patterns emerge.

Only after we’ve got many data points can we draw strong conclusions.

In other words: the fact that we’ve got conflicting evidence isn’t bad news. It shows that the system is working as it should.

OK, but What Should Teachers Do?

Until we get those many data points, how should teachers use retrieval practice most effectively?

As is so often the case, we have to adapt research to our own teaching context.

If we want to ensure that our students learn a particular fact or concept or process, we should be sure to include it directly in our retrieval practice. In this case, we do have lots (and LOTS) of data points showing that this approach works. We can use this technique with great confidence.

Depending on how adventurous we feel, we might also use retrieval practice to enhance learning of topically related material. We’re on thinner ice here, so we shouldn’t do it with core content.

But, our own experiments  might lead us to useful conclusions. Perhaps we’ll find that…

Older students can use RP this way better than younger students, or

The technique works for factual learning better than procedural learning, or

Math yes, history no.

In brief: we can both follow retrieval practice research and contribute to it.

Is Your Classroom Worth More Than $10,000?
Andrew Watson

Here’s a remarkable story about potentially falsified research data.

The short version: researchers James Heathers and Nick Brown thought that Nicolas Guéguen’s research findings were both too sexy and too tidy.

Too sexy: Guéguen’s findings regularly made great headlines. For instance, his research purports to show that men voluntarily assist women who wear their hair loose much more than those who wear it in a ponytail or a bun.

Talk about your gender-roles clickbait!

Too tidy: As Heathers and Brown considered Guéguen’s math, they realized that his numbers were…too round. When Guéguen calculated averages, he had to divide by 30 — because his study groups had 30 people in them.

But, his results often ended in improbably round figures:  1.60, 1.80, 2.80.

Mathematically speaking, that’s possible. But, when you’re dividing by 30, it’s wildly unlikely.
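To make the arithmetic concrete, here’s an illustrative check of my own (not Heathers and Brown’s actual analysis). A mean of 30 whole-number ratings, the sum divided by 30, comes out “tidy” at one or two decimal places only when the sum happens to be divisible by 3.

```python
# Illustration only: how often does the mean of 30 whole-number ratings
# come out "tidy" (exact at two decimal places, like 1.60 or 2.80)?
from fractions import Fraction

def is_tidy(total, n=30, decimals=2):
    """True if total/n is exact when written with `decimals` decimal places."""
    return (Fraction(total, n) * 10**decimals).denominator == 1

possible_sums = range(30, 151)  # e.g. 30 ratings, each from 1 to 5
tidy = [s for s in possible_sums if is_tidy(s)]
print(f"{len(tidy)} of {len(possible_sums)} possible sums give a tidy mean")
# -> 41 of 121: roughly one in three
```

Any single tidy average is unremarkable. A long string of them, across many separate studies, is what made Heathers and Brown suspicious.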

Heathers and Brown have now spent quite a long time — years, in fact — trying to get Guéguen to share his data and/or answer their questions.

As far as I know, they haven’t gotten answers yet. (Here’s a somewhat more recent article.)

What Teachers Should Do

First, keep in mind that scientists are people too. While most strive honorably to follow procedures and discover meaningful findings, a few will always cheat the system and abuse our trust.

That’s true in any profession. It’s true in psychology and neuroscience research. (It might even be true of teachers.)

Second, let this knowledge energize your skepticism.

Here’s my suggestion: never make changes to your classroom work without asking hard questions.

Think about it this way: if someone wanted to sell you something for $10,000, you’d need lots of confirmation. You wouldn’t take the seller’s word that the product was worth $10k. You’d investigate on your own.

For instance, ask…

Have other researchers gotten similar findings?

How aggressively have they tried to disprove them?

Is the suggestion strongly confirming a bias that everyone already holds?

So: your classroom is worth more than $10,000. You wouldn’t buy an expensive product without asking hard questions. Don’t adopt research suggestions without asking hard questions.

In the meanwhile, I’m going to put my hair up in a bun.


Healthy Snacks After Exercise? Depends on the Timing…
Andrew Watson

If you saw Roy Baumeister at the 2015 Learning and the Brain conference in Boston, you remember his presentation on self-control.

Of course, teachers care A LOT about self-control.

We need our students to control their behavior. (“Do not use the Bunsen burner to light your backpack on fire,” my 6th grade science teacher said to me. Often.)

And, we need them to control their cognitive processes. (“When balancing chemical equations, start by identifying the elements.”)

Baumeister found, among other fascinating things, that both kinds of self-control “drain the same reservoir.”

That is: if I use up some self-control resisting the temptation to climb the jungle gym, I have less self-control left over to process the steps involved in subtracting two-digit numbers. (Baumeister’s book Willpower, written with John Tierney, explains his research in helpful detail.)

Replication Controversy

As the field of psychology wrestles with the “replication crisis,” Baumeister’s conclusions have come under question.

Some researchers haven’t gotten the same results when they run self-control experiments. Some question the research field in general. (For instance: terms like “self-control” are notoriously hard to define.)

This question matters to us. If Baumeister’s theories don’t hold water, then it’s unlikely the self-control solutions he proposes will help very much.

So, to take only the most recent example, John Medina’s Attack of the Teenage Brain devotes several chapters to helping adolescents develop executive functions — such as self-control.

If the research that Medina cites can’t be trusted…we might be back to square one.

Latest News

I’ve just found some pertinent research in an unlikely field: exercise and nutrition.

Researcher Christopher Gustafson and Co. asked visitors at a local gym to wear an accelerometer, purportedly so they could “keep track of relevant exercise data.” As a reward for participating, they were given a free snack after their workout.

In fact, the “accelerometer data” story masked the real interest of the study: participants’ snack choice.

All participants chose between a brownie and an apple. Some got the choice before they exercised; some after. Did the timing of the choice matter?

If Baumeister’s theory holds up, we would expect a difference between these two groups. Here’s why…

Self-Control, Snacks, and Exercise

Because apples are a healthier snack than brownies, we know we ought to choose them. But, for most of us, brownies taste a lot better. And so, we must use self-control to make that choice.

Likewise, we know that exercise is good for us. But, we rarely want to do it — and so that choice also takes self-control.

If I make the snack choice before exercise, my self-control reservoir remains relatively full. As a result, I’m likelier to make the “right” choice.

But, if I select my snack as I towel off after exercise, I’ve probably drained that reservoir considerably. So, I’ve got less willpower left. And I’m likelier to give in to chocolatey temptation.

Is that what Gustafson found? Indeed he did.

In fact, compared with those who chose before exercising, 17% fewer people chose the apple and 6% more chose the brownie. (The rest turned down a snack altogether.)

In other words: this study supports Baumeister, and gives us increased confidence in the research suggestions that flow from it.

What are some of those suggestions? You can start with an intriguing one here.

Welcome to San Francisco
Andrew Watson

If you’re a regular blog reader, you just might be a frequent Learning and the Brain conference attendee. (I got my start in 2007, and haven’t stopped since.)

We’re gathering — starting tomorrow! — at the Fairmont Hotel in San Francisco to discuss Educating with Empathy: cultivating kindness, compassion, empathy, and good behavior.

Many of the speakers have featured recently on the blog. John Medina’s most recent book has shown up at least twice.

Rebecca Gotleib — this blog’s book reviewer — will be presenting on the power of teens’ social-emotional skills.

And I’ll be offering a pre-conference session: “Understanding Adolescence, Teaching Adolescents.” We’ll be talking about teenage cognition, emotion — and even technology use.

I’ve enjoyed getting to know many of you over my blogging years. I hope you’ll stop by and introduce yourselves! I’m easy to find: I look a lot like the guy on the right…