Many people who offer teaching advice cite psychology and neuroscience research to support their arguments.
If you don’t have time to read that underlying research — or even the expertise to evaluate its nuances — how can you know whom to trust?
My advice comes in the form of a paradox: You should be likelier to TRUST people who tell you to DOUBT them.
I thought of this paradox last week when reading a blogger’s thoughts on Jeffrey Bowers. Here’s the third paragraph of the blogger’s post:
I am a researcher working in the field of cognitive load theory. I am also a teacher, a parent and a blogger with a lot of experience of ideological resistance to phonics teaching and some experience of how reading is taught in the wild. All of these incline me towards the systematic teaching of phonics. I am aware that Bowers’ paper will be used by phonics sceptics to bolster their argument and that predisposes me to find fault in it. Bear that in mind.
In this paragraph, the blogger gives the reader some background on his position in an ongoing argument.
He does not claim to read Bowers’s work as an above-the-fray, omniscient deity.
Instead, he comes to this post with perspectives — let’s just say it: biases — that shape his response to Bowers’s research.
And he explicitly urges his reader to “bear [those biases] in mind.”
Of course, in the world of science, “bias” doesn’t need to have a negative connotation. We all have perspectives/biases.
By reminding you of these perspectives — that is, his limitations — the blogger gives you reasons to doubt his opinion.
And my argument is: because he reminded you to doubt him, you should be willing to trust him a little bit more.
The blogger here is Greg Ashman, who writes a blog entitled Filling the Pail. Lots of people disagree with Ashman quite vehemently, and he disagrees right back.
My point in this case is not to endorse his opinions. (I never write about reading instruction, because it’s so complicated and I don’t know enough about it to have a developed opinion.)
But, anyone who highlights his own limitations and knowledge gaps in an argument gets my respect.
Over on Twitter, a professor recently tweeted out a quotation from the executive summary of a review. (The specific topic isn’t important for the argument I’m making.)
Soon after, he tweeted this:
“When I tweeted out [X’s] new review of [Y] a few days ago, I pulled a non-representative quote from the exec summary.
It seemed to criticize [Y] by saying [Z] … [However, Z is] not the key criticism in the review. Here I’ve clipped more serious concerns.”
He then posted four substantive passages highlighting the review’s core critiques of [Y].
In other words, this professor told you “I BLEW IT. I created an incorrect impression of the review’s objections.”
You know what I’m about to say now. Because this professor highlighted a reason you should doubt him — he blew it — you should trust him more.
We all make mistakes. Unlike many of us (?), this professor admitted the mistake publicly, and then corrected it at length.
In this case, the professor is Daniel Willingham — one of the most important scholars working to translate cognitive psychology for classroom teachers.
He’s written a book on the subject of skepticism: When Can You Trust the Experts? So, it’s entirely in character for Willingham to correct his mistake.
But even if you didn’t know he’d written such a book, you would still have reason to trust him, because he highlighted the reasons you should not.
Look for thinkers who highlight the limitations of the research. Who acknowledge their own biases and gaps in understanding. Who admit the strengths of opposing viewpoints.
If you hear from someone who is ENTIRELY CERTAIN that ALL THE RESEARCH shows THIS PSYCHOLOGICAL PRINCIPLE WORKS FOR ALL STUDENTS IN ALL CLASSROOMS — their lack of self-doubt should result in your lack of trust.