Online Discussion Boards: Beyond “All the Research Shows…” – Education & Teacher Conferences

Online Discussion Boards: Beyond “All the Research Shows…”

If your school uses a learning management system — Canvas, Blackboard, Moodle — you almost certainly have the option of creating online discussion boards for your students. A typical assignment sounds like this:

“Answer this question, and comment on at least two of your classmates’ answers.”

In theory, these online discussions offer several benefits:

  • They recreate classroom discussions, but better — all students can participate!
  • They give students a chance to think, critique, ponder — time for reflection and deeper learning.
  • They encourage students to practice writing.

Given all these potential benefits, I’m not surprised that many schools require teachers to use discussion posts. In fact, a friend of mine — who is required to use them because “all the research” says they’re beneficial — recently asked me about all that research.

Here’s what I found…

Starting Big

Often when I write a blog post, I start with one well-done study to see what it says. In this case, I started with Elicit.com — an AI platform that explores research-informed questions. I asked it to look for randomized controlled studies using active control groups for grades 6-12. I also asked it to focus on “objective measurements”: that is, I wanted to highlight what students learned in class more than how they felt about class.

Elicit identified the 25 studies it judged most relevant — although truthfully few met all the criteria I included. From my perspective, here’s the headline:

“Among studies directly comparing online discussion to active in-person instruction with objective measures, findings diverged substantially.”

More specifically, the discussion boards’ effectiveness depended on:

  • student characteristics — grade, background knowledge, socio-economic status, and so forth
  • the class topic — organic chemistry, physics, argumentative writing
  • the variable measured — correcting misconceptions, extended reasoning
  • the teachers’ training
  • the scaffolding of the discussion prompts

This sentence also jumped out at me:

“Effect sizes were larger when implementations included teacher training, curriculum integration, and structured facilitation rather than simply providing platform access.”

In brief: when I looked at a lot of studies, I found a muddle. These studies don’t exactly contradict one another; at the same time, they don’t add up to a useful or coherent set of recommendations.

Zooming In

Given that a big-picture survey didn’t provide coherent guidance, I thought I’d try a more granular approach. A conversation with ChatGPT, for instance, suggested that the number of posts a student composes doesn’t correlate with grades. This study, by Song and McNary, does reach that conclusion.

On the one hand, that’s helpful information. On the other hand, the study includes only eighteen graduate students. Graduate students are, on average, more academically motivated and successful than most people; and a sample of eighteen is tiny. For that reason, I don’t draw strong conclusions from this study — certainly not for middle- and high-school teaching.

ChatGPT also notes that the instructor’s participation in the discussion board matters. This study, by Richard Ladyshewsky, suggests that students benefit not just from the professor’s participation, but from “social presence”: a focus on building community by, say, using students’ names and sharing personal reflections.

This study includes more than eighteen students — it includes data from roughly 100. But, like the Song and McNary study, it focuses on graduate students. And — crucially — its data come from TWO professors: the one who created more “social presence,” and the one who created less. By the way, the students who experienced more social presence enjoyed the class more, but didn’t learn any more. That is: increased social presence predicted student satisfaction — not academic achievement. Here again, I don’t think we can draw strong conclusions to help teachers in general.

Settling Down

To summarize the paragraphs above: if I had found a useful set of research-informed discussion board guidelines to offer, I would happily report that finding here. Alas, I simply don’t think we have a coherent, useful body of on-point research telling us how best to manage online discussion boards … or even whether or not we should have them. Instead, we have several narrowly tailored studies which don’t add up to consistent, practical advice.

I hasten to add: this conclusion doesn’t mean that anyone has done something wrong. Researchers have looked at specific questions, gathered data, crunched numbers, and published it all. That’s what they’re supposed to do. If all that crunching doesn’t add up to good advice, that result doesn’t mean that researchers shouldn’t have done the underlying work. It means we haven’t done enough studies to find the hard-to-detect patterns that might be in there somewhere.

Instead of looking at discussion-post research, I think we should change our approach. Let’s look more generally for advice drawn from cognitive science.

For instance: working memory. We know that working memory overload brings learning to a halt. For that reason, we should carefully ensure that any discussion-board assignments don’t overload WM. We might consider:

  • How straightforward is the discussion-board technology? If the process of completing the assignment requires popping in and out of multiple multi-step threads, then students use their WM to navigate the LMS, not to think about the question.
  • How many steps do students have to follow to answer the prompt itself? Am I asking a clear enough question for students to manage on their own?
  • How much WM load does this assignment create for the teacher? Is the benefit to the student worth those extra demands?

Or: long-term memory. Do my discussion-board questions require deep processing, a.k.a. “enriched encoding”? Do they offer students a chance for retrieval practice? Is there generative learning afoot?

If I simply ask “did you like the reading?” or instruct students to “write five sentences in response to the poem,” I’m probably not requiring a rich enough exploration to merit this use of our time.

Or: motivation. Self-determination theory tells us that students are motivated by — among other things — a sense of relatedness: with the material, or each other, or the teacher. Can I create discussion-board questions that foster that kind of relatedness?

Reaching beyond cognitive science for a moment, I also want to highlight the importance of opportunity cost. My question shouldn’t be “Do discussion boards accomplish these goals?” but rather, “Do they accomplish them more effectively than the alternatives?”

To Sum Up

Learning management systems make online discussion boards possible. I don’t think we (yet) have a broad research pool showing that they’re beneficial in middle- and high-school classrooms. If we want to use them — or are required to do so — then we can draw on broad principles of cognitive science to ensure our students learn the most from these tools.


Song, L., & McNary, S. W. (2011). Understanding students’ online interaction: Analysis of discussion board postings. Journal of Interactive Online Learning, 10(1).

Ladyshewsky, R. K. (2013). Instructor presence in online courses and student satisfaction. International Journal for the Scholarship of Teaching and Learning, 7(1), n1.

