Collaboration without Learning

Active learning approaches frequently promote student conversations about the content. As students try to explain things to each other, argue about answers, and ask questions, learning happens. We can hear it and see it. It’s why we teach.



Teaching Professor Blog

An interesting study of student conversations over clicker questions was motivated by what researchers were hearing faculty say as they started using clicker questions (James & Willoughby, 2011). The faculty “invariably imagined idealized productive conversations wherein students would systematically discuss various question alternatives in light of their own prior knowledge and experience” (p. 123). As the researchers worked with faculty on implementing clicker questions, they started recording some of the student conversations. What they heard students say justified a more thorough analysis. They ended up recording 361 student conversations with the overarching goal of offering faculty insights into what students actually talked about when they discussed the clicker question and answer options.

Some students had the kind of productive conversations the teachers imagined: 38 percent of them discussed at least one of the multiple-choice alternatives, and the answer they finally selected represented the ideas they had discussed. But 62 percent did not have these kinds of exchanges. The researchers provide a typology of these derailed conversations, illustrating them with sample exchanges over specific clicker questions. Understanding some of their analysis requires knowledge of the astronomy content used in the questions, but some of what they heard has broad implications.

“We found that more than one-third of the conversations did not include any meaningful exchange of student ideas,” the authors wrote (p. 131). In some of these conversations, students simply asserted that a particular answer was correct. They didn’t cite any evidence or offer any viable reasons but instead said things like an answer “sounded good.” In other conversations, students reached a consensus with no real discussion. Someone in the group proposed an answer, offered a justification—sometimes not a very good one—and everyone else agreed. From the recordings, the researchers could not tell whether students went along because they didn’t know the answer, because they didn’t care, or because they didn’t feel comfortable offering another possibility. And in some conversations, students didn’t even try. They announced that they didn’t know the answer and had no idea how to figure it out, and then took a wild guess.

Another category of off-target answers involved student ideas that weren’t among the answer options. These were mistakes or misunderstandings the instructors who wrote the questions hadn’t anticipated students would make. Clicker systems that only aggregate answer selections cannot surface these alternative student ideas, and the same is true of any student discussion the teacher doesn’t hear.

Are there lessons to take from this analysis? I think so. For starters, if we want students to have the kinds of conversations that promote learning, we can’t assume those conversations are the automatic outcome of collaboration. Most of us who’ve had students discuss problems in groups know that, even though we’d like to think otherwise. A lot of our students still don’t know how to carry on an intellectual conversation. They don’t understand the value of sharing ideas, considering options, and evaluating answers—the back-and-forth exchanges that increase understanding and lead to the right answer. Teachers can start cultivating that awareness simply, as the researchers did in this study, by developing a set of guidelines—in this case, guidelines that outline ways to discuss problems and possible answers.

This study’s findings also revealed that how the clicker questions were graded influenced student discussions. If more credit was awarded for correct answers than for incorrect ones, students were more likely to be passive and to select an answer proposed by someone else, even if they did not agree with that answer. When clicker questions earned credit regardless of their correctness, there was less passivity and more discussion of answer alternatives. These results offer yet another endorsement of low-stakes grading options.

As for those conversations in which students offer alternative ideas—some of which may be brilliant (though most are not)—teachers need to hear those ideas. Teaching improves when a teacher understands student perspectives. And students are more willing to share their ideas when teachers respond respectfully, even while constructively confronting an interesting but incorrect answer.

Active learning powerfully promotes learning, but it doesn’t work magically.


James, M. C., & Willoughby, S. (2011). Listening to student conversations during clicker questions: What you have not heard might surprise you! American Journal of Physics, 79(1), 123–132. https://doi.org/10.1119/1.3488097