What Can We Learn from Course Evaluations?

Spoiler alert: you can’t tell how well students actually learned in your course. While feedback on course evaluations can help you understand the student experience, a recently published meta-analysis found no correlation between student evaluations of teaching (SETs) and students’ later performance — and a negative correlation once grades were controlled for. Where schools tied contract renewal to SETs, there was also evidence of grade inflation by those instructors.

Did I actually learn something, or do I just feel like I did?

Deslauriers et al. (2019) compared traditional lecture with active learning in an introductory physics course. Although students in the active sections learned more — as shown by higher performance on objective tests — they felt they had learned less. The authors argue that active learning demands more cognitive effort, which students may misread as a sign of poor learning, while smooth lectures create an illusion of learning. This mismatch suggests that student perceptions alone (e.g., course evaluations) can be misleading when judging teaching effectiveness.

The Cognitive Challenges of Effective Teaching

Chew and Cerbin propose a research-based framework of nine interacting cognitive challenges that teachers must address to promote “optimal learning” rather than merely acceptable performance. They emphasize that teaching is not just delivering content but creating the conditions in which students learn. Each of the nine challenges represents a characteristic of how students think, learn, or struggle; the idea is that failing to address any one of them can undermine learning. The authors describe each challenge, provide examples, and suggest instructional strategies for mitigating it.

Wrong answers, right learning: Using errors to deepen understanding

This systematic review examines how instructional materials that embed errors (“erroneous examples”) or juxtapose incorrect and correct solutions (“contrasting erroneous examples”) can influence student learning across a variety of domains (mathematics, medicine, science). The authors reviewed 40 studies and found that these approaches can enhance learning — especially by helping students grasp both what not to do (negative knowledge) and what to do (positive knowledge). The benefits, however, depend strongly on how the errors are used, what scaffolding (prompts, feedback) is provided, how complex the task is, and how much prior knowledge the learner brings.

Don’t Just Learn It, Apply It.

Why should you care about a 1980 study on analogies? Because it still explains why students don’t always transfer what they’ve learned to new situations—and what we can do about it.

In this classic paper, Mary Gick and Keith Holyoak showed that people often fail to apply a known solution from one context (a military story) to an analogous problem in another (a medical treatment) unless they are cued to see the connection.

New Meta-Analysis Probes Technology’s Link to Cognitive Aging

In this meta-analysis, researchers found that use of digital technologies was associated with a reduced risk of cognitive impairment and slower cognitive decline, even after controlling for demographics, socioeconomic status, and health. Of course, correlation is not causation, so other underlying factors may be at work. Still, the good news is that technology does not rot your brain! Those with the best outcomes used technology for cognitively demanding tasks rather than scrolling through social media. It’s not the tool; it’s how you use it.