The Daily Gamecock

Since switch to online system, fewer course evaluations completed

Administrators consider how to improve response rates


As the semester draws to a close, course evaluation emails have started pouring into inboxes, and Helen Doerpinghaus is worried that fewer students are filling the surveys out.

Since the university has largely moved to online evaluations, Doerpinghaus, the vice provost and dean of undergraduate studies, has seen survey response rates fall. So far, response rates have dropped between 2 and 20 percentage points.

The effect has been most dramatic in the School of Music. Last fall, the first time the school used online surveys, just over half of them were completed, according to numbers compiled by the school. In the spring, the rate fell to 40 percent, and this semester, just under a quarter had been finished as of Tuesday morning.

Doerpinghaus attributed the trend to an online system that doesn’t give students much of an incentive to finish their surveys or set time aside for them. When paper surveys were the norm, professors often gave students class time to finish them.

But the drop hasn’t been as dramatic in the College of Education or the School of Library and Information Science, where response rates have stayed relatively high. Doerpinghaus thinks that offers some insight into the root of the trend.

In the College of Education, she said, students understand the importance of feedback in teaching classes, and the School of Library and Information Science is generally more wired in — its classes are more likely to use technology than those in the School of Music are.

As Doerpinghaus sees it, the first step in increasing the response rates falls on the faculty.

They need to remind their classes that surveys are anonymous, tell them what to expect on the forms, try offering some sort of incentive — pizza for a 100-percent completion rate, she suggested — and sell them on the importance of filling evaluations out, she said. She intends to send an email to all faculty asking them to emphasize those points with their classes.

And she thinks doing so is important. Evaluation results are read by department chairs, mentioned in annual reviews and often shape how classes are taught — that’s why she thinks the dropping response rates are problematic.

“That’s terrible,” Doerpinghaus said. “For one thing, faculty aren’t getting feedback, but for another thing, when professors go up for tenure and promotion, they have to be able to demonstrate that they’re good teachers ... When [students] don’t say anything, they’re hurting the good ones.”

So far, the bulk of USC’s efforts to reverse the trend has consisted of outreach to faculty, but Doerpinghaus has thought about making more substantial changes to how the university handles evaluations, too.

For now, schools and departments have control over how they distribute them. Some classes — including a smattering in English, history and foreign language, among others — still use paper forms, and there’s no single time period when the surveys go out.

Doerpinghaus has considered standardizing that process throughout the university, reducing the number of emails departments send out and publishing some evaluation results.

Though faculty decide what questions are asked, some like “Would you recommend this class to a friend?” are uniform, and she thinks putting those results out in the open could help students pick their classes and show a broader picture than websites like RateMyProfessors.com do.

“When I was teaching, two students weighed in (on Rate My Professors) — one loved me and one hated me,” Doerpinghaus said. “That’s exactly what you would predict, because unless you have a strong feeling, why would you go out there? Whereas on a course evaluation ... you get a better read for the class.”
