Professors and administrators hope the new efforts will revive the guide, which has suffered from low response rates. Last spring, 62 percent of course evaluations were at least partially filled out, according to the Office of the Registrar.
“They’re low,” said interim Dean of the College David R. Pilbeam. “They’re never going to be 100 percent.”
As professors continue to discuss ways of improving student participation, some faculty members have turned to their own means of getting feedback and have created incentives to encourage students to weigh in.
Student participation has been “pathetic,” according to James D. Wilkinson ’65, the director of the Derek Bok Center for Teaching and Learning.
Pilbeam pointed to Yale, where undergraduates are not allowed to view grades online until they fill out an evaluation, as a model of generating high participation. Response rates at the school top 80 percent, according to the Yale registrar’s office.
Professors said they find themselves unable to depend on the ratings as much as they would like.
“It was more effective in the past when the number of students who filled it out was higher,” said Robert A. Lue, executive director of undergraduate education in molecular and cellular biology. “Despite pleading e-mails...we’re only seeing 60 percent of the class. What 60 percent is that? It’s hard to interpret the results.”
Last year, Lue sidestepped the Q and created his own survey for his introductory life sciences course. The questionnaire received a 90 percent response rate, according to Lue, and the responses provided more detailed information about specific assignments, lectures, and teaching methods.
Pilbeam, who teaches in the Anthropology Department, described his own practice of using “minute questionnaires” at the end of each lecture to evaluate what students thought was best or worst about the lecture in order to extract feedback from his class.
Pilbeam said the information professors receive from the evaluations is vital.
“I gather that some students don’t realize that the faculty actually read them,” Pilbeam said.
Other professors say Q responses have grown less thoughtful since the switch from paper to online forms in 2005—a move originally aimed at boosting student responses.
“Since it has gone online, it has become less useful for the faculty because students give it less time,” said Jewish studies professor Jay M. Harris, who is also a senior adviser to Pilbeam.
Music professor Thomas F. Kelly said he wonders whether the low response rates mean that evaluations simply reflect the views of strongly opinionated students, making it difficult to gauge the overall feeling of the class.
Kelly, chair of the Committee on Pedagogical Improvement, requires students in his Literature and Arts B-51 course, “First Nights: Five Performance Premieres,” to hand in a printout confirming their completion of the Q response in order to get into the final lecture.
Because students had to write their final essay on this lecture, Kelly boasts one of the best response rates of any professor. All but two of the 205 students in the class filled out the evaluations.
Kelly said the feedback gleaned from the guides is crucial in helping professors make their courses better.
“If it weren’t for the Q guide evaluations, there wouldn’t be any other objective way for teachers to improve their teaching,” Kelly said.
THE GOOD AND THE BAD
In obtaining student reactions, both through the Q guide and their own methods, professors are able to make adjustments.
Ivy Livingston, a preceptor in the Classics Department, changed her Greek Aa textbooks and the timing of quizzes based on student responses.
Other professors have also found ways to integrate feedback into their curriculum. Tobias Ritter, whose Chemistry 30 class received a difficulty rating of 4.5 last year, said the numbers were a wake-up call.
“I realized that students have a lot of work,” Ritter said.
Ritter said that this year he tried to provide “additional help” for his students to ease the workload.
But students said they often experience the opposite effect: that so-called easy classes are actually much harder.
“There’s been classes I’ve taken that I heard were easy that got harder,” said Jana C. Berglund ’08. “And I would blame that on the Q guide.”
Some students question the Q’s usefulness, contributing to lower response rates.
Siobhan P. Connolly ’08, along with Berglund, said that because classes change from year to year, the previous year’s ratings are sometimes irrelevant.
“I definitely use the Q, but I don’t base any final decisions on it,” Connolly said.
Luca Candelori ’08 said he has never filled out a Q evaluation because he does not use the guide himself.
“I’m a math major, so I have to take some classes regardless of what people say,” he said. “And in math they change professors every year. For [Core Curriculum courses] and the rest, I ask my friends. I find that’s a little more reliable.”
—Staff writer Benjamin M. Jaffe can be reached at email@example.com.
—Staff writer Rachel A. Stark can be reached at firstname.lastname@example.org.