Author: Diana Uyder

Surviving Negative Course Evaluations
Just in time to thwart any attempt at unwinding and enjoying a well-deserved break from another brutal academic year, the automated results of the Student Course Satisfaction Surveys (aka evaluations) arrive in my inbox demanding attention. I know better than to open these evaluations, of course, and I tell myself, every semester, that I won’t succumb to my own curiosity. But every semester I click the link and descend into the hell that is reading (mostly untrue) student feedback about the 16 weeks of demanding work I just finished with them. Certainly, some comments are complimentary, but in these days of social media rants and the false sense of security and anonymity provided by their laptops, today’s students are more apt than ever to use online evaluations to criticize their professors rather than reflect on their own academic purpose or performance. And much of the criticism volleyed our way is untrue. A study of 240 participants conducted by the University of Northern Iowa revealed that approximately one third of students admitted to lying on faculty/course evaluations (Anas, 2011). And because faculty and course evaluations are used to justify annual review ratings, salary increases, and promotion activities, the university’s use of this information is especially problematic.

Reading the feedback is bittersweet: Some students express gratitude and joy resulting from their newfound knowledge and skills, while others insolently and inaccurately record creative recollections of information contained in the syllabus and course. As examples, I offer some of the inaccurate comments appearing in my most recent crop of evaluations, along with the realities surrounding each:

“This course required too many presentations and I’d rather learn from my professor than my peers. I could buy my peers coffee and learn from them for far less than it costs for me to pay for college.” Truth: One presentation was required in this teacher preparation methods course, part of an entire graduate-level degree designed to prepare students to present material to students all day long. Somehow the goals of the course and program were lost on this student.

“Creating projects for this class was too expensive.” Truth: Since this is, again, a teacher preparation program being offered through a reputable university, course-related expenses are to be expected. Further, since these students are preparing to enter a profession widely known for its dependency on personal expenditures (room decorations, special project materials, supplies), I have bad news for this student about his/her chosen career.

“Assignments shouldn’t be due over spring break. We all need a break.” Truth: The due date, originally and inadvertently planned for spring break, was moved after a whole-class collaboration and agreement on a new due date. This change occurred four weeks prior to spring break.

“This instructor doesn’t respond to communication in a timely manner.” Truth: All emails and calls are returned in under 24 hours. Always.

My colleagues (present company included) would tell me to disregard the negative remarks and remind me that these were obviously written by entitled students disappointed in their own performance and/or by those who had trouble comprehending the course purpose and goals. They’d also tell me that the university expects both positive and negative outliers to respond to the surveys and that most satisfied students don’t bother. All true, but of little consolation. This biannual cycle of reading feedback from students who previously never expressed a concern, and my subsequent disappointment, gives me pause. Am I not as fabulous as I think I am? Possibly. Are my students completing my evaluations to somehow vindicate their own subpar performance?
Perhaps. Are people, in general, now far too comfortable recording details and data online even when those details and data are inaccurate? Definitely.

Suler (2004) suggests that this type of online behavior occurs due to six factors: dissociative anonymity, invisibility, asynchronicity, solipsistic introjection, dissociative imagination, and minimization of status and authority. In short, the Internet (and, relatedly, participation in online course evaluations) blurs and even eliminates the social boundaries that keep student behavior in check in the live, face-to-face setting.

Reports of problematic behaviors are on the rise nationally, not only in the classroom but in society at large (Kowalski, 2003), and many administrators make the mistake of using student evaluations as a definitive and objective measure of teacher quality (Perlmutter, 2015), placing too much emphasis on student feedback. When this occurs, and the narrative feedback is consistently inaccurate and sometimes malicious, unintended instructor-related consequences can follow: Grades can be inflated (“If everyone gets an A, no one will complain”), the instructor can feel disrespected as a subject-matter expert and authority figure, and the motivation to teach can be diminished (Carnegie Mellon University, 2016). More serious consequences involve limits on promotion and compensation.

However, since universities deem student course evaluations necessary, it’s unlikely that they’ll be discontinued anytime soon. And since the general population increasingly lives life online, it’s probable that the brazen will continue to log on and report inaccurate, hurtful, and possibly career-limiting commentary. Under the circumstances, what’s a competent, but concerned, professor to do? When the syllabus contains detailed expectations, the professor is collaborative and approachable, and the course is well designed and competently delivered, how can academics mitigate a now seemingly constant stream of negative feedback?
I’ve asked myself these questions often, and rather than try to ignore the feedback or the feelings of distress that follow, I’ve resolved that this time will be different. This time, I’ll work through my contemplation with affirmations that I am organized, fair, and reasonable and that I deliver an interesting and relevant course. I’ll focus on the positive remarks, and I’ll engage in work/life balance activities such as yoga and meditation. And when that doesn’t alleviate my anguish, I’ll review my course and syllabi to ensure the following relative to the recent “feedback”:

As a matter of general good practice, I’ll also:

I also plan to incorporate Sorcinelli’s (2002) ideas on diminishing classroom incivilities by further decreasing anonymity in my online and video-based courses (student use of a personal profile image and brief bio, similar to the setup on Facebook or Twitter, will be a new requirement this fall). I’ll also investigate how I might incorporate more opportunities for active learning (Sorcinelli, 1991) in my classes, which already utilize synchronous and asynchronous discussion, group polling, live chat (complete with emoticons!), and more. My student-centered courses require a high level of student engagement now, but reviewing, modifying, and adjusting the curriculum is always a worthwhile endeavor, and doing so may improve the learning experience for students (Oliver & Hyun, 2011).

After doing all of that, I’ll breathe deeply, look forward to the semester ahead, and hope for the best. And I’ll do the same for you.

References

Anas, B. (2011). Study: College students lie on faculty evaluations. Daily Camera. Retrieved May 19, 2018, from http://www.dailycamera.com/ci_16995611

Carnegie Mellon University. (2016). Address problematic student behavior. Retrieved May 20, 2018, from https://www.cmu.edu/teaching/designteach/teach/problemstudent.html

Kowalski, R. M. (2003). Complaining, teasing, and other annoying behaviors. New Haven, CT: Yale University Press.

Oliver, S., & Hyun, E. (2011). Comprehensive curriculum reform in higher education: Collaborative engagement of faculty and administrators. Journal of Case Studies in Education, 2, 1-18.

Perlmutter, D. (2015). How to use student evaluations wisely. Chronicle Vitae. Retrieved May 20, 2018, from https://chroniclevitae.com/news/1035-how-to-use-student-evaluations-wisely

Sorcinelli, M. D. (2002). Promoting civility in large classes. In C. Stanley & E. Porter (Eds.), Engaging large classes: Strategies and techniques for college faculty (pp. 44-57). Bolton, MA: Anker.

Sorcinelli, M. D. (1991). Research findings on the seven principles. In A. Chickering & Z. Gamson (Eds.), New directions for teaching and learning, 47, 13-25. San Francisco: Jossey-Bass.

Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321-326.

Diana Uyder is a clinical professor who teaches in-person, online, and video conference teacher preparation courses at Northern Arizona University, where she’s led graduate-level cohorts for nearly 20 years.