Listening to a presentation on student satisfaction survey results recently, I felt a sort of ambivalent sympathy with the administrative colleague whose job it is to assure students that we’re committed to ‘closing the feedback loop’, so that they can feel that their efforts in filling out our surveys result in meaningful action.

‘Closing the feedback loop’ is considered the acme of good practice, but it’s also the Everest of communication challenges. It relies on persuading everyone involved that ‘we’ can address the issues that ‘you’ have raised. Unfortunately, like all customer surveys, ours is an attempt to manage a relationship by proxy: the business owner surveys the customers about their satisfaction with the staff, but has surprisingly limited direct means to fix any problems that come up. It’s then very difficult, in turn, to report back to the customer on actions taken.

This is made worse by the fact that the working context for the ups and downs tracked in the survey data is quite remote from the everyday experience of the administrators who scrutinise the results, even though we all work for the same employer, at the same location, and we see each other every day at the coffee cart and the ATM. Most administrative staff have been university students, but not so many have been university teachers. Meanwhile, few university teachers have professional administrative experience in, say, marketing or customer relations.  Neither side has much idea what the other is saying to students about why universities exist or what we hope to achieve.

So really, who knows what’s going on in the lecture theatre, the classroom, the corridor or, increasingly, in students’ bedrooms and on the train, as more and more of our teaching contact takes place online? What is any of this supposed to feel like? What prompts a sense of satisfaction rather than dread in each of these circumstances? And how can the survey owners best use their brightly coloured bar graphs to direct academics to engage in remedial action at a distance, other than by hitching survey results to internal mechanisms for reward and career progress?

Doing this raises the stakes considerably. As everyone and their surveymonkey knows, grievance warps the evaluation of a process that’s meant to produce challenging experiences. Few experiences are more challenging, and potentially dispiriting, than being graded—and this is where the really pointy feedback stuff is going on. As a result, this is also where quite a bit of dissatisfaction occurs. What role should this dissatisfaction play in shaping career outcomes? What kind of cross-checking should occur? What kinds of reflection about contexts, constraints or opportunities might add to this narrow account of satisfaction in the very short term?

So the awkward reality for my administrative colleague is that we don’t have one feedback loop that can be closed by institutional fiat. Rather, each encounter between students and teachers opens up multiple opportunities for feedback that are messy, compromised and heartfelt.  Academics don’t give feedback lightly; we try over and over to find the right form of words that will help students learn from the ways in which they didn’t quite achieve what they set out to do.  At the same time, we spend a lot of time wondering how students are going, and—crazy as this might sound—asking them directly about their experiences and their satisfaction while there’s still time to do something about it.

In the end, the problem for our survey culture is that there is no unifying institutional ‘we’ that can coherently animate statements about what the university thinks, practises, believes or wants to do. But it’s probably worth quietly pointing out that academics and students spend a great deal more time with each other than either does with university administrators, so perhaps the best way to keep the feedback loop open is to ensure that we have enough time and energy to learn from each other as we do.
