Currently focused on the doorway of every lecture theatre, cameras record students as they enter, matching their faces to university records.
From "Using Hitachi Data Systems to improve student life at Curtin University", In the Black: leadership, strategy, business
I’m really stuck on this article. I’ve read it over and over. I think about the world we’re making, and the world we’re mining, and I’m trying to process something that feels like grief at the way we speak about innovation. It’s not the innovation itself, entirely: it’s the claims we make for its use.
Surveillance itself isn’t new. We’re used to machine learning, and keystroke monitoring, and dashboards. Facial emotion recognition is front of mind for education researchers working on online student retention. And using the campus as a laboratory via the exploitation of students as research subjects has a long history; until Amazon Turkers came along, students were the obvious choice for surveys, experiments and observations. But tracking students in and out of lecture theatres is not fresh thinking; the most stubborn problem facing students continues to be the bit we take for granted, which is using lectures to deliver content in the first place.
So what is this about? At this level of investment, the “living laboratory” isn’t a philosophical inquiry. It has institutional weight and is getting marketing attention, because improving student life is a business matter. Students are the predictors of revenue: unless they drop out, they’re here for a fixed duration, at a mostly fixed price.
Unless they drop out.
This is why we end up here, and specifically here:
The data allows us to generate contextual information about the lifecycle of the student, the day to day reality of the staff member, the activity pattern of a lecture theatre, and the dynamics and environmental health of a library.
But what allows us to do anything with data are the standards of our research protocols. University-technology partnerships introduce two unlike research cultures to one another, and there have been cases where the seemingly lower standards in commercial research have dragged university researchers into some wet sand. Facebook emotion manipulation study? Yes, that one.
So while it’s good to read that “[e]xcept where specific consent is given, data collected is not linked to an individual” this is quite an odd promise: surely the point of behavioural monitoring is to know whose behaviour you’re looking at? Because this isn’t just space utilisation surveying, whether by clipboard or thermal counter mapping. This promises to ticket a specific struggling individual and send resources to help, and that means somewhere in the system knowing who they are, and being able to track them over time.
And there’s a set of steak knives thrown in: insights about the “day to day reality of the staff member”, who is presumably also known as an identifiable individual somewhere in this system. Unless this project is suggesting that day to day reality is substitutable among different members of the same sample population, which is really not what day to day reality is, not at all.
So, trying to be fair, maybe de-identification is a detail that’s been poorly represented in marketing. Conceivably the project itself has robust consent standards that aren’t visible here. Possibly students and staff involved in its trials are keenly aware of the volume of data on their behaviour that is collected, and have been told where it is stored, and how and with whom it can be shared. Ideally all experimental participants are regularly reminded how to review, delete or use their own “living laboratory” data in whatever way is useful to them. You’d hope.
This bit, however, is unambiguously in the road map.
In the future, the system will identify students who live only kilometres from each other and drive to attend classes around the same time.
From there … the university can put those students in touch with one another for carpooling and study buddy reasons.
And truly, this is not trivial. This weird mix of Snapchat maps and the High There! hopper service is coming to a campus near you because no one at the executive level thinks it's even slightly creepy to put students in touch with one another on the basis of their private data. So this should make us all sit up: in this world, digital privacy must quickly become a core literacy in every discipline at every level. (Happily, understanding the day to day reality of surveillance is a more work-facing graduate capability than mastering the passive voice, so it's a win-win, of some shabby kind.)
In her fine essay on the need for a digital sanctuary movement on US campuses, Amy Collier argues that as higher education becomes more intensively extractive, “we need to recognize and deconstruct our perspectives on the relationship of data to our understanding of student learning.” We need to unthink the assumption that when we measure visible actions like showing up, or logging on, we’re generating insights into what a student is thinking. We need to caution ourselves against data hubris and remember that watching what someone is doing is the most limiting way of learning what motivates them to do it.
This is the story digital ethnographer Mike Wesch tells in his beautiful 2015 video The Sleeper. Like many of us, he had the experience of teaching a student who regularly fell asleep in class. Students are shift workers, carers, commuters, overloaded social beings, and sometimes just tired with the day to day reality of their lives. He decided to learn more by taking the student out to lunch. He learned that his observational judgement, using the data visible to him, was wrong; he also realised that this misplaced assumption had real consequences for this student, and all students similarly written off.
Because that’s the real tragedy. It’s not just that I saw David in a certain way. It’s that he saw himself that way too.
Like Mike Wesch, Amy Collier argues that we need to act with far more care, and attend conscientiously to the risk of unintended consequence as we hoard data on student behaviour and mistake this for their day to day reality.
We in higher education need to seriously consider how we think about and handle student data, and we need to respectfully and empathetically acknowledge where our practices may cause harm.
She’s right. It’s time to pause, and to reset our goals.
We have built the extractive technology to track students minutely. We can continue to invest in improving its efficiency and extending its range. We can boast and promise and envision the seamless world in which human gesture is all the window we need into human thought. We can forget everything we know about the history of surveillance and social vulnerability.
Or, like the other mining industries in our world economy, we can start to think ahead to the risks and consequences of carrying on like this. This future is not inevitable, and our concerns are not naive. This technology is part of a business strategy that's trying to fight its way out of the bag of Baumol's cost disease, by turning service into product. Counting students in and out of lecture theatres is not trying to improve student life or learning. It's searching for solutions that will contain the labour cost of actually listening to students about why they come to lectures, or don't.
All around us are troubling signs of the automation of educational care, from a future that we need to challenge. To live well with the technology we are developing in universities, we are increasingly going to need the courage and the humility to interrogate its use.
All this owes a lot to Audrey Watters and Chris Gilliard for keeping track of those who are keeping track of us.