This blog post is inspired by a panel discussion on educational surveillance that took place on February 4th 2021 and implicitly quotes and paraphrases panellists Ben Williamson, Chris Gilliard, Seda Gürses and Valerie Steeves. You can watch the full discussion here.
Stakes were high in August 2020 as hundreds of students chanted “fuck the algorithm” in front of the UK’s Department for Education, protesting its decision to algorithmically moderate teacher-predicted grades after students were unable to sit their exams amid the COVID-19 pandemic. Students saw their chances of going to university diminish as the software considerably downgraded their results. Whatever one makes of their choice of words, the students were heard: the government replaced the algorithmically moderated results with the original teacher-predicted grades.
This is just one of many examples of how technological implementations can threaten the values and integrity of our public institutions – like education – and of why due care and careful thought are necessary to protect those values. A more widespread phenomenon introduced during the pandemic is the so-called Zoomification of education – a reference to the business video-calling software Zoom – by which we increasingly got to know each other as mere rectangles on a screen, further eroding the communal experience of education.
Under the guise of optimisation, many tech companies were eager to lend a helping hand when universities were forced to move online. Consequently, cloud services like Zoom subtly replaced universities’ codes of conduct with the companies’ terms of service, while successfully disclaiming responsibility for content moderation. Think of the so-called Zoom-bombs, in which meetings were hijacked with often offensive and discriminatory material, or of the powerlessness universities faced when lectures were cancelled unilaterally. Remote proctoring software like Proctorio, ProctorU or ExamSoft introduced facial recognition into the school system, and the licence extensions and financial investments that came along with it often meant that these companies came to determine schools’ policy trajectories. From an educational perspective, these developments are problematic to say the least.
Another unusual mutation in pedagogy comes with the introduction of ClassDojo, an application that lets teachers award points to students displaying the correct comportment and conduct. This marks a fundamental shift in thinking about child education: socializing young children into knowledge and forms of skill and practice is replaced by assessments of character and mindset. It is also a prime example of how companies actively exploit school procurement and vetting loopholes by marketing their products directly to teachers.
All these transformations paint a bleak picture for a future of education in the hands of big tech companies. But is there something we can do? Is there a way to resist these transformations and protect the values we cherish – to educate and provide knowledge for better societies? Can we reimagine the future of education without walking into the trap of the cloud? Based on concrete examples, emerging processes, initiatives and the scientific literature, we can suggest some survival tactics for the attempted tech takeover of education.
First, when students are framed as customers, student pushback often carries more weight than faculty pushback and can lead to considerable changes in how these systems are implemented; the “fuck the algorithm” protest is a prime example. Second, strong governance processes that ensure accountability help prevent the deployment of systems that have not been scrutinized from multiple disciplinary perspectives and that, given their inherent risks, should never have been put in place. Third, to persuade decision makers who buy into the digital transformation hype, it is absolutely necessary to collect a body of evidence that makes the drawbacks and consequences of these technological implementations specific and clear. Fourth, coming together and actively thinking about alternatives opens the door to those alternatives actually coming to fruition; ‘A catalog of formats for digital discomfort… and other ways to resist totalitarian zoomification’ is one such initiative.
While alternative imaginaries might look attractive at first sight, always critically assess whose vision is being enacted, what interests are at play, and who is best served.
This thinking has been developed in the literature in different forms:
- Frank Pasquale’s second-wave algorithmic accountability (https://lpeproject.org/blog/the-second-wave-of-algorithmic-accountability/)
- Evgeny Morozov’s digital socialism
- Marion Fourcade’s ideas on how to repurpose the use of cloud infrastructures from the perspectives of those most affected
This post was written by Bram Visser, a PhD student at the VUB Chair in Surveillance Studies.