This blog post is inspired by a panel discussion on democratic surveillance that took place on 27 January 2021 at the annual CPDP conference, and implicitly quotes and paraphrases panellists Marion Oswald, Quirine Eijkman, Koen Gorissen, Arne Hintz and moderator Rosamunde Van Brakel.
Aware of their duties and responsibilities to protect the public, enforce the law and guarantee public safety, police institutions face an understandable temptation to make effective use of data from a variety of sources when addressing contemporary security issues. Police in the Netherlands, for instance, recently scanned social media to anticipate riots related to far-reaching COVID-19 restrictions and to better inform the deployment of officers responding to them. In Belgium, zealous officers deployed thermal-imaging-equipped drones to facilitate their surveillance activities, and in the UK, South Wales Police deemed the indiscriminate use of AFR Locate (real-time automated facial recognition software) at certain events necessary and proportionate to achieve its statutory obligations.
In healthy democracies, checks and balances prevent power from deteriorating into abuse that could result in human rights violations. In the above-mentioned Belgian and British examples, the use of surveillance technologies was halted ex post due to, inter alia, deficiencies in the DPIAs. Set up as an accountability measure in the GDPR, these assessments should identify high-risk situations and provide for additional measures to address those risks. The UK Court of Appeal ruled the use of AFR Locate by South Wales Police unlawful and in violation of human rights, finding, among other things, that deficiencies in the DPIA precluded an adequate assessment of the risks to the rights and freedoms of data subjects and of the measures envisaged to address those risks. Similarly, the Belgian Supervisory Body for Police Information halted the use of drones equipped with thermal-imaging cameras due to deficient DPIAs and to inconsistencies and uncertainties in the procedure required for the deployment of such intrusive surveillance technologies.
When ex post evaluations reveal essential deficiencies in the way these DPIAs are conducted, the door has been left wide open for violations of data protection law and human rights. The fact that surveillance actively contributes to the discrimination and cumulative disadvantage of marginalized and vulnerable groups in society further exacerbates the implications of these developments, and it necessitates adequate representation of the interests of those most affected in the decision-making process.
It is unlikely that these interests can be adequately represented in a one-size-fits-all form; representation will instead be shaped by cultural differences, evolution over time and sector-specific characteristics. In line with the recent academic interest in democratic innovation – where more direct forms of advancing citizen voices are being proposed outside of the traditional decision-making process – the UK has been host to a number of initiatives such as citizen juries, deliberative polling, citizen assemblies and citizen-based policy development, demonstrating government recognition of a growing need for participation in data-related decisions. The Innovation in Democracy programme was set up specifically to encourage and help local governments to implement innovative models of deliberative democracy. Similarly, a recent OECD report on innovative citizen participation includes best-practice principles for deliberative processes in public decision making.
Another avenue for adequate interest representation is to call upon interdisciplinary expert committees better placed to grasp the technical, legal, social, ethical and other complexities of the issues at stake. The West Midlands Police’s Ethics Committee was established to provide independent advice to the Chief Constable and the Police & Crime Commissioner regarding ethical issues arising from the work of the Data Analytics Lab, set up to inform WMP’s operations and strategic decisions based on data analysis. Combined, the members of the Committee provide advice that is informed by expertise on surveillance, human rights, data science, ethics, youth justice, anti-social behaviour, governance, police transformation, diversity and equality. In assessing how data sets will be used, how they relate to particular groups and how an algorithmic tool will be used in practice, the Committee not only addresses issues at the start of the project but also applies a continuous evaluation to be able to adjust for unexpected outcomes should they occur.
These alternative avenues towards an adequate representation of interests have their own limitations. Citizen juries are usually one-off events, tied to specific policy processes, and offer little scope for citizens to fundamentally question whether data should be used at all. The work of the Ethics Committee is extremely hard, exhausting and time-consuming for everyone involved, owing to the complexity of the issues and the interdisciplinary approach. As society changes, adequate representation requires expert committees to reflect those changes and thus to be subject to continuous evaluation themselves. These considerations all add to the complexity of the matter, but they nevertheless offer valuable building blocks for a more participatory infrastructure of oversight and decision-making that transcends a narrow data protection approach.
This post was written by Bram Visser, PhD student at the VUB Chair in Surveillance Studies.