Watching brief: Mike Nellis says Quakers should be concerned about the surveillance state

by Mike Nellis, 19th January 2024

British Quakers have a longstanding commitment to civil liberties, human rights, and democratic accountability. This is grounded in our testimonies to peace, truth and integrity. In 2020 Quakers in Britain co-signed an open letter to Boris Johnson, then prime minister, expressing concern about the deteriorating rights of migrants and minorities, threats to the Human Rights Act, and increases in judicial and police power over protesters. These remain live issues. The planned roll-out of controversial facial recognition technology, and the imminent dismantling of democratic accountability over state surveillance, are probably less familiar, but not unconnected. Intrusive technologies are one of the ‘threats to democracy’ explored in the BBC’s 2023 Reith lectures. Quakers, I hope, will become more concerned about the ‘invasive technification’ of everyday life.

Let’s backtrack. In 2006, Richard Thomas, then the information commissioner, warned that, with vast amounts of personal data becoming available to governments and corporations, Britain was ‘sleepwalking into a surveillance society’. He questioned whether the scale of data extraction from citizens could ever serve the common good. In 2009, seven Friends in the Quaker Civil Liberties Network drew ‘the database state’ into the purview of our concerns. They worried about the growth of authoritarianism and the meaning of liberty when government knows so much about us.

The Conservative government of 2010 seemed keen on ensuring democratic accountability in the light of emerging technologies. In 2012 it appointed separate biometric and surveillance camera commissioners (BSCC) to monitor the police use of fingerprints, DNA, and public-space CCTV. But a decade on, the present Conservative government wishes to dispense with them.

These trusted safeguards are being set aside without adequate explanation. At the same time, new governance procedures are being established for the use of artificial intelligence (AI) in policing, of which facial recognition is one application. These place more emphasis on innovation than on regulation, which suits commercial providers. The Data Protection and Digital Information Bill, currently before parliament, hints that the work of the BSCC might be accommodated within the already overworked and under-resourced Information Commissioner’s Office, but research has questioned the viability of this.

In consequence, significant oversight functions will simply cease after the BSCC’s demise. These are: reviewing police handling of biometric samples in particular cases; maintaining an up-to-date surveillance camera code of practice; providing guidance on procuring tech from reliable commercial sources; and providing annual reports to the Home Office and to parliament. Proper accountability requires them all.

Why lose effective democratic safeguards, when the need for them is arguably greater than ever? The reason is this: over time, the BSCC resisted government aspirations to expand police use of facial recognition technology across England and Wales unless certain operational standards were met. But most police forces across the country want this technology, as does the police minister. Ergo, the commissioners must go.

Mass facial recognition technology undermines the last vestiges of privacy in public places. It is a singularly intimate form of surveillance, and can use our faces to trace information about us on the internet. Whether used for trawling crowds in streets, malls and arenas, or for identifying suspects from police databases, it blurs the distinction between mass and targeted surveillance. The latter may be justifiable to prevent terrorism or to pursue organised criminals, but is it proportionate to scan and store the faces of peace protesters or even shoplifters? In any case, the technology is not reliable. It scandalously overpredicts ‘matches’ for women and black people, resulting in disproportionate stopping and false arrests.
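That unreliability is not mere carelessness; it follows from how the technology works. Automated matching reduces each face to a list of numbers (an ‘embedding’) and declares a match whenever two lists are similar enough, so any threshold loose enough to catch real suspects will also flag innocent strangers, and across thousands of scans those errors mount up. The following Python sketch is purely illustrative, with invented sizes and thresholds, and random data standing in for real faces; it depicts no actual police system:

    # Illustrative only: random vectors stand in for face 'embeddings';
    # the watchlist size and threshold are invented for this sketch.
    import numpy as np

    def similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity: 1.0 means identical direction, 0.0 unrelated.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    rng = np.random.default_rng(seed=1)
    watchlist = rng.normal(size=(1000, 128))  # 1,000 hypothetical enrolled faces
    passerby = rng.normal(size=128)           # one innocent face from a crowd scan

    # A looser threshold catches more genuine matches, but also more strangers.
    THRESHOLD = 0.20
    alerts = sum(similarity(passerby, face) >= THRESHOLD for face in watchlist)
    print(f"false alerts for one innocent passer-by: {alerts} of 1,000 comparisons")

Scale that single passer-by up to a scanned crowd and the false alerts multiply; and if the underlying model produces less accurate embeddings for women or for black people, those groups will bear a disproportionate share of them.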

Technical solutions might yet eradicate racial and gender bias, but a perfected system is no less worrying. Some US police forces already use facial recognition without any external regulation, while several US cities have banned it outright. The European Union’s Artificial Intelligence Act regards the use of real-time biometric identification systems in public as ‘intrusive and discriminatory’, and wants their use tightly regulated. It is disconcerting that the Westminster government is abolishing its surveillance commissioner (the change does not extend to Scotland, though Scotland is affected by it). In 2012, parliament took tentative steps towards limiting the intrusions of surveillance; it is now more open to maximising them.

I doubt that Quakers believe the liberal fiction that ‘if one has nothing to hide one has nothing to fear’. Privacy enables personhood. Modern police states grow incrementally, combining data from government and commercial sources. As citizens and consumers we are seduced and coerced into giving our data away in every online interaction and smart environment through which we pass. We may care little how that data is aggregated and used, or by whom, simply because as individuals we have negligible influence over the process. Why worry about extensive facial recognition when pervasive surveillance is already normal?

There are undoubtedly dangers, including digital dangers, from which citizens do need protection. But ramping up investment in hi-tech policing creates dangers and vulnerabilities of its own, all the more so when traditional public services are being pared to the bone. There are so many social problems to which AI is not an obvious solution, despite the hype which pretends otherwise. AI is unfathomable to most lay people, which should make democratic accountability – informed by experts who at least understand this not-so-brave new world – more, not less, important.

How then should we go forward? Some Quakers have founded the Just Algorithms Action Group (JAAG) to help concerned people engage with our pervasive digital environments and the surveillance latent within them. The Irish Catholic philosopher John O’Donohue may rouse us differently, celebrating ‘the holiness of the human gaze’ but deploring its opposite, ‘the intrusive stare’. ‘When you are stared at’, he says, ‘the eye of the Other becomes tyrannical. You become the object of the Other’s stare in a humiliating, invasive and threatening way’. Furthermore, he writes pertinently, ‘the face always reveals the soul: it is where the divinity of the inner life finds an echo and image. When you behold someone’s face you are gazing deeply into his or her life’.

Quite so. While the US poet Richard Brautigan might have thought of ‘machines of loving grace’ that ‘watch over’ us, it is one thing for God to know us better than we know ourselves, and quite another for states and corporations to aspire to the same.

For information on the Just Algorithms Action Group, see www.jaag.org.uk.

