The ongoing political deliberation over the Government's proposed facial recognition technology (FRT) legislation reflects mounting international scepticism about this widely deployed surveillance tool. The bill, formally the Draft General Scheme of Garda Síochána (Recording Devices) (Amendment) Bill, does not provide satisfactory answers to these questions. Nor does one of the principal European Union regulations permitting the use of FRT, the newly ratified Artificial Intelligence Act, which also falls short on key clarifications.
That FRT will be regulated primarily under an Artificial Intelligence Act hints at the problems embedded in this technology, and at its worrying prominence on the EU's list of increasingly pervasive, powerful and covert mass-surveillance techniques. FRT aims to identify potential suspects or detect illicit activity by analysing stored or live footage, then matching a face against a database of possible offenders.
But that database is not made up of individuals identified on plausible grounds for believing they committed or were involved in the offence in question. The database includes all of us. Such techniques might once have been rejected as excessively intrusive, overly broad and incompatible with democratic principles.
They may yet be rejected: legal challenges are expected to reach the European Court of Justice (ECJ) once such technologies come into use and reports of wrongful arrests and unwarranted surveillance inevitably mount.
With FRT, we also become subjects of a compounded digital surveillance system, driven by both the technology itself and the AI behind it. Moreover, there is still no convincing evidence that scanning communities, amassing enormous facial databases and trawling through them for possible suspects delivers any substantial results. The benefit of such sporadic successes does not justify the drastic invasion of everyone's privacy entailed in placing us all in a perpetual digital identity parade built from our sensitive biometric data.
FRT adds to the list of excessively intrusive technologies and practices that form my informal "summer of surveillance" series. It offers an insight into how the EU and its member states are steadily venturing into levels of citizen-monitoring that would have been taboo just a few years ago.
Among the strategies under consideration are the retention of everyone's communications data and proposals to introduce weak digital "back doors" into popular encrypted messaging applications. Such tactics inevitably result in apps with compromised encryption.
The supposed accuracy of this technology has been repeatedly disproved; many of the accuracy statistics come from unrelated use cases involving the comparison of two relatively clear images, not footage from the real world.
Significantly, numerous studies have shown FRT to exhibit severe gender and racial bias, reflecting problems in the data it was trained on. These are issues found systemically across AI, with which FRT is closely linked.
Sadly, these biases are demonstrated by more than studies; they are experienced by real individuals, often people of colour, whose lives are distressingly upended when FRT wrongly links them to a crime.
Recently, I chaired a discussion on FRT for the Irish Council for Civil Liberties, featuring guest Robert Williams. Williams, a Black man from Detroit, was wrongfully arrested in his driveway, in front of his family and neighbours, as he returned from work, on the basis of an incorrect FRT match for a minor offence.
Williams described the ongoing trauma his arrest caused him and his family, especially his children. He was held for questioning by police for over a day before being released. The American Civil Liberties Union (ACLU) took up his case, and last week Williams won his groundbreaking lawsuit against the Detroit Police Department, leading to the introduction of new procedures and protections at the department.
During his visit, he urged Ireland not to implement FRT, saying: "You have the chance here in Ireland to avoid its introduction. I hope your Government will consider experiences like mine and carefully reconsider before introducing FRT into law enforcement."
Wryly, he also noted that Detroit could have spent the tens of millions wasted on ineffective FRT on tackling the poverty that often drives people to crime.
In February, the Joint Oireachtas Committee on Justice scrutinised the proposed FRT Bill, seeking clarification from An Garda Síochána and the Department of Justice about how the technology would be applied and which image databases it would use. No answers have yet been provided.
The only acknowledged source of biometric data in Ireland is the contentious Public Services Card. Despite initial assurances that its use would be confined to social services, it has since been extended to so many other areas that it has drawn the attention of the Data Protection Commissioner. It is imperative that this source not be used for FRT.
The current Irish Bill is even vaguer than the EU AI Act's provisions on FRT, which are themselves a cause for concern. Williams's case exemplifies the dangers inherent in FRT. The State has a responsibility to give clear answers about precisely how it plans to use FRT, and whether it is genuinely necessary at all.