Force defends facial recognition software amid criticism over inaccurate matches
South Wales Police introduced the technology last year.
A police force has defended its use of facial recognition technology after it was revealed that more than 2,000 people at the 2017 Champions League final in Cardiff were wrongly identified by the software as potential criminals.
South Wales Police began trialling the technology in June last year in a bid to catch more criminals, using cameras to scan faces in a crowd and compare them against a database of custody images.
As 170,000 people descended on the Welsh capital for the game between Real Madrid and Juventus, 2,470 potential matches were identified.
However, according to data on the force’s website, 2,297 – 92% – were found to be “false positives”.
South Wales Police admitted that “no facial recognition system is 100% accurate”, but said the technology had led to more than 450 arrests since its introduction.
It also said no-one had been arrested after an incorrect match.
A spokesman for the force said: “Over 2,000 positive matches have been made using our ‘identify’ facial recognition technology with over 450 arrests.
“Successful convictions so far include six years in prison for robbery and four-and-a-half years imprisonment for burglary. The technology has also helped identify vulnerable people in times of crisis.
“Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops.
“Since initial deployments during the European Champions League Final in June 2017, the accuracy of the system used by South Wales Police has continued to improve.”
The force blamed the high number of false positives at the football final on “poor quality images” supplied by agencies including Uefa and Interpol, as well as the fact it was its first major use of the technology.
Figures also revealed that 46 people were wrongly identified at an Anthony Joshua fight, while there were 42 false positives from a rugby match between Wales and Australia in November.
All six matches made at the Liam Gallagher concert in Cardiff in December were correct.
Chief Constable Matt Jukes told the BBC the technology was used where large gatherings were likely, as major sporting events and crowded places were "potential terrorist targets".
“We need to use technology when we’ve got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that,” he said.
“But we don’t take the use of it lightly and we are being really serious about making sure it is accurate.”
The force also said it had considered privacy issues “from the outset”, and had built in checks to ensure its approach was justified and proportionate.
However, civil liberties campaign group Big Brother Watch criticised the technology.
In a post on Twitter, the group said: “Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool.”