
Why You Should Be Worried About The Use Of Facial Recognition Surveillance

by Niellah Arboine

Technology has come on in leaps and bounds: it's growing, adapting, and changing. That also means the ways in which technology is harnessed and used are changing. But live facial recognition surveillance has now come under scrutiny, and there are growing calls for the police and other organisations to stop using it. It's worrying to think that, as the BBC reports, the same type of technology used to unlock smartphones is also being used by the police to track the faces of members of the public. It doesn't sound a far cry from an episode of the dystopian technology drama Black Mirror.

According to the BBC, facial recognition allows people's identities to be checked in real time via CCTV, usually against watch lists drawn up by the police. The software maps existing images held by the police by analysing the geometry of each face. It can then scan faces in a crowd or in public and compare them against that database in search of a matching ID. Anyone flagged by the system could be stopped by police or security.
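To give a rough sense of what that matching step involves, here is a minimal sketch in Python. It assumes each face has already been boiled down to a numerical "faceprint" (a list of measurements describing the geometry of the face); the names, vectors, and threshold below are hypothetical stand-ins, not any police system's actual data.

    import numpy as np

    # Hypothetical watch list: each entry maps a name to a 128-number
    # "faceprint" summarising the geometry of that person's face.
    rng = np.random.default_rng(0)
    watch_list = {
        "suspect_a": rng.normal(size=128),
        "suspect_b": rng.normal(size=128),
    }

    def best_match(face, threshold=0.6):
        """Compare one scanned face against every watch-list entry.

        Returns the closest name if its distance falls under the
        threshold, otherwise None (no flag is raised).
        """
        name, dist = min(
            ((n, np.linalg.norm(face - v)) for n, v in watch_list.items()),
            key=lambda pair: pair[1],
        )
        return name if dist < threshold else None

    # A face scanned from a crowd, flagged only if it sits close
    # enough to a watch-list entry.
    scanned = rng.normal(size=128)
    print(best_match(scanned))  # prints None here: random vectors are far apart

The key design choice is the threshold: set it too loosely and innocent passers-by get flagged as matches, which is precisely the failure mode critics of the technology worry about.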

The Guardian reported that this style of technology has come about thanks to sites like Facebook, Flickr, and Instagram, which hold “billions of photos of people’s faces”. These images are used to “train deep neural networks, a mainstay of modern artificial intelligence, to detect and recognise faces.”
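For illustration, the “training” those networks undergo amounts to nudging the system so that two photos of the same person produce similar faceprints while photos of different people do not. Below is a minimal sketch of one widely used training signal for this, a so-called triplet loss; the vectors here are random stand-ins for what a real network would actually output.

    import numpy as np

    def triplet_loss(anchor, positive, negative, margin=0.2):
        """The loss is zero only when the anchor photo sits at least
        `margin` closer to a photo of the same person (positive) than
        to a photo of someone else (negative); otherwise training
        nudges the network to fix that."""
        d_pos = np.linalg.norm(anchor - positive)
        d_neg = np.linalg.norm(anchor - negative)
        return max(0.0, d_pos - d_neg + margin)

    # Stand-in faceprints; in a real system these come from the network.
    rng = np.random.default_rng(1)
    a, p, n = (rng.normal(size=128) for _ in range(3))
    print(triplet_loss(a, p, n))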

Earlier this year, the House of Commons Science and Technology Committee warned that innocent people could have their faces included in the watch lists created to catch criminal suspects. In addition, bias in the software could mean the wrong people are identified, with people of colour and people wearing makeup at particular risk. The committee said in its Biometrics Strategy and Forensic Services report: “We reiterate our recommendation from our 2018 Report that automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved.”


As the New York Times notes, facial recognition sadly has a long history of bias. If systems are trained mostly on the faces of white men, for example, they can be markedly less accurate for people of colour. According to the Huffington Post, a study by the Massachusetts Institute of Technology showed that the software had “an error rate of 34.7% for darker-skinned women, compared with 0.8% for lighter-skinned men.”

The report from the House of Commons Science and Technology Committee also raised concerns about images of unconvicted people being kept on databases. It read: “progress has stalled on ensuring that the custody images of unconvicted individuals are weeded and deleted. It is unclear whether police forces are unaware of the requirement to review custody images every six years, or if they are simply ‘struggling to comply.’”

It seems only fair that members of the public should be made aware of how facial recognition is being used, and what effects it could have on them, especially if they are a person of colour, who could be disproportionately affected by these systems’ lack of racial accuracy.