
Opinion | Police Can't Tell Black People Apart - Facial Recognition Will Only Make It Worse

Updated: Mar 13

It’s 2020, but why does it feel like we’re living in George Orwell’s ‘1984’ dystopia?


On 24 January, the Metropolitan Police announced via Twitter the official launch of its latest technology: live facial recognition. Its ill-judged attempt at making the streets safer may achieve quite the opposite.



Why is this problematic?


Black and brown people have long suffered at the hands of the police, who time and time again falsely accuse them because officers fail to tell them apart. This poor excuse has given the police an umbrella to hide under when consciously pointing the finger in the wrong direction. Under the guise of a uniform and badge, a symbol of protection has implanted fear into the lives of many. With facial recognition software standing alongside police checks, worry that a system which already discriminates against marginalised people will grow even more biased has reached an all-time high.


Studies show that black people in London are more than four times as likely to be stopped and searched as white people. For the black and brown population, who are stopped and searched more frequently by the police, facial recognition entering the mix could rapidly turn this into a life-changing issue. It is too easy to flag someone as suspicious simply by observing them and concluding that their actions or appearance aren’t the norm. The potential for misuse is alarmingly great.



This Big Brother approach to 'cracking down on crime' actually provides inaccurate readings and is highly invasive of people's privacy.



Tuesday 11 February marked the official deployment of facial recognition technology. This saw a series of vans dotted around the Stratford Centre in London, where thousands of people were scanned without consent. Signage placed alongside the vans announced the news, yet the public were automatically scanned the moment they were close enough to read the signs. A system that was scrapped last year after a trial at King’s Cross in London has now been given the go-ahead.


During the trial of the King’s Cross facial recognition software, researchers at the University of Essex found that “it could be sure the right person had been identified in only 19% of the 42 cases studied.” In 2017, the Met tested facial recognition at Notting Hill Carnival, where the system was found to be wrong 98% of the time. What does this prove? This software is dehumanising, turning people into “walking ID cards”. Time and time again, this unpredictable surveillance tool, which could easily be used and abused, affects the lives of many. It is a constant reminder that wherever you go, there is a potential that you will be identified or misidentified; whichever it is, you are forced to live with it. For software as unreliable as this, it makes one question whether the Met Police really values the people its one job is to protect.



The gravity of it all powerfully highlights the fact that the people’s voice doesn’t matter.


Despite the authoritative and assertive entry of facial recognition, people are resisting, with petitions such as Resist Facial Recognition surfacing. Alongside this, in a bid to protect their privacy, some have turned to concealing their faces, which has already resulted in £90 fines being handed out.


In spite of this racist agenda, history shows that the struggle for freedom and safety among black and brown people is an ongoing fight, and no one plans to back down from it anytime soon.


Words: Joke Amusan



Looking Glass Collective ©2019. All rights reserved.

