Police vs privacy replay at Cardiff-Swansea football match
Fans, privacy rights campaigners and a Welsh Police and Crime Commissioner have criticised South Wales Police’s plan to deploy facial-recognition surveillance at this weekend’s Cardiff City v Swansea City match, describing it as “a step too far”.
South Wales Police used the surveillance technology in October 2019 at the previous match between Cardiff City and Swansea City, attracting protests from fans.
For that match – at Swansea’s ironically named Liberty Stadium – the Football Supporters’ Association Wales called on fans to wear Hallowe’en masks to counter the police’s facial-recognition surveillance. Vince Alm, spokesman for the association, told civil liberties group Big Brother Watch at the time that the problem was that “we haven’t had a say and we can’t opt out”.
Alm added: “Thousands of innocent fans who have never committed a crime in their lives, including children, will have their faces scanned and data collected by police.”
South Wales Police defended the practice, writing on Twitter that its watchlist “is event-specific and is only being used to reduce the threat of, or likelihood of, disorder. Those on our watchlist have previously been convicted of offences at football matches and all have valid banning orders not to attend today’s game.”
South Wales Police said its “robust policing plan” would help officers to identify people barred from attending matches.
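In broad terms, live facial-recognition systems of this kind convert each face captured on camera into a numerical ‘embedding’ and compare it against pre-computed embeddings of the people on a watchlist, raising an alert only when the similarity clears a threshold. The sketch below illustrates just that matching step – it is not South Wales Police’s actual system, and the threshold, identifiers and embeddings are all hypothetical.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # hypothetical; real deployments tune this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(face_embedding, watchlist):
    """Return the best watchlist match above the threshold, or None.

    `watchlist` maps a person identifier to a pre-computed embedding.
    """
    best_id, best_score = None, SIMILARITY_THRESHOLD
    for person_id, stored_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id


# Hypothetical data: 128-dimensional embeddings, a common size for face models
rng = np.random.default_rng(0)
watchlist = {f"banned-{i}": rng.normal(size=128) for i in range(5)}
probe = rng.normal(size=128)  # a face captured from the crowd
print(match_against_watchlist(probe, watchlist))  # likely None: no one matches
```

Everything hinges on the threshold: set it low and innocent passers-by trigger alerts; set it high and people on the watchlist slip through.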
Although the October operation produced no successful facial-recognition matches, the force plans to use the tool again when the two teams meet this weekend. The Football Supporters’ Association Wales, alongside Big Brother Watch, intends to protest against the monitoring at the match.
Silkie Carlo, director of Big Brother Watch, said: “Police repeatedly targeting football fans with this new and dangerous mass-surveillance tool treats them like suspects, erodes public freedoms and wastes public money. South Wales Police are acting like big brother and seem tone deaf to public concerns.
“We will keep fighting facial-recognition surveillance until its use is ended. It’s one of the most extreme surveillance technologies in the world and has no place in Britain. Government should urgently issue a ban on police and private companies monitoring the public with this authoritarian surveillance technology.”
Alm added: “It’s unbelievable that police are targeting us with facial-recognition surveillance again. Fans coming out for a local football match, including hundreds of families and children, will be treated like they’re in a police line up and have their faces scanned without their consent.
“We protested against it in October and we’ll protest again. We shouldn’t be made to feel like criminals just for going to a football match.”
Arfon Jones, Police and Crime Commissioner for North Wales, said: “It’s disproportionate to use facial-recognition technology to take pictures of supporters at football matches. It’s a step too far and creates the potential for miscarriages of justice.
“I’m sure there are people from North Wales who will be going down to the game and risk having pictures taken of them without their consent. I have a responsibility to represent them and to oppose fishing expeditions that invade their privacy.”
The blanket use of facial-recognition cameras in public places is facing a legal challenge in two separate human rights cases that claim the surveillance breaches privacy rights. One crowdfunded challenge is being brought by civil liberties campaign group Big Brother Watch and Green peer Baroness Jenny Jones against the Metropolitan Police, which has since paused its use of the technology.
British human rights and technology groups, together with UK MPs, are demanding an immediate halt to the use of facial-recognition technology in public spaces by UK law enforcement.
The second legal challenge is being pursued by Dr Ed Bridges against South Wales Police, who believes his face was scanned by the force while he was doing his Christmas shopping in December 2017 and again at a peaceful anti-arms trade protest the following year. In September 2019, two High Court judges dismissed the case, ruling that the use of the technology was not unlawful and that the current legal regime is adequate to “ensure appropriate and non-arbitrary use” of it.
However, Lord Justice Singh subsequently granted Bridges permission to appeal, and the Court of Appeal is expected to hear the case by January 2021.
In the wake of a string of legislative bans on facial-recognition surveillance in the US – including in San Francisco, Oakland, Berkeley and Somerville – Big Brother Watch is calling for an “urgent ban” in the UK.
Police use of facial recognition has already been shown to be “almost entirely inaccurate”, according to Big Brother Watch.
The pressure group highlighted figures showing that a large number of ‘false positives’ had resulted from police use of facial-recognition cameras at sporting and cultural events in recent years.
For the Metropolitan Police, 98 per cent of ‘matches’ were in fact wrong; for South Wales Police, the figure was 91 per cent.
Both police forces have insisted the accuracy of the technology is improving over time. They also say safeguards are in place to prevent tangible action being taken against innocent people.
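It is worth being precise about what those percentages measure: they are the share of alerts that turned out to be wrong (sometimes called the false discovery rate), not the system’s error rate across everyone scanned. Because the people on a watchlist are a tiny fraction of any crowd, a system can flag only a sliver of the faces it sees and still have most of its alerts be false. A minimal illustration, using entirely hypothetical numbers rather than either force’s actual figures:

```python
def false_alert_rate(alerts: int, false_alerts: int) -> float:
    """Share of facial-recognition alerts that were wrong (false discovery rate)."""
    return false_alerts / alerts


# Hypothetical numbers for illustration only:
# a crowd of 25,000 faces scanned, 50 alerts raised, 46 of them wrong.
crowd, alerts, false_alerts = 25_000, 50, 46

print(f"{false_alert_rate(alerts, false_alerts):.0%} of alerts were wrong")  # 92%
# Measured across everyone scanned, mistakes touch only a sliver of the crowd:
print(f"{false_alerts / crowd:.2%} of all faces falsely flagged")            # 0.18%
```

The same deployment can therefore be described as overwhelmingly inaccurate or as almost always right, depending on which denominator is chosen – which goes some way to explaining how campaigners and police draw such different conclusions from the same figures.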
Elizabeth Denham, the Information Commissioner, said there had been a lack of transparency about how the “particularly intrusive” software was being used. However, she acknowledged it represented “both a risk and opportunity” for public protection.
“There may be significant public safety benefits from using facial-recognition technology – to enable the police to apprehend offenders and prevent crimes from occurring,” Denham said.
The Information Commissioner’s Office (ICO) has since issued draft guidance on AI, urging companies to ensure the technology is “transparent and accountable” in order to assuage people’s fears over its use.
Research conducted by the ICO, a public body, showed that over 50 per cent of people are concerned about machines making complex automated decisions about them.
The ICO’s first draft regulatory guidance laid down four principles – transparency, accountability, context and impact – obliging companies to adhere to GDPR standards when considering AI implementations.
In the US, the American Civil Liberties Union (ACLU) has filed a lawsuit against US government bodies in an effort to expose information about how government agencies use facial-recognition technology.
Separate lawsuits have been filed by the ACLU and its Massachusetts branch against the US Department of Justice, the FBI and the Drug Enforcement Administration in the US District Court for the District of Massachusetts.
After its freedom of information requests were met with silence, the ACLU hopes that filing the lawsuits will help expose records of how law enforcement agencies are collecting and using biometric data, including for surveillance purposes.
Meanwhile, a Fujitsu survey of 600 organisation leaders and 2,000 members of the UK public has shown that over half (54 per cent) believe that governments must take more responsibility for regulating the gathering and use of personal data.
The Fujitsu report, ‘Driving a trusted future in a radically changing world’, also revealed that while citizens are demanding consumer-style experiences from public services, they remain concerned about how their data is used. The biggest concerns centre on sharing personal data (35 per cent), lack of trust in how organisations use it (34 per cent) and doubts about the reliability of the technology (31 per cent).
On the other side of the debate, over two-thirds (67 per cent) of leaders said they are concerned they will never fully satisfy citizens’ expectations, while 48 per cent felt that organisations are put under too much pressure to drive society positively.
Jonathan Wilson
