Big Brother Is Watching: UK police to increase use of AI facial recognition despite inaccuracies

The UK government wants its police forces to use AI-based facial recognition to solve more crimes and to bring charges against perpetrators, despite repeated warnings from activists and AI experts that the technology is inaccurate and prone to misidentification.

The UK’s Home Office is planning to expand the use of facial recognition technology across law enforcement and other security agencies to track and locate criminals. The technology the Home Office is using has already made headlines on a number of occasions for identifying the wrong people.

New biometrics to be added
In a document released on Wednesday, the government outlined its intention to potentially deploy new biometric systems at a national level within the next 12 to 18 months.

This move follows criticism from privacy advocates and independent researchers who have raised concerns about the technology’s accuracy and bias, especially in its treatment of people of colour.

Members of Parliament have previously called for a halt to its use on the general population until Parliament establishes clear legislation.

The government is now inviting submissions from companies offering technologies capable of “resolving identity using facial features and landmarks,” including live facial recognition systems that involve scanning the general public to identify specific individuals on police watch lists.

The Home Office is particularly interested in innovative AI technologies that can efficiently process facial data for individual identification, as well as software that can be integrated with the department’s existing technologies and CCTV cameras.

Facial recognition’s past failures in the UK
Facial recognition software has been tested in public spaces over the past five years by police forces such as South Wales Police and London’s Metropolitan Police, with trials in shopping centres and at events such as the Notting Hill Carnival and the recent coronation.

According to a Financial Times report, private entities, such as the owners of King’s Cross in London, used facial recognition to scan the public for known troublemakers and shared this data with the Metropolitan Police, although they have since ceased the practice.

EU to ban AI-based facial recognition?
In contrast, the European Parliament is moving toward banning the use of AI-driven facial recognition software in public spaces through its Artificial Intelligence Act. The legality of using live facial recognition on the general population in the UK is currently a subject of debate, as is the question of whether widespread use of the technology infringes on citizens’ rights.

It’s worth noting that not only law enforcement but also schools and private retailers, including the Southern Co-op and J Sainsbury, have begun adopting facial recognition technology.

In 2020, the Court of Appeal ruled that earlier trials of facial recognition software by South Wales Police were unlawful, although the force continues to use the technology. The Metropolitan Police recently announced that it had conducted a review of the technology’s effectiveness and found “no statistically significant bias in relation to race and gender”, with a low chance of false matches.

UK going the China route
Researchers specialising in facial recognition and technology ethics have stressed the need for a proper legislative framework before widespread use, and have often compared the UK’s approach to China’s generalised use of live facial recognition without any rules or regulations.

The government’s Defence and Security Accelerator, which operates under the Ministry of Defence, is overseeing the submission process for the Home Office’s facial recognition technology initiative.

The Home Office has emphasised that facial recognition technology is already in use across UK policing and security, including to prevent and detect crime, enhance security measures and locate individuals wanted by law enforcement, among other functions. Expanding the use of this technology is deemed a top priority for the department.

Last year, an independent review led by former deputy mayor of London, Matthew Ryder, highlighted the “urgent need” for new legislation regarding live facial recognition technology. This need was identified through an analysis of existing laws pertaining to human rights, privacy, and equality, which were found to be insufficient in addressing the challenges posed by this technology.
