The UK government wants to expand the use of AI-based face recognition by police

The Home Office wants to expand the use of controversial facial recognition technology across the police and other security agencies to find and track criminals.

In a document published on Wednesday, government officials outlined their ambitions for the potential deployment of new systems across the country over the next 12-18 months.

Privacy campaigners and academics have criticized the technology as inaccurate and biased against people with darker skin. MPs had previously called for a moratorium on its use on the general public until parliament establishes clear laws.

The government has asked companies to submit proposals for technologies “that can resolve identity using facial landmarks and features”, such as live facial recognition, which involves screening the public against police watchlists and flagging individuals identified by the technology.

The Home Office is particularly interested in artificial intelligence technologies that can efficiently process facial data to identify individuals, as well as software that can be integrated with existing systems and CCTV cameras.

South Wales Police, London’s Metropolitan Police, and other forces have used facial recognition software in public places over the last five years, including in shopping malls, at events such as the Notting Hill Carnival, and more recently during the coronation.

Previous reports revealed that the private owners of a site in King’s Cross, London, had used facial recognition to scan the public for known troublemakers and shared the data with the Metropolitan Police. The site has since stopped using the technology.

By contrast, the European Parliament is moving to ban AI-powered facial recognition in public places through its Artificial Intelligence Act.

Whether live facial recognition should be deployed on the general public, and whether doing so violates people’s rights, remains the subject of intense debate.

UK police forces are not the only ones adopting the technology. Private retailers such as Southern Co-op and J Sainsbury have also begun using it.

South Wales Police continues to use facial recognition technology, despite a Court of Appeal ruling against its use in 2020. The force said it would give “serious consideration” to the court’s findings and that it had changed its policies since the trials.

The Met Police published a report last month reviewing the effectiveness of the technology. It found “no statistically meaningful bias in relation to race or gender” and that the chance of a false match was just one in 6,000 people passing the camera.

Stephanie Hare, an independent researcher and technology ethicist who specializes in facial recognition, believes there should be a legal framework. “What I do not like is China’s version of generalised live facial recognition,” she said. “We have to regulate and put guardrails around it.”

Proposals are to be submitted to the Ministry of Defence’s Defence and Security Accelerator on behalf of the Home Office.

The Home Office said that facial recognition technology was being used to prevent and detect crimes, improve security and find wanted criminals.

Matthew Ryder, a barrister and former deputy mayor of London, conducted an independent legal review last year. After analyzing existing laws covering human rights, privacy, and equality, he found them inadequate. The report recommended a moratorium on live facial recognition surveillance of people in both public and private places.

Paul Taylor, the chief scientific advisor for national policing, said he “strongly supported” the use of facial recognition technology in law enforcement and was “encouraged” by its potential.

He said that by establishing robust governance structures, implementing strict data protection protocols, and ensuring accountability and transparency, it would be possible to strike a balance between individual privacy rights and public safety.