The facial-recognition system at King’s Cross is to be investigated by the UK’s data-protection watchdog.
Media exposure of live facial recognition at the site prompted the Information Commissioner’s Office (ICO) to look into how it was being used.
The ICO will inspect the technology in place and how it is operated to ensure it does not break data protection laws.
The regulator said it was “deeply concerned” about the growing use of facial-recognition technology.
Fair and transparent
The Financial Times was the first to report a live face-scanning system was being used across the 67-acre (0.3-sq-km) site around King’s Cross station in London.
Developer Argent said it used the technology to “ensure public safety” and it was just one of “a number of detection and tracking methods” in place at the site.
But the use of cameras and databases to work out who is passing through and using the site has proved controversial.
So far, Argent has not said how long it has been using facial-recognition cameras, what the legal basis for their use is, or what systems it has in place to protect the data it collects.
In its statement, the ICO said: “Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all.”
The regulator said it was keen to ensure that the King’s Cross developer was using the technology in accordance with UK laws governing the use of data.
“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way,” said the ICO.
It must have documented how and why it believed its use of the technology was legal, proportionate and justified, it added.
Argent has not yet responded to a BBC News request for comment.
The mayor of London is also quizzing developer Argent about its use of facial-recognition systems.
Sadiq Khan wrote to the company and said there was “serious and widespread concern” about the legality of facial recognition.