France: AI surveillance cameras erected for Olympics may become permanent

AI-powered mass surveillance cameras being erected across Paris in preparation for the 2024 Olympics may be there to stay, the French government has said.

France’s Parliament drew heavy criticism in March when it authorized a mass surveillance network to be built across the capital to tackle “security risks” during next year’s Olympic and Paralympic Games. The network includes public-facing cameras equipped with facial recognition technology which will scan faces and objects and process the images in real time using an AI algorithm.

Over 60 MPs opposed the bill authorizing the cameras, and groups such as Amnesty International condemned the decision over potential privacy violations. But in May the French Constitutional Council — France’s highest constitutional authority — ruled in favor of the bill, which permits the cameras to remain until March 31, 2025.

Then, in late September, France's Sports Minister Amélie Oudéa-Castéra said the government is considering making the cameras permanent fixtures.

"Intelligent video surveillance will only be valid from [sic] the Olympic Games and it will apply, for the future, under this experimental phase, for other major events, provided that they present a particular risk to the safety of people," Oudéa-Castéra told France 3according to Inside the Games.

"Obviously, there will be no extension of this experiment without a precise and transparent evaluation of its effectiveness with regard to the security challenges of our country.

"If it proves itself and is surrounded by guarantees, the French expect us to act for their security and to use new means, including digital, to promote this security." 

Last month Oudéa-Castéra told the French Senate she is also not ruling out deploying the army for the Games, though the question will remain open until the beginning of next year.

The minister’s remarks regarding extending the surveillance “experiment” indefinitely come as neighboring Britain also moves to increase surveillance.

Last week British Policing Minister Chris Philp penned a letter to police forces nationwide urging them to increase their use of both passive and active facial recognition searches.

Passive, or retrospective, facial recognition (RFR) involves police combing through CCTV footage after a crime is committed and matching the suspect's face against a police database.

“Every force uses RFR to some extent already, but its use is very variable between Forces and could be greatly increased,” Philp wrote.

In the UK, police can search their own individual database or tap into the Police National Database, which Philp recommends.

“Searching the whole Police National Database (PND) image set rather than just local force ones will maximise the chance of a match, and I encourage routine use of RFR across the entire range of crimes.”

Active — or live — facial recognition (LFR), on the other hand, involves law enforcement using “special purpose cameras” to scan crowds for people on police watchlists.

“I am also very supportive of the use of Live – or Active – Facial Recognition (LFR) to deter and detect crime in public settings that attract large crowds,” Philp wrote, assuring police there is a “sound legal basis” for its use. To illustrate the benefit of using LFR, Philp cited a recent sports game where the use of LFR helped police arrest three people, including “one who admitted using threatening and abusive words and being in breach of a court order.”

But UK law enforcement has recently come under fire for how it manages those databases and watchlists. Police have yet to delete more than three million images of people who were never charged with a crime, despite a court ordering them to do so in 2012.

“So when we’re having conversations about new technologies such as facial recognition, the conversation often comes back to: ‘Why would we trust you to get this bit right? When you’ve still got legacy problems from 10 years ago from other images?’ People want to know with facial recognition: how do they find their way onto a watch list, and how could they get off it? And that’s really important,” UK Biometrics and Surveillance Camera Commissioner Fraser Sampson told the Guardian.

Police watchlists are also known to include people who are not criminal suspects. Live facial recognition was used at the Formula 1 Aramco British Grand Prix in July, but a Freedom of Information request later revealed that only 234 of the 790 names on the police's LFR watchlist were criminal suspects.