French police are deploying video surveillance powered by artificial intelligence (AI) at two Paris metro stations used by fans attending pop superstar Taylor Swift's concerts in the city.
The decree, published by the capital's Prefecture de Police earlier this week, justified the use of the controversial technique by saying that "in the current context, these concerts are events that are particularly exposed to the risk of acts of terrorism".
The deployment will last a week from May 7 to 14 at the Nanterre Préfecture and La Défense Grande Arche metro stations, both of which serve the Paris La Défense Arena.
A bill that authorised the use of AI-driven surveillance was voted into law by the French Assemblée Nationale and the Senate in an accelerated procedure in May 2023 ahead of the 2024 Olympic Games in Paris this summer.
However, several experts have already raised concerns about the use of the technology.
What is AI-powered video surveillance?
Known as "algorithmic video surveillance" ("vidéosurveillance algorithmique" in French, or VSA), the technique uses AI-powered software to analyse, in real time, the video feeds coming from surveillance cameras.
Done by a team of humans, this task would require far more resources, as it demands constant attention.
The algorithm is designed to recognise suspicious behaviour, such as a suitcase being left behind, and sound an alert.
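The article does not describe how Cityvision's detection works internally. As a rough illustration only, the "left suitcase" rule described above could reduce to tracking how long a detected object stays stationary; everything here (the function, the threshold, the frame counts) is a hypothetical sketch, with the actual object detection from video assumed to be done by a separate computer-vision model:

```python
# Hypothetical sketch of an "abandoned object" rule: an object that has
# been stationary for longer than a threshold triggers an alert.
# The threshold value is an assumption, not taken from the article.
ALERT_AFTER_FRAMES = 250  # e.g. ~10 seconds of footage at 25 fps


def check_abandoned(object_tracks: dict[int, int]) -> list[int]:
    """Return the IDs of objects stationary past the threshold.

    object_tracks maps an object ID (assigned by an upstream tracker)
    to the number of consecutive frames it has been seen without moving.
    """
    return [
        obj_id
        for obj_id, still_frames in object_tracks.items()
        if still_frames >= ALERT_AFTER_FRAMES
    ]


# Example: object 7 (a suitcase) has sat still for 300 frames,
# object 2 (a person) only for 5 frames.
alerts = check_abandoned({7: 300, 2: 5})
print(alerts)  # → [7]
```

A real system would combine such a rule with per-frame object detection and tracking; the point of the sketch is simply that the alert logic need not involve identifying any individual, consistent with the legal framework cited below.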
However, these types of surveillance devices "do not use any biometric identification system, do not process any biometric data and do not use any facial recognition techniques," according to the French legal framework.
But some critics oppose techniques such as VSA, including the French advocacy group La Quadrature du Net which has voiced doubts about its efficiency in securing an event like the Swift concerts or the Olympics.
"Because algorithmic video surveillance works with machine learning, you need past situations to teach the algorithm to spot this kind of situation in the future. But we don't have very large quantities of images of terrorist attacks or crowd movements," Bastien Le Querrec, a legal expert working for the group, told Euronews Next.
"In reality, what we see is that this technology is designed, trained, and effective on very low-level delinquency".
Why is it controversial?
For its opponents, algorithmic surveillance poses risks to individual freedoms and rights.
"What we've been saying since 2019 is that this algorithmic video surveillance is a higher and almost absolute level of surveillance," Le Querrec said.
"By automating this surveillance, we increase its impact tenfold. Surveillance has very strong consequences on fundamental freedoms," he added, mentioning the possible implications for freedom of expression, freedom of association, and freedom of association.
The association isn’t the only one to worry about the system. The NGO Amnesty International France said in a statement that the technology relied on collecting personal data, which it called "concerning regarding privacy rights".
"Algorithmic video surveillance carries with it the risk of stigmatising certain groups of people and the risk of discrimination," the NGO added.
Le Querrec raised similar concerns, for example, about it being used to discriminate against homeless people.
Opponents also worry that the use of AI-powered video surveillance won’t be a temporary measure.
The 2023 law authorises its use for the Paris Olympics and Paralympics, but also provides for its use at sporting and cultural events on an experimental basis until March 31, 2025, six months after the end of the Games.
However, the French minister of sport and the Olympics, Amélie Oudéa-Castéra, has already signalled that the government could extend its use for events well into the future "if it [the technology] proves its worth".
"The Olympic Games law is the first step towards legalising AI-driven video surveillance," said Le Querrec, adding that a bill was already in discussion planning to deploy it on public transport.
The official text of the law states that it plans "to deploy algorithmic processing to select and export images requisitioned by the courts".
Data stored for a year
Since the law was passed, AI-powered video surveillance has been sporadically deployed and tested during large-scale gatherings such as concerts and football matches in France.
In Paris, the technical aspects are assigned to the company Wintics - one of the start-ups benefiting from the surveillance contracts given by the Ministry of the Interior - and its software Cityvision.
For each use of the technology, the corresponding prefecture is required to publish a decree.
However, several critics have pointed out that by publishing the decrees so late - a day before deployment - the police are trying to avoid legal challenges as much as possible.
"People who are going to attend Taylor Swift concerts will not be able to take legal action if they feel that this order is illegal and disproportionately infringes protected rights or fundamental freedoms,” Le Querrec said.
"There is a kind of organisation of impunity on the part of the police prefect".
The Paris Police Prefecture did not immediately respond to Euronews Next’s request for comment but its decree states that Cityvision "has been attested to as compliant" by the Interior Ministry and that the agents "received training in the protection of personal data".
The data will be stored for a year, the police decree added.