Luggage left unattended in a public place, a fire breaking out, a person moving against the crowd or carrying a weapon, a minor trying to buy a pack of cigarettes… All these situations have one thing in common: they can be filmed and analyzed in real time by so-called "augmented" cameras, thanks to the use of artificial intelligence.
This emerging technology, known as algorithmic video surveillance, was tested within the framework of the Olympic and Paralympic Games. In total, 185 cameras operated by the Paris police prefecture scanned footage in real time for so-called "risk-prone" behavior during the competition, in Parisian public transport and around the event sites. The Paris police prefect, Laurent Nunez, said during a hearing before the law committee that he was in favor of generalizing this "decision-support" tool, and the office of the Minister of the Interior, Bruno Retailleau, later confirmed that this option was indeed "under review".
Several associations, including La Quadrature du Net, have filed legal challenges against this technology, which they say has been used by police services without prior declaration. They argue that it collects biometric data, such as clothing and facial expressions, and that such collection violates European and French law on the protection of personal data.
A lack of caution
While the use of algorithmic cameras has caused plenty of ink to flow in the policing sphere, it makes far less noise in the commercial sphere, notes Jérôme Durain, senator for Saône-et-Loire and member of the committee evaluating Article 10 of the Olympic Games law, which covers algorithmic image processing. "The private sector is a blind spot in society's vigilance on this area of artificial intelligence," he says. Indeed, some 200 French tobacco shops have already acquired "smart" cameras over the past year, programmed to check customers' age before selling them cigarettes or gambling products.
The French company Bergens has developed a technology that scans the customer's face once they come within a meter of the counter and estimates their age, using a model trained anonymously on thousands of faces. The company states that the software takes no photographs and collects no biometric data. According to Bergens, the cameras deployed in tobacco shops operate with a margin of error of about one year between the estimated and the actual age. As a precaution, however, the algorithm was designed to tend to overestimate rather than underestimate age.
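To make the margin-of-error reasoning concrete, here is a minimal sketch in Python of how an age estimate with a known error band of roughly one year might gate a sale. The names and the decision rule are hypothetical illustrations, not Bergens' actual (unpublished) implementation: any estimate falling within the error band around the legal threshold simply prompts the clerk to ask for ID.

```python
from dataclasses import dataclass

LEGAL_AGE = 18          # minimum age for tobacco and gambling sales in France
ERROR_MARGIN_YEARS = 1  # approximate margin of error reported by Bergens

@dataclass
class AgeCheckResult:
    estimated_age: float
    requires_id_check: bool

def check_customer(estimated_age: float) -> AgeCheckResult:
    """Decide whether the tobacconist should ask for an ID document.

    `estimated_age` would come from an on-device age-estimation model
    (hypothetical here); only this scalar estimate is used, no image
    or biometric template is stored.
    """
    # Treat any estimate within the error band around the legal age as
    # uncertain and ask for a manual ID check.
    requires_id = estimated_age < LEGAL_AGE + ERROR_MARGIN_YEARS
    return AgeCheckResult(estimated_age, requires_id)

if __name__ == "__main__":
    for age in (16.0, 18.4, 23.0):
        result = check_customer(age)
        print(f"estimated {age:.1f} years -> ID check: {result.requires_id_check}")
```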
Conclusions on video surveillance expected in December
Whether used commercially or by the government, the technology's effectiveness has been called into question. In 2023, Bergens, for example, had to update its algorithm after tobacconists noticed that it miscalculated the ages of people of color. These errors of assessment stem from discriminatory biases, which artificial intelligence is known to amplify.
On the law enforcement side, a senatorial report last April exposed the limits of the intelligent cameras deployed on an experimental basis during a Depeche Mode concert a month earlier, concluding that the device fell "far from the set objectives". Since the Olympic Games, however, the system has improved, Laurent Nunez assured at the end of September. "The results of the experiment are positive for us," he boasted, while acknowledging that the technology had not led to any arrests.
Jérôme Durain is categorical: "For the moment, these technologies have been of no use. Some use cases do not work at all, notably the detection of abandoned weapons and packages." The senator was also surprised by the executive's eagerness, moving toward a possible extension of the system even before the evaluation committee delivered its conclusions, expected on December 31, 2024. Since then, Matignon has backpedaled, saying it would wait for the committee's opinion before embarking on the path of generalization.