Co-op is using facial recognition technology across more than a dozen stores to scan shoppers' faces in real time in a bid to reduce crime and abuse against staff.
Southern Co-op is using the controversial technology in 18 stores across the country, while other regional Co-op franchises are now understood to be trialling its use.
According to Wired, the retailer, which has experienced a sharp rise in crime levels during the pandemic, began introducing the technology over the last 18 months.
Co-op has reportedly introduced a facial recognition system from Facewatch, which scans the faces of shoppers when they enter the store to check them against a watch list of known suspects.
The system will then alert store staff via their smartphone immediately if someone “who has a past record of theft or antisocial behavior” enters the store.
The Court of Appeal has previously criticised the lack of transparency around the creation of such watch lists, which are understood to be compiled based on decisions from Co-op staff.
In July, Co-op announced it was issuing body cameras to staff in response to an “unprecedented crime wave”, which it says is largely carried out by repeat offenders, who can become abusive when confronted by staff.
A Co-op spokesperson told Wired that “only images of individuals known to have offended within our premises, including those who have been banned/excluded, are used on our facial recognition platform.”
“Using facial recognition in this limited way has improved the safety of our store colleagues,” he added.
Despite the apparent reduction in crime, the use of facial recognition technology remains a controversial topic, and its largely silent adoption across the private sector has drawn major criticism from privacy advocacy groups.
Co-op’s use is largely permitted under GDPR, as the processing is classed as serving a “legitimate business interest”: helping minimise crime and improve employee safety.
However, Ioannis Kouvakas, legal officer at Privacy International, says: “You still need to be necessary and proportionate. Using an extremely intrusive technology to scan people’s faces without them being 100 per cent aware of the consequences and without them having the choice to provide explicit, freely given, informed and unambiguous consent, it’s a no go.”