Over 20 AI experts call for Amazon to stop selling its facial recognition tech to law enforcement

More than 20 AI experts have called on Amazon to stop selling its facial recognition software to law enforcement.

Last week, two dozen industry experts signed a petition claiming that two of the retailer’s top executives responsible for its AI software had misrepresented research findings which suggest the technology works less accurately on women and people of colour.

This follows the release of a study into Amazon’s Rekognition technology by the MIT Media Lab’s Joy Buolamwini and the University of Toronto’s Deborah Raji, which found that error rates rose to 31 per cent when attempting to identify women with darker skin tones.

The petition subsequently said the response to these peer-reviewed research findings from Amazon’s general manager of AI, Matthew Wood, and its vice president of global public policy, Michael Punke, was “disappointing”.

It added that Amazon should “thoroughly examine all of its products and question whether they should currently be used by police”.


Signatories of the petition include Turing Award winner Yoshua Bengio, Anima Anandkumar, formerly of Amazon Web Services, and researchers from organisations including Google, Microsoft, Harvard and Berkeley.

“Facial analysis technologies, when they are packaged up as cloud-based services, can be appropriated for malicious intent, even in ways that the companies supplying them aren’t even aware of,” information science doctoral candidate and petition signatory Morgan Klaus Scheuerman told Quartz.

“Amazon has a lot of influence in the field of facial analysis technology. It is one of the major suppliers of these types of computer vision services and can help shape norms around its development and use.”
