Rite Aid Corp. has been banned from using facial-recognition technology driven by artificial intelligence to stop shoplifting after federal regulators determined it had falsely flagged a disproportionate number of women and people of color.

In a settlement with the Federal Trade Commission, the pharmacy chain has agreed not to use the technology for the next five years. The system relied on AI to identify people it believed had previously shoplifted from its stores.

According to court filings, Rite Aid was accused of using the technology in hundreds of its stores from 2012 through 2020. The FTC said the program routinely misidentified shoppers, matching innocent customers to people flagged as prior shoplifters and leading staffers to falsely accuse them of having stolen from the stores.

The errors most often involved women and people of color, the FTC suit said. 

According to the filing, Rite Aid had also provided customer data to third-party companies it worked with in developing the security program, a violation of consumer privacy protections.

“Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk,” said Samuel Levine, director of the FTC’s Bureau of Consumer Protection.

As part of the settlement, Rite Aid agreed to delete all biometric information it had collected from its customers.

In a statement, Rite Aid said it respected the FTC’s findings about the consumer privacy violations but “fundamentally disagreed” with the allegations about the functioning of its facial-recognition systems, saying they related only to a pilot program the company had stopped years earlier.

“The allegations relate to a facial recognition technology pilot program the company deployed in a limited number of stores. Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC’s investigation regarding the company’s use of the technology began,” the statement read.

Rite Aid filed for Chapter 11 bankruptcy protection in October, and the company said its agreement with the FTC would require approval from the bankruptcy court.

The FTC said earlier this year that it would be more closely monitoring the use of biometric information for security.