Despite their efforts to improve the shopping experience, some retailers end up alienating customers with the very technology that's supposed to help them. The frustration surrounding self-checkout kiosks alone has sparked a debate about how some advances can feel like a step backward. But other changes can have an even more serious impact when they involve our personal information. And now, Rite Aid is under fire from federal officials for its use of facial recognition in stores.
On Dec. 19, the Federal Trade Commission (FTC) and the drugstore chain reached a settlement over a complaint alleging that the retailer had overstepped privacy boundaries by using the AI-backed technology on shoppers, CNN reports. The agency claimed that from 2012 to 2020, Rite Aid used facial recognition to identify customers "deemed likely to engage in shoplifting or other criminal behavior" and to block them from entering stores or remove them from the premises.
However, the FTC says the burgeoning technology often malfunctioned and falsely flagged individuals as criminals, leading employees to publicly accuse shoppers of theft, detain them, and subject them to unwarranted searches.
The agency's complaint also states that the technology created a "heightened risk" of racial bias against shoppers, saying that more false positives were generated in predominantly Black and Asian communities than in White ones.
Specific complaints against Rite Aid include an instance in which an 11-year-old girl was flagged as a thief and traumatized by the experience and another in which a Black woman had police called on her after facial recognition software erroneously targeted her, The Washington Post reports.
The agency also states that shoppers were never informed that the technology was being used, and store employees were told not to publicly admit to its use.
"Rite Aid's reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers' sensitive information at risk," Samuel Levine, director of the FTC's Bureau of Consumer Protection, said in a statement. "Today's groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices."
In a statement responding to the settlement, Rite Aid said it was "pleased to reach an agreement" with the agency. However, the company added: "We fundamentally disagree with the facial recognition allegations in the agency's complaint," saying that it had only rolled out the technology as a "pilot program...in a limited number of stores" and had stopped using it more than three years ago.
"Rite Aid's mission has always been and will continue to be to safely and conveniently serve the communities in which we operate," the company wrote. "The safety of our associates and customers is paramount. As part of the agreement with the FTC, we will continue to enhance and formalize the practices and policies of our comprehensive information security program."
As a result of the settlement, Rite Aid will be banned from using facial recognition technology in its stores for five years and must delete any images it collected as part of the program. According to the agency, the company must also keep the FTC updated on its compliance measures.
Some experts say the company's settlement could profoundly impact potential future uses of facial recognition.
"These are the types of common sense restrictions that have been a long time coming to protect the public from reckless adoption of surveillance technologies," Joy Buolamwini, an AI researcher with research background on the racial biases of the technology, told The Post. "The face is the final frontier of privacy, and it is crucial now more than ever that we fight for our biometric rights, from airports to drugstores to schools and hospitals."