A Detroit roller rink equipped with facial recognition reportedly kicked out a Black teenager on June 10 after the technology misidentified her as a person who’d allegedly gotten into a fight there in March.
According to Gizmodo, the girl, Lamya Robinson, says security scanned her face upon entry and then barred her from the building, despite her insistence that she’d never been there before.
WJBK reports Robinson’s parents are considering filing a lawsuit against Riverside Arena skating rink.
In a statement to WJBK, the rink confirmed that it used the technology, saying the system flagged Robinson as a 97 percent match for the other girl.
“One of our managers asked Ms. Robinson (Lamya’s mother) to call back sometime during the week,” the business said. “He explained to her, this [is] our usual process, as sometimes the line is quite long and it’s a hard look into things when the system is running.”
“This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of, if there was a mistake, we apologize for that,” the business added.
“To me, it’s basically racial profiling,” Lamya’s mother, Juliea Robinson, told the news station. “You’re just saying every young Black, brown girl with glasses fits the profile and that’s not right.”
“I was like, that is not me. Who is that?” Lamya added. “I was so confused because I’ve never been there.”
The incident comes as advocacy groups push to ban business owners from using facial recognition on customers and workers in their stores.
Tawana Petty, who heads Data 4 Black Lives, one of 35 organizations that have signed onto a campaign calling on retailers not to use facial recognition, says Robinson’s experience is far too common.
“Facial recognition does not accurately recognize darker skin tones,” Petty said. “So, I don’t want to go to Walmart and be tackled by an officer or security guard, because they misidentified me for something I didn’t do.”
The Algorithmic Justice League, a digital advocacy organization based in Cambridge, Massachusetts, was founded in 2016 by MIT computer scientist Joy Buolamwini. The AJL’s mission is to raise awareness of the social implications of artificial intelligence through art and research, and it is compiling stories of AI gone wrong, particularly cases in which Black people are misidentified and discriminated against.
As more companies put these identification systems into place without regulation, more incidents like Lamya Robinson’s are all but certain to follow.