Facebook apologized Friday (Sept. 3) after its artificial intelligence software asked users watching a video featuring Black men if they wanted to see more "videos about primates."
Calling the message “an unacceptable error,” the social giant disabled the topic recommendation feature and launched an investigation into its cause, NPR writes. The offensive video had been online for more than a year.
The video, posted by the Daily Mail on June 27, 2020, showed Black men in altercations with police officers and white civilians. According to The New York Times, the video ended with the prompt, “keep seeing videos about Primates.”
Former Facebook employee Darci Groves drew attention to the problem, calling it “egregious” and “unacceptable,” and asked her former colleagues to look into it, which they did.
A spokesperson for the company said it has made improvements to its artificial intelligence. “It’s not perfect,” the spokesperson told The Times, and there’s “more progress to make.”
This isn’t the first time artificial intelligence has gone wrong in this way. Back in 2015, Google apologized after its software labeled a photo of two Black people as gorillas. “We're appalled and genuinely sorry that this happened,” a Google spokesperson told the BBC.
(Photo by Marc Piasecki/Getty Images)