Cataloging and filing images for search is an easier task with Google’s advances in photo identification and categorization technology.
But like all technology, it is far from perfect. Very far, it would seem.
Google is at the forefront of developing advanced artificial intelligence to identify and classify photos.
Every once in a while, however, even the best AI gets it wrong – like that time Google Photos identified a group of African Americans as “gorillas.”
Not only was the label offensive, it represented a failure on Google’s part that many deemed unacceptable. Google apologized and promised to do better in the future.
Part of fulfilling that promise was removing the label “gorilla” altogether, along with other words that could be interpreted in an offensive way.
User Jacky Alciné brought the matter to Google’s attention through Twitter, and Google acted immediately to remove the label from group photos. Now the term is gone entirely.
Wired magazine put Google Photos through the wringer to see if it could trip up the new algorithm in labeling photos. The publication reported impressive results overall, but noted that some searches returned no results at all, among them “gorilla,” “chimp,” “chimpanzee,” and “monkey.” Photos featuring other primates, by contrast, returned results like “baboon,” “gibbon,” “marmoset,” and “orangutan.”
Searching for a specific ethnicity apparently no longer returns results either.
A spokesperson for Google confirmed that the term “gorilla” was banned in the immediate wake of the controversy, while other terms were added later. The representative cited the imperfect nature of AI-based technology and emphasized that it is still in its infancy. The removal of these labels may or may not be permanent; it may last only until the technology is advanced enough to avoid public relations disasters like the initial mislabeling.