GOOGLE PHOTOS THINKS BLACK PEOPLE ARE PRIMATES
The origin of this story is not new; it dates back two years, to when the object-recognition algorithm behind Google Photos started misidentifying black people as gorillas. Yes, that's right. Gorillas. But two years later, Google's response is still official policy. What was that response? Simply to censor the words "monkey," "chimpanzee," "chimp" and… "gorilla." So in case it's not clear yet: Google didn't fix the algorithm, and it didn't engineer a new one that works properly. It simply "fixed" the issue by banning the names of a few primates. No labels, no automated mistagging of images of black people.
BLACK ENGINEER STUNNED TO SEE FRIENDS LISTED AS GORILLAS
The whole situation arose two years ago, after a black software engineer named Jacky Alciné ran into the issue directly: the algorithm labeled pictures of his friends as gorillas. Obviously Alciné was not pleased, and neither was Google, to say the least. So Google simply "blinded" the algorithm so that it could no longer apply those labels at all. As of today, that's still the fix. Journalists recently tested this by running 40,000 images through Google Photos, the standalone app that uses AI to categorize and group images automatically.
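To be clear about what that kind of "fix" actually is: it isn't a retrained model, it's a filter bolted onto the model's output. Here's a minimal sketch of the idea in Python, assuming a hypothetical classifier that returns scored labels. The function names and the blocklist are illustrative stand-ins based on news reports, not Google's actual code:

```python
# Illustrative sketch of a label-blocklist "fix" (hypothetical, not Google's code).
# Instead of retraining the model, the offending labels are simply suppressed
# after the classifier has already made its predictions.

BLOCKED_LABELS = {"gorilla", "chimpanzee", "chimp", "monkey"}  # per news reports

def classify_image(image_path):
    """Stand-in for a real image classifier; returns (label, confidence) pairs."""
    # A real system would run a neural network here. Hardcoded for the sketch.
    return [("primate", 0.91), ("gorilla", 0.88), ("outdoor", 0.75)]

def safe_labels(image_path):
    """Return the classifier's labels with blocklisted terms dropped."""
    return [
        (label, score)
        for label, score in classify_image(image_path)
        if label.lower() not in BLOCKED_LABELS
    ]

print(safe_labels("friends.jpg"))
# [('primate', 0.91), ('outdoor', 0.75)]
```

Note what a design like this implies: the model presumably still makes the offensive prediction internally; the blocklist just keeps the word from ever reaching the user.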
GOOGLE SIMPLY CENSORS OFFENSIVE TERMS, NO FIX IN TWO YEARS
Some are looking at this as an accountability problem for apps and algorithms that use AI. Does Google get to hide behind an "automation" excuse for massive racial gaffes like this one? Or should the technology giant be embarrassed for ducking the issue? Why is it so hard for Google, with all its resources, to simply fix the problem? As of now there's no clear answer. But Google is hardly alone; the last decade is full of these types of issues.
JEW HUNTER, THE N-WORD AND OTHER PLATFORMS’ SNAFUS
Just last year, Facebook and its algorithm came under fire when "jew hunter" was listed as a valid form of employment. Then there's the hot potato from WeChat's Chinese-to-English translation service, which offered up the n-word as a translation in a message about an African American woman arriving late for work. And way back in 2010, Nikon's face-detection software kept asking whether East Asian subjects were blinking. In every photo, every time. So clearly these things will keep coming up as more and more of our digital lives get managed in bulk by automated systems.
JANE GOODALL SHIT OUT OF LUCK WITH GOOGLE PHOTOS
So for now, good luck if you need Google Photos to identify a "monkey," "chimpanzee," "chimp" or a "gorilla." Those labels are on vacation so Google can try to be PC.