Facial recognition technology is going to be controversial no matter how you cut it, but it probably isn’t doing itself any favors when the people trying to push it forward use ethically questionable methods.
From sensors in China that can read a crowd and identify everyone in it to today’s story of contractors for Google’s Pixel 4 using homeless people and students to train facial recognition tech, there’s just a lot of sketchy stuff going on in this field.
The Verge reports that some contractors for Google not only used homeless people to help train the Pixel 4’s facial recognition technology (giving them $5 gift cards in return), but also used college students who might not have been aware they were being recorded.
It gets worse. The Verge reports that the company, Randstad, sent out teams in Atlanta with the purpose of recording the faces of homeless people with darker skin. If ever there was a moment for the word “yikes,” this is probably it.
Specifically, the New York Daily News says that the teams were directed to “go after people of color, conceal the fact that people’s faces were being recorded and even lie to maximize their data collections.” The NY Daily News also quoted an ex-staffer as saying, “They said to target homeless people because they’re the least likely to say anything to the media…The homeless people didn’t know what was going on at all.”
As The Verge points out, it isn’t out of the ordinary for Google to need more data in one area or another, but the method here leaves a lot to be desired. Further, it is probably not the best marketing to “target” any group with any kind of campaign – research or otherwise. It just doesn’t look good. As The Verge reports, neither company has commented on the New York Daily News’ original report, which you can read here.
What do you think? Sketchy behavior by contractors or a total misunderstanding? Let us know your thoughts in the comments below.
Also, don’t forget to check out our photography news articles on Light Stalking by clicking here.