Testing face recognition on people of color isn't a bad thing, unless you do it the wrong way

It's no secret that any software that does more than a cursory "look" at people tends to have issues when the person using it has darker skin. Microsoft had issues with the Kinect, and even the simple heart-rate sensor on your smartwatch can struggle here. Google is trying to avoid these kinds of problems by training its smart face unlock on people of color, in the hope that more data fed into the system can help overcome the issue.

You can't use any sort of face recognition software unless it works for everyone.

I think most of us can agree that this is a good thing. We're all people, and though skin color is simply a cosmetic difference, it's a valid technical concern in cases like this. Working to make it less of one is a great idea. But it's always going to be something that's a little uncomfortable to talk about, because it focuses on the idea that skin color implies a difference beyond the visual. It's human nature to try not to offend and to politely shy away from anything outside our comfort zone.

More: I'm big and black and heart rate monitors are terrible for people like me

The issue isn't really anything we should feel uncomfortable about; that's just how we're wired. But taking on something that's already controversial and bungling it beyond belief is something Google should know better than to do, and having third-party contractors deceive people and target vulnerable folks like the homeless to gather this data is downright stupid.


In case you're not aware, that's exactly what Google is accused of doing, according to the New York Daily News. In interviews, current and former contractors claim they were directed to target the homeless because they would be less likely to talk to the media, to tell people they were simply playing a "selfie game," and to point out that the $5 gift card offered in return could be exchanged for cash in certain states.

As mentioned, gathering this data is for a good cause. We saw Apple do the same thing prior to the launch of Face ID, and for the same reason: the only way to fix face recognition working poorly with darker skin is to collect more data. Google needs that data for the launch of the Pixel 4. The problem isn't what Google was doing here, or why it was doing it; it's how.

The problem isn't what Google is doing, it's how.

Using trickery to exploit anyone is never a good look. When you're collecting data and targeting a group of people for a specific reason, you need to be up-front with them and let them know why you need their help. I can't speak for anyone besides myself, but I don't think someone with darker skin would see what Google is trying to do as a bad thing. The "I have black friends" trope is tired and old, but I do, and none of them think that collecting more data to make face recognition better for people of color is something Google needs to hide. This small sample isn't conclusive of anything in itself, and once again, the whole subject of skin color can be touchy. But there's only one way to go about this sort of testing: honestly.

This whole mess makes a bad situation worse. Now it seems like Google may believe that most homeless people aren't white, or that people of color are easier to fool, and that makes me more than a little upset. I'm sure I'm not alone, and plenty of people may think twice the next time Google asks for an opinion or some personal data.

Google, be better.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Threads.