Rekognition misidentified darker-skinned women as men 31 percent of the time

Wednesday, July 24th, 2019

Amazon’s Rekognition face-recognition software doesn’t always work that well, particularly on people of color:

An MIT study released earlier this year found that Rekognition misidentified darker-skinned women as men 31 percent of the time, yet made no mistakes for lighter-skinned men.

Comments

  1. Felix says:

    These sorts of articles usually point out that AI systems learn from the examples they are given, which is then spun to wave away such unfashionable behavior.

    What generally goes unspun is that such systems, given more input, will probably come to outperform humans at telling similar faces apart. “They all look alike” will remain the exclusive domain of humans.

    BTW, a false-positive rate for one group is not the same thing as the absence of false negatives for another; the two headline numbers measure different errors on different populations (see the sketch after the comments).

  2. TRX says:

    If “darker-skinned” is a euphemism for “west African”, there’s a body type with a layer of fat over most of the body. That’s the body type parodied in some old cartoons.

    I personally have problems distinguishing male from female with many people of that body type, without markers like hairstyle, makeup, or clothing.

    For, say, white northern Europeans, there are a number of strong gender indicators, particularly around the nose and mouth. They teach that stuff in art class. Those indicators are less prominent in, or missing from, some racial types.

    I get the idea that a bunch of managers and programmers wrote some software without clearly understanding their problem set, or else they would have admitted that the accuracy of their software depends on race.

    That’s the sort of thing that makes people scream “racist!” or “sexist!” when they’re trying to keep you from getting a fat government contract…
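
A minimal sketch of Felix’s point about error types, using toy counts assumed for illustration (not the MIT study’s raw data): a binary gender classifier is scored per group, against each group’s own confusion matrix, so a 31 percent error rate on one group and a zero error rate on another are independent measurements, not two readings of the same dial.

    # Toy counts, assumed for illustration; not the MIT study's actual data.
    # Each group's error rate comes from that group's own predictions, so
    # the 31% figure (darker-skinned women misread as men) and the 0% figure
    # (no mistakes on lighter-skinned men) are separate measurements.

    results = {
        # (group, true label): (correct predictions, wrong predictions)
        ("darker-skinned women", "female"): (69, 31),
        ("lighter-skinned men",  "male"):   (100, 0),
    }

    for (group, label), (correct, wrong) in results.items():
        rate = wrong / (correct + wrong)
        print(f"{group}: {rate:.0%} of true-{label} faces misclassified")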
