Via a suggestion from Haunt Fox (@Haunt_Fox), I looked at a research paper claiming to show:
These findings illustrate the shooter bias toward both human and robot agents. This bias is both a clear indication of racism towards Black people, as well as the automaticity of its extension to robots racialized as Black.
See also Robots and Racism: New Study Suggests That Humans Apply Racial Biases, Stereotypes to Black and White Robots.
The study presented test subjects with a series of 128 images. In half of them a person or robot held a gun; in the other half the person or robot held some other object. Half of the people and robots had a dark skin color and half were white. The test measured subjects’ response times in making shoot/no-shoot decisions and their accuracy in making those decisions.
I found it “interesting” that the researchers did not break out the supposedly discovered bias by the racial identities of the study participants. Only seven of the 163 subjects in Experiment 1 identified as “Black or African American”, so I could be persuaded that is not an adequate sample size. But 19 of the 172 subjects in Experiment 2 identified as “Black or African American”. I would think that should be a sufficient sample to test one or more additional hypotheses.
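Whether 19 subjects is enough depends on how large the within-subject effect is. Here is a rough sketch of the power calculation using statsmodels; the effect sizes (d) are assumptions of mine, not numbers from the paper:

    from statsmodels.stats.power import TTestPower

    # Paired-test power for the n = 19 "Black or African American" subgroup
    # in Experiment 2. The effect sizes are my assumptions, not the paper's.
    for d in (0.5, 0.8):
        power = TTestPower().solve_power(effect_size=d, nobs=19, alpha=0.05)
        print(f"d = {d}: approximate power with n = 19 is {power:.2f}")

Roughly, a large effect (d = 0.8) gives power around 0.9 with 19 subjects, while a medium effect (d = 0.5) gives closer to a coin flip. So “sufficient” is plausible, though not guaranteed.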
For example, were “Black or African American” participants also biased against people and robots with dark skin tones? If those participants behaved essentially identically to the “White, Caucasian, or European American”, “Hispanic or Latino American”, and “Asian American” participants, then I would be strongly inclined to believe some aspect of the testing, rather than actual bias, caused what appeared to be a bias against dark-skinned people and robots. That is, unless it is claimed “Black or African American” people are also biased against their own race. From reading the study that could be true, but it wasn’t made as clear as I would have liked.
The authors did not mention doing this sort of validation of their test procedure, and they did not supply the raw data so that we could do the validation ourselves. This seems to me an obvious check on the validity of the experiment. If the racial identity of the subjects did not correlate with the time required to make shoot/no-shoot decisions, but there was a consistent bias toward shooting more quickly at black people and robots, doesn’t that strongly imply the bias is an artifact of the testing rather than a bias of the subjects?
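Since the raw data was not supplied I can’t run this check myself, but it would be trivial to do. A minimal sketch in Python, assuming trial-level data; the file name and column names are hypothetical:

    import pandas as pd

    # Sketch of the check described above, assuming trial-level data the
    # authors did not release. Hypothetical columns: participant_id,
    # participant_race, target_race ("black"/"white"), armed (True/False),
    # and rt_ms (response time in milliseconds).
    trials = pd.read_csv("shooter_trials.csv")
    armed = trials[trials["armed"]]

    # Mean response time per participant for each target race, armed trials only.
    rt = (armed.pivot_table(index=["participant_id", "participant_race"],
                            columns="target_race", values="rt_ms")
               .reset_index())

    # Positive values mean the participant shot black targets more quickly.
    rt["bias_ms"] = rt["white"] - rt["black"]

    # If the quick-to-shoot bias shows up equally in every participant group,
    # a testing artifact looks more likely than racial bias.
    print(rt.groupby("participant_race")["bias_ms"].describe())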
One could easily conclude they did not provide that information because it contradicted their predisposed conclusion. The study may well demonstrate the prejudice of the researchers rather than the prejudice of the study participants.
So, what could the experiment have measured other than a racial bias? As suggested by Haunt Fox:
Just looking at the images of the targets it seems to me there are some serious visual-contrast issues that might prove major confounders.
The researchers supplied eight of the 128 images they used. Here are two of them:
[Two of the supplied stimulus images.]
As Haunt Fox observed, there are significant contrast differences. The accuracy rates were generally slightly lower for the black people and robots, which is what you would expect if the pictures above are representative. But if difficulty identifying the gun in a low-contrast image contributed to the time differences, I would have expected the lower-contrast images to take longer, yet the reported shooter bias is that subjects were quicker to shoot the black people and robots. This isn’t making sense.
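The contrast question, at least, is easy to quantify. Here is a minimal sketch that computes the RMS contrast (the standard deviation of normalized luminance) of each stimulus image; the file names below are hypothetical:

    import numpy as np
    from PIL import Image

    def rms_contrast(path):
        # Grayscale luminance scaled to [0, 1]; RMS contrast is its
        # standard deviation across the image.
        lum = np.asarray(Image.open(path).convert("L"), dtype=float) / 255.0
        return lum.std()

    # Hypothetical file names standing in for the supplied stimulus images.
    for path in ("white_robot_gun.png", "black_robot_gun.png"):
        print(path, round(rms_contrast(path), 3))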
I wonder about the 120 images they didn’t supply. Did they have even worse contrast issues, or some bias not visible in the pictures supplied? Were all the guns black? Were any of them chrome colored? Could an association between dark skin color and the presence of a gun have been created?
I’m skeptical this study tested what the researchers claimed to be testing. I think there is a good chance they demonstrated their own bias, and/or a bias in the test procedure, rather than a “clear indication of racism” toward dark-skinned robots.