Twitter Investigates Racial Bias Seen In Photo Previews

JAKARTA - The social network Twitter is testing photo-cropping and face-detection features powered by a neural network. Unfortunately, some users say the feature crops photos in a racially biased way.

As reported by Gizmodo on Monday, September 21, a number of users found that the feature consistently favors white faces over black faces in photo previews. In addition, Twitter's auto-crop feature tends to position black users' faces slightly lower than where they actually appear in the image.

Other users reported similar problems, tweeting examples of Zoom's detection algorithm failing as well: Zoom's virtual background filter did not recognize the faces of black users.

Regarding this issue, Twitter design chief Dantley Davis tweeted that the company is investigating the algorithmic error in its neural network system, which is still being trialed in several areas, such as facial recognition.

Meanwhile, Twitter's head of technology, Parag Agrawal, also said in a tweet that the feature still needs improvement, as the algorithm is still learning from the behavior of Twitter users.

"Our team tested for bias before submitting the model and found no evidence of racial or gender bias in our tests. But it is clear from these examples that we have more analysis to do," Twitter spokeswoman Liz Kelley told Gizmodo.

Twitter has promised to investigate the algorithm problems its users find. The company also stressed that it does not intend the new feature to be racist or biased.

Back in 2018, Twitter published a blog post explaining how it uses a neural network to decide which part of a photo to show in previews. One factor that leads the system to select a region of an image is a higher level of contrast.

This could explain why the Twitter algorithm appears to prefer white faces. The decision to use contrast as a determining factor may not have been deliberately racist, but a system that shows white faces more often than black faces is still producing a biased result.
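To make the contrast idea concrete, here is a minimal sketch of contrast-driven cropping. This is an illustration only, not Twitter's actual model: it slides a fixed-height crop window down a grayscale image and keeps the window with the highest pixel standard deviation, a simple proxy for local contrast. The function name and the toy image are invented for this example.

```python
import numpy as np

def pick_preview_row(image, crop_height):
    """Illustrative sketch (not Twitter's real algorithm): slide a
    fixed-height crop window down a grayscale image and score each
    position by local contrast (pixel standard deviation), returning
    the top row of the highest-scoring window."""
    h = image.shape[0]
    best_top, best_score = 0, -1.0
    for top in range(h - crop_height + 1):
        window = image[top:top + crop_height, :]
        score = float(window.std())  # higher contrast -> higher score
        if score > best_score:
            best_top, best_score = top, score
    return best_top

# A 10x4 grayscale "image": flat gray except a high-contrast band at rows 7-8.
img = np.full((10, 4), 128.0)
img[7] = [0.0, 255.0, 0.0, 255.0]
img[8] = [255.0, 0.0, 255.0, 0.0]
print(pick_preview_row(img, 2))  # → 7: the crop snaps to the contrast band
```

A scorer like this will gravitate toward whatever region contrasts most with its surroundings, which is how a contrast heuristic can systematically favor some faces over others even with no explicit notion of race in the model.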