Twitter's photo-cropping algorithm had a bias towards young, light-skinned, feminine faces, study shows

The microblogging site had, in March, disabled the system

Twitter's now-disabled automatic photo-cropping system discriminated on the basis of colour, gender, weight and age, according to the results of an open competition to find algorithmic bias, media reports say.

The microblogging site disabled the system in March after users noted that it favoured white and female faces when auto-cropping images. It then launched an algorithmic bug bounty competition, offering prizes of up to $3,500 to analyse how the technology mishandles photos, The Verge reported.

The top entry, contributed by Bogdan Kulynych, a graduate student in computer security at EPFL in Switzerland, showed that Twitter's cropping algorithm favours faces that are "slim, young, of light or warm skin colour and smooth skin texture, and with stereotypically feminine facial traits".

These algorithmic biases amplify biases in society, literally cropping out "those who do not meet the algorithm's preferences of body weight, age, skin colour", Kulynych noted in his summary.

The second- and third-placed entries showed that the system was biased against people with white or grey hair, suggesting age discrimination, and that it favoured English over Arabic script in images, the report said.

"When we think about biases in our models, it's not just about the academic or the experimental ... but how that also works with the way we think in society," Rumman Chowdhury, director of Twitter's META team (which studies Machine learning Ethics, Transparency, and Accountability) was quoted as saying at the DEF CON 29 conference, while presenting the results.

"I use the phrase 'life imitating art imitating life'. We create these filters because we think that's what beautiful is, and that ends up training our models and driving these unrealistic notions of what it means to be attractive," he added.

Twitter's open approach contrasts with the responses of other tech companies when confronted with similar problems.

"The ability of folks entering a competition like this to deep dive into a particular type of harm or bias is something that teams in corporations don't have the luxury to do," Chowdhury said.

According to Patrick Hall, a judge in Twitter's competition and an AI researcher who works on algorithmic discrimination, biases exist in all AI systems, and companies must work proactively to find them.
 

*Edited from an IANS report 
