
SERIOUS: Facebook discriminates against Uruguayans by race, sex and religion and does not show their posts


The U.S. Department of Housing and Urban Development sued the social network after it emerged that its advertising algorithm offers unequal access to housing and job opportunities. Secretarial job ads, for example, are shown mainly to women.

All algorithms are biased, and Facebook's are no exception.

That tendency led the U.S. Department of Housing and Urban Development to sue the tech giant for allowing advertisers to target their ads by race, gender and religion, precisely the kind of targeting the law prohibits. The company then announced that it would end the practice.

However, there is new evidence that Facebook's algorithm, which automatically decides whom to show an ad, also discriminates in the ads it delivers to its more than 2 billion users based on their demographic information.

A team from Northeastern University (USA), led by researchers Muhammad Ali and Piotr Sapiezynski, published a series of otherwise identical advertisements with minor variations in budget, headline, text or image. They found that these subtle adjustments had a big impact on the audience each ad reached, especially in the case of job or real estate ads.

For example, ads for early childhood education and secretarial vacancies were shown primarily to women, while ads for cleaning and taxi jobs went more often to minorities. Home-sale ads were also more likely to appear to white users, while rental ads were shown more often to minority groups.

In response to the findings, a company spokesperson said in a statement: "We have made significant changes to our ad targeting tools and we know this is only a first step. We are reviewing our ad delivery system, we are engaging industry leaders, academics and civil rights experts on this issue, and we will explore further changes."

In a way, the news should come as no surprise: bias in algorithmic recommendation has been a known problem for many years. In 2013, for example, Harvard University (USA) professor of government and technology Latanya Sweeney published a paper showing the racial discrimination implicit in Google's ad-serving algorithm. The problem goes back to how these algorithms work. All of them are based on machine learning, a technique that finds patterns in massive amounts of data and applies them to make decisions. There are many ways for bias to creep into this process, but the two most obvious in Facebook's case relate to problem definition and data collection.

Bias enters at the problem-definition stage when the goal of a machine learning model is not aligned with the need to avoid discrimination. Facebook's advertising tool lets advertisers choose from three optimization goals: the number of ad views, the number of clicks and the engagement generated, and the number of sales made. But these business objectives have nothing to do with, say, maintaining equal access to housing. As a result, if the algorithm finds that it can generate more interactions by showing homes for sale to more white users, it ends up discriminating against Black users.
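To make the misalignment concrete, here is a minimal sketch in Python. It is not Facebook's actual system: the group names, the click-through rates and the greedy policy are all invented for illustration. The point is that a policy whose only objective is clicks will starve the lower-engagement group of impressions entirely.

```python
# Hypothetical toy model, NOT Facebook's system: an ad-serving policy
# whose only objective is maximizing clicks. Group names and
# click-through rates (CTRs) are invented for illustration.

# Assumed historical CTRs for a home-sale ad.
ctr = {"white_users": 0.09, "black_users": 0.06}

def serve(impressions: int) -> dict:
    """Greedy policy: every impression goes to whichever group clicks more."""
    shown = {group: 0 for group in ctr}
    for _ in range(impressions):
        best = max(ctr, key=ctr.get)  # maximizes clicks; parity never enters
        shown[best] += 1
    return shown

print(serve(1000))  # {'white_users': 1000, 'black_users': 0}
```

Nothing in the objective even represents fairness, so the skew is not a bug in the optimization; it is the optimization working exactly as specified.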

Bias in data collection occurs when the training data replicates existing prejudices. Facebook's advertising tool bases its optimization decisions on people's historical preferences. If a greater proportion of people from minority groups showed more interest in rental housing in the past, the machine learning model will identify that pattern and keep applying it indefinitely. Once again, the system will blindly follow the path of discrimination in employment and housing without anyone explicitly instructing it to.
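The same effect can be reproduced with a toy "model" that does nothing more than count past clicks. The training log below is invented; the point is that a system fitted to skewed history replays that skew in every future decision.

```python
# Hypothetical sketch of bias in data collection: a "model" fitted on a
# skewed historical click log routes each ad type to whichever group
# clicked it more in the past. All data here is invented.
from collections import Counter

# Assumed training log of past clicks: (group, ad_type) pairs.
history = ([("minority", "rental")] * 70 + [("white", "rental")] * 30
           + [("white", "sale")] * 80 + [("minority", "sale")] * 20)

counts = Counter(history)

def target_group(ad_type: str) -> str:
    """Pick the group with the higher historical click count for this ad type."""
    return max(("minority", "white"), key=lambda g: counts[(g, ad_type)])

print(target_group("rental"))  # 'minority' -- yesterday's skew, replayed
print(target_group("sale"))    # 'white'
```

No one told the model to discriminate; the historical data did.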

Although these machine learning behaviors have been studied for some time, the new study offers a more direct look at the scale of their impact on access to housing and employment opportunities. 'These results are highly controversial!' warned Christian Sandvig, director of the Center for Ethics, Society, and Computing at the University of Michigan (USA), in statements to The Economist. The expert added: 'The paper tells us that if (...) big data is used in this way, it will never give us a better world. In fact, it is likely to make things worse by accelerating the problems in the world that make things unfair.'

The good news is that there are ways to address the problem, but it won't be easy. Many AI researchers are already pursuing technical fixes for machine learning bias that could yield fairer models of online advertising. A recent paper from Yale University (USA) and the Indian Institute of Technology, for example, suggests that it may be possible to constrain algorithms to minimize discriminatory behavior, at only a small cost in advertising revenue. Policymakers may have to wade in, however, if platforms are to invest in such solutions, especially when those solutions affect their bottom line.
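As a rough illustration of what such a constraint might look like, the sketch below caps each group's share of impressions. This is a generic parity cap, not the method of the paper cited above, and it reuses the invented click-through rates from the earlier sketch; the trade-off is a small drop in expected clicks in exchange for near-equal delivery.

```python
# Generic parity-cap sketch, NOT the cited paper's method. Reuses the
# invented CTRs from the earlier toy example.
ctr = {"white_users": 0.09, "black_users": 0.06}

def serve_with_cap(impressions: int, max_share: float = 0.55) -> dict:
    """Greedy on clicks, but no group may exceed max_share of delivery."""
    shown = {group: 0 for group in ctr}
    for i in range(1, impressions + 1):
        # Only groups still under the share cap are eligible this round.
        eligible = [g for g in ctr if shown[g] < max_share * i]
        best = max(eligible, key=ctr.get)
        shown[best] += 1
    return shown

print(serve_with_cap(1000))  # roughly {'white_users': 550, 'black_users': 450}
```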

Facebook discriminates against Uruguayans

Meanwhile, in recent weeks it has emerged that Facebook discriminates not only in advertising but also in the posts its users make. In Uruguay, the social network reportedly hides posts from users on the basis of their color, sexual orientation or religion.

According to accounts given to this portal, the company not only hides these posts but also blocks the affected accounts without any notice.

Facebook reportedly wants its users to pay, at the very least, for their wall posts to be seen. This not only strikes us as outrageous; the company's arbitrariness also harms the work of the millions of people who use the platform for exactly that purpose.
