Additional research shared exclusively with CNN suggests that this algorithmic bias is a global issue, the human rights group Global Witness says.
“Our concern is that Facebook is exacerbating the biases that we live with in society and actually marring opportunities for progress and equity in the workplace,” Naomi Hirst, who leads Global Witness’ campaign strategy on digital threats to democracy, told CNN.
Meta spokesperson Ashley Settle said in a statement that Meta applies “targeting restrictions to advertisers when setting up campaigns for employment, as well as housing and credit ads, and we offer transparency about these ads in our Ad Library.”
“We do not allow advertisers to target these ads based on gender,” Settle said in the statement. “We continue to work with stakeholders and experts across academia, human rights groups and other disciplines on the best ways to study and address algorithmic fairness.”
Meta did not comment specifically about the new complaints filed in Europe. The company also did not respond to a question asking in which countries it now limits targeting options for employment, housing and credit ads.
Missing out on jobs because of your gender
Efforts to address those disparities included removing the option for advertisers to target employment ads based on gender, but this latest research suggests that change is being undermined by Facebook’s own algorithm, according to the human rights groups.
As a result, the groups say, countless users may be missing out on the opportunity to see open jobs they could be qualified for, simply because of their gender. They worry this could exacerbate historic workplace inequities and pay disparities.
“You cannot escape big tech anymore, it’s here to stay and we have to see how it impacts women’s rights and the rights of minority groups,” said Linde Bryk, head of strategic litigation at Bureau Clara Wichmann. “It’s too easy, as a corporation, to just hide behind the algorithm, but if you put something on the market … you should also be able to control it.”
Global Witness conducted additional experiments in four other countries — including India, South Africa and Ireland — and says the research shows that the algorithm perpetuated similar biases around the world.
In France, for example, Facebook is often used for job searches by people of lower income levels, meaning the people most affected by its alleged algorithmic biases may be those already in marginalized positions, said Caroline Leroy-Blanvillain, a lawyer and member of the legal force steering committee at Fondation des Femmes.
Pat de Brún, head of Amnesty International’s big tech accountability team, said he was not necessarily surprised by the findings of Global Witness’ research. “Research consistently shows how Facebook’s algorithms deliver deeply unequal outcomes and often reinforce marginalization and discrimination,” de Brún told CNN. “And what we see is the reproduction and amplification of some of the worst aspects of society.”
“We have this illusion of neutrality that the algorithms can provide, but actually they’re very often reproducing those biases and often obscuring the biases and making them more difficult to challenge,” he said.
Gendered targeting
To conduct the experiments cited in the complaints, Global Witness ran a series of job ads in France and the Netherlands over two-day periods between February and April. The advertisements linked to real job postings found on employment websites, and researchers selected positions — including preschool teacher, psychologist, pilot and mechanic — traditionally associated with gender stereotypes.
Global Witness targeted the ads to adult Facebook users of any gender who resided in, or had recently visited, the chosen countries. The researchers requested that the ads “maximize the number of link clicks,” but otherwise left it up to Facebook’s algorithm to determine who ultimately saw the advertisements.
The ads were often shown to users along heavily gendered lines, according to an analysis of the data provided by Facebook’s ad manager platform.
“Just because advertisers can’t select it, doesn’t mean that the ‘gender’ [category] doesn’t weigh in the process of showing ads at all,” one of the Netherlands complaints states.
In France, for example, 93% of the users shown a preschool teacher job ad and 86% of those shown a psychologist job ad were women, while women comprised just 25% of users shown a pilot job ad and 6% of those shown a mechanic job ad, according to Facebook’s ad manager platform.
Similarly, in the Netherlands, 85% of the users shown a teacher job ad and 96% of those shown a receptionist job ad were women, while just 4% of those shown a job ad for a mechanic were women, according to Facebook’s data. Certain roles were less strongly skewed — a package delivery job ad, for example, was shown to 38% women users in the Netherlands.
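Percentages like these can be derived from the raw delivery counts that Facebook's ad manager reports for each ad. A minimal sketch of that calculation, using made-up counts since the study's underlying figures are not public:

```python
# Hypothetical sketch: computing the gender share of an ad's audience
# from delivery counts of the kind Facebook's ad manager reports.
# The counts below are illustrative, not the study's actual data.

def percent_women(shown_to_women: int, shown_to_men: int) -> int:
    """Share of users shown an ad who were women, as a whole percentage."""
    total = shown_to_women + shown_to_men
    return round(100 * shown_to_women / total)

# Two illustrative ads with strongly skewed delivery
print(percent_women(850, 150))  # skews heavily toward women
print(percent_women(40, 960))   # skews heavily toward men
```

The point of the metric is that the advertiser targeted all genders equally; any skew in these shares reflects the platform's delivery algorithm, not the targeting settings.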
The results mirrored those Global Witness had found in the United Kingdom, where women were more often shown ads for nursery teacher and psychologist jobs, and men were overwhelmingly shown ads for pilot and mechanic positions.
In some cases, the degree of gender imbalance in how users were targeted for certain jobs varied by country — in India, just 39% of the users shown a psychologist job ad were women, while in Europe and South Africa, women were more likely than men to be shown psychologist job ads. A further exception was pilot ads shown in South Africa, which were more balanced, with 45% of users shown a pilot ad being women.
Global Witness also ran tests in Indonesia, but Facebook’s ad manager was unable to identify the genders of many of the users who were shown the advertisements, making it difficult to conduct a robust analysis of the results there.
“Even though Facebook may have become less fashionable in certain countries, it remains the key communications platform for much of the world … as the public square where public discourse happens,” Amnesty International’s de Brún said. “They should be ensuring these discriminatory outcomes do not happen, intentionally or not.”
Because little information is publicly available about how Facebook's algorithm works, the complaints acknowledge that the cause of the gender skew is not entirely clear. One of the Netherlands complaints speculates that the algorithm may have been trained on "contaminated" data, such as outdated information about which genders typically hold which roles.
Seeking algorithmic transparency
“People don’t look for jobs or housing in newspapers, or even the radio, anymore, they go online, that’s where all information flows for economic opportunities,” said Peter Romer-Friedman, one of the attorneys representing Real Women in Trucking. “If you’re not part of the group that’s receiving the information, you lose out on the opportunity to hear about and pursue that job.”
Romer-Friedman was also on the negotiating team that worked on the 2019 settlement agreement with Facebook. At the time, he said, he and others raised concerns that while Facebook’s promised changes were a step in the right direction, the same bias issues could be replicated by the platform’s algorithm.
Meta declined to comment on the EEOC complaint from Real Women in Trucking; filings in cases with the agency are not publicly available.
“What we’re hoping with these complaints is that it forces [Facebook] to the table to crack open the black box of their algorithm, to explain how they can correct what appears to be … discrimination by their algorithm,” Global Witness’ Hirst said. “I think we know enough about gendered workforces and gendered jobs to say that Facebook is adding to the problem.”
---
Credits
Commissioning Editor: Meera Senthilingam
Editor: Seth Fiegerman
Data and Graphics Editor: Carlotta Dotto
Illustrations: Carolina Moscoso for CNN
Visual Editors: Tal Yellin, Damian Prado, David Blood and Gabrielle Smith