The US government and Meta have agreed to settle a lawsuit that accused the company of facilitating housing discrimination by letting advertisers specify that ads not be shown to people belonging to specific protected groups.
According to the US Department of Justice, advertisers could use a Facebook tool to target people who were similar to a pre-selected source audience. In deciding who fit that profile, the tool could take things like a user’s estimated race, national origin, and sex into account, meaning it could end up cherry-picking who saw housing ads.
The government first brought a case against Meta over algorithmic housing discrimination in 2019. The company took some steps to address the issue, but clearly they weren’t enough for the feds.
According to the settlement, Meta will have to stop using a discriminatory algorithm for housing ads and instead develop a system that will “address racial and other disparities caused by its use of personalization algorithms in its ad delivery system.”
In response, Meta announced that it plans to tackle the issue using machine learning, so that the new system will “ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.”
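Meta hasn’t published the internals of that system, but the underlying idea can be sketched: compare the demographic mix of the people an ad was actually delivered to against the mix of everyone eligible to see it, and correct delivery when the two drift apart. The minimal sketch below illustrates that comparison only; the group labels and function names are hypothetical, not Meta’s.

```python
from collections import Counter

def demographic_mix(audience):
    """Return the share of each demographic group in an audience.

    `audience` is a list of group labels (illustrative buckets for
    age/gender/estimated ethnicity, not Meta's actual categories).
    """
    counts = Counter(audience)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def parity_gap(delivered, eligible):
    """Total variation distance between the delivered audience's mix
    and the eligible population's mix (0 means a perfect match)."""
    d, e = demographic_mix(delivered), demographic_mix(eligible)
    groups = set(d) | set(e)
    return 0.5 * sum(abs(d.get(g, 0) - e.get(g, 0)) for g in groups)

# Hypothetical example: the ad was delivered disproportionately to group "A".
eligible  = ["A"] * 50 + ["B"] * 30 + ["C"] * 20
delivered = ["A"] * 80 + ["B"] * 15 + ["C"] * 5
print(f"parity gap: {parity_gap(delivered, eligible):.2f}")  # 0.30 here
```

A system along the lines Meta describes would presumably keep nudging ad delivery until a gap like this shrinks toward zero, so the audience that actually sees a housing ad mirrors the audience that was eligible to see it.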