
Exposing the Social Media Truth

Even though human beings are, by far, the smartest species our world has ever seen, they have repeatedly shown a knack for making mistakes. Now, you can argue that these mistakes help us become better over time, but if history has taught us anything, it's that some of them can also go on to cause irreparable damage along the way. Such a volatile dynamic understandably calls for a defensive cover, and we found that cover once we brought dedicated regulatory bodies into the fold. Having these well-defined authorities in each and every area was a game-changer, as it instantly gave us a safety net we never had before. However, this whole setup took a substantial hit with the arrival of technology. You see, with technology and its layered nature taking over the landscape, certain people got an unprecedented chance to break the rules and face no consequences for doing so. That has caused a lot of disruption, but fortunately enough, we are now witnessing yet another power shift. In fact, one recent settlement involving Meta does a lot to prove the same.

Meta has officially reached a settlement with US authorities over a lawsuit accusing the company of facilitating housing discrimination by letting advertisers specify that ads not be shown to people belonging to certain protected groups. As part of the settlement, Meta will have to stop using the discriminatory algorithm for housing ads. In its place, the company has been asked to put together a system that will “address racial and other disparities caused by its use of personalization algorithms in its ad delivery system.” The overhaul will also spell the end for Meta’s Special Ad Audiences tool, which allegedly selected audiences on the basis of race, national origin, sex, and other protected characteristics, an approach the DOJ deemed a clear violation of the US Fair Housing Act. However, as the case concerns only that particular law, the court’s prohibition applies just to housing ads.

When asked how it will achieve compliance on this front, Meta outlined a plan to use machine learning to build a system that will “ensure the age, gender and estimated race or ethnicity of a housing ad’s overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad.” Notably, the company has until December 2022 to prove the efficacy of this system. Even if Meta earns that approval, a third party will “investigate and verify on an ongoing basis.” Apart from these measures, the company will also need to cough up a penalty of $115,054.
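Meta has not disclosed how the new system will work under the hood, but the requirement it describes boils down to a demographic-parity check: compare the mix of people who actually see a housing ad against the mix of people who were eligible to see it, and rebalance delivery whenever the two drift apart. The Python sketch below is purely illustrative and is not Meta's implementation; the group labels, sample data, and 0.05 tolerance are assumptions made for the example.

```python
from collections import Counter
from typing import Iterable


def distribution(groups: Iterable[str]) -> dict[str, float]:
    """Share of each demographic group among a list of users."""
    counts = Counter(groups)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}


def max_disparity(eligible: Iterable[str], delivered: Iterable[str]) -> float:
    """Largest absolute gap between a group's share of the eligible
    audience and its share of the audience the ad was delivered to."""
    elig = distribution(eligible)
    deliv = distribution(delivered)
    return max(abs(elig.get(g, 0.0) - deliv.get(g, 0.0))
               for g in set(elig) | set(deliv))


# Hypothetical audiences: group names and counts are made up for illustration.
eligible_audience = ["group_a"] * 600 + ["group_b"] * 400
delivered_audience = ["group_a"] * 750 + ["group_b"] * 250

gap = max_disparity(eligible_audience, delivered_audience)
print(f"Largest demographic gap: {gap:.2f}")  # 0.15 in this example
if gap > 0.05:  # tolerance is an arbitrary assumption, not a settlement figure
    print("Delivery skew exceeds tolerance; ad delivery would need rebalancing.")
```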
