We live in a hugely interconnected world. As a result, the way we go about things in one sphere can carry implications felt in an entirely different one. To keep such effects in check, we have established dedicated governing bodies tasked with supervising operations of every kind. While the core purpose of these bodies has remained largely the same over the years, their role in our lives has been reinvented time and again, reshaped by our evolving objectives. In an age where the digital realm reigns supreme, privacy and social safety have become agendas of utmost importance. As efforts to protect the modern generation from the detrimental effects of a tech-centric world finally ramp up, the latest development has put Facebook under buckling pressure.
Things have gone from bad to worse for Facebook after a former employee took on the role of whistleblower and exposed some of the company's most closely guarded secrets. Frances Haugen, the whistleblower, appeared on 60 Minutes and accused Facebook of using algorithms that encouraged hate speech across its platform. Before leaving her job at the company earlier this year, Haugen had leaked a string of internal documents to the Securities and Exchange Commission, a move she justified as an attempt to bring more regulation to organizations like Facebook.
“There was conflict… between what was good for the public and what was good for Facebook,” Haugen told the host, Scott Pelley, “and Facebook chose over and over again to optimize for its own interests — like making more money.”
One of the more intriguing claims Haugen made during the interview was that a large share of these problematic algorithms was put in place in 2018. Although Mark Zuckerberg advertised the changes at the time as a sign of the company's growing focus on social well-being rather than on being merely a product for fun, the documents Haugen shared tell an utterly different story.
“Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” the internal documents revealed.
Facebook, in response, has so far deflected the allegations. The company's Vice President of Global Affairs went as far as to claim that it all stemmed from people's willingness to take false comfort in a “technological, or technical, explanation for the issues of political polarization in the United States.”