Working Our Way Towards a Healthier and More Secure Social Media Space

Despite all the intelligence at their disposal, human beings have failed rather sensationally at avoiding mistakes. History has reinforced this dynamic time and again, and each reminder has practically forced us to seek some form of defensive cover. We found that cover by bringing dedicated regulatory bodies into the fold. Having a well-defined authority in every area was a game-changer, as it gave us a safety cushion against our many shortcomings. The kind of utopia you would expect from such a development did arrive, but it couldn't stick around for long. Any discussion of the factors that limited it has to begin with technology. The moment technology's layered nature took over the scene, it gave individuals an unprecedented chance to exploit others for their own benefit. Worse still, this exploitation soon materialized on such a massive scale that it overwhelmed our governing forces and sent them back to square one. After a lengthy spell in the wilderness, though, the regulatory contingent finally seems to be preparing for a major comeback. That trend has grown more and more apparent in recent times, and truth be told, Meta's latest decision might just make it bigger and better moving forward.

Meta has officially rolled out new privacy updates for teens on Instagram and Facebook. According to reports, the updates look to bolster user privacy by giving teens more control over who can see their friends lists, the posts they can be tagged in, the people and pages they follow, and who is allowed to comment on their public posts. These settings will be applied automatically to anyone under the age of 16 (or, in some countries, under 18) who joins either of the two social media platforms, while teens who are already onboard will be encouraged to opt in to the newly introduced options. Beyond that, Meta is testing ways to protect teens from suspicious adults who might try to message them. For instance, if a young user blocks or reports an adult, that adult's account will no longer show up in suggestions for other teen users. The social media conglomerate is also mulling the removal of the message button on teens' Instagram accounts when those accounts are viewed by adults with a suspicious track record on the platform. And there is more: Meta is even working alongside the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried their intimate images might be shared online without their consent.

“We’ve been working closely with NCMEC, experts, academics, parents and victim advocates globally to help develop the platform and ensure it responds to the needs of teens so they can regain control of their content in these horrific situations,” Meta said in a blog post. “We’ll have more to share on this new resource in the coming weeks.”

Once the platform is built, Meta plans to make it available to companies across the tech industry.

Meta is also partnering with Thorn and the latter's NoFiltr brand. Here, the two organizations are working to create educational materials that reduce the shame and stigma surrounding intimate images, making for a safer experience for users of all ages, and teen users most of all.
