
An Alarming Revelation

Our world is always dealing with a number of forces at any given time. If the sheer volume weren’t enough, these forces also tend to be quite different in nature. Now, while interacting with such forces is part and parcel of our daily life, it’s important to realize that not every single one of them leaves a positive impact on us. What’s even worse, though, is that we ourselves are responsible for the bigger chunk of the detrimental effects in play here. You see, human beings are well-known to be a little self-centered at times, and in a bid to place our personal interests above everything else, we sometimes end up presenting others with unwarranted difficulties. Hence, as a way of better protecting the larger interest, we brought dedicated regulatory bodies into the fold. The idea would go on to become a big success, but the dynamics would change drastically upon technology’s arrival. All of a sudden, there were whole new definitions for what is right and what is wrong. Many tech platforms were able to use this confusion to their advantage, as they fell off the ethical wheel and started exploiting their users in various hidden ways. Nevertheless, it seems like the regulators are finally catching up. Over the recent past, we have witnessed a flurry of social media companies coming under extreme scrutiny, and to avoid similar treatment, TikTok announced one major policy change a few days ago. A report from the Wall Street Journal now reveals how that change came into existence.

According to the WSJ report, TikTok has, after Instagram, turned into a new source of harmful content for young people. The report talks at length about how extreme weight loss challenges, purging techniques, and deadly diets promoted on TikTok are increasingly causing severe eating disorders, particularly in young girls. To back this up with concrete evidence, WSJ performed a full-fledged experiment in which it created over 100 accounts “that browsed the app with little human intervention.” Twelve of these were bots registered as 13-year-olds that streamed content about weight loss, alcoholism, and gambling. The early observations revealed a lot about how the platform actually functions. Basically, as soon as a bot stopped watching, let’s say, gambling-related videos and turned to alcohol-related content, TikTok quickly responded by flooding the bot’s feed with alcohol-related videos. The sheer lack of thought given to structuring the feed for a particular age group was alarming. By the end of the experiment, WSJ found that of the 255,000 videos watched, nearly 32,700 contained a description or metadata matching keywords related to weight loss, while 11,615 videos were centered on eating disorders. Interestingly, the report also noted that these videos deliberately misspelled eating disorder-related keywords so as not to get flagged by TikTok.

TikTok’s announcement of a renewed approach to its content recommendation system came just a day before the WSJ report was released. We don’t know whether it was a mere coincidence, but WSJ did mention contacting TikTok while the report was being prepared.

“While this experiment does not reflect the experience most people have on TikTok, even one person having that experience is one too many,” said TikTok spokesperson Jamie Favazza in response to the report.
