Introducing a New Age of Moderation

As intelligent as human beings have proven to be, that intelligence has never saved us from our own flaws. These flaws have resurfaced throughout history, each time causing a distinct kind of harm and forcing us to seek well-defined safeguards. To the world's credit, we found those safeguards once dedicated regulatory bodies entered the fold, a genuine game-changer that made us feel better protected than ever before. That sense of security proved short-lived, however, and technology was largely to blame. By giving people an easy way to exploit others, technology deflated the momentum we had built under regulatory stewardship, ushering in a phase where rule breakers could get their way without facing any consequences. Fortunately, we may finally be getting over that hump: the shift is visible across several recent developments, and Discord's latest move only reinforces it.

After running a beta program, Discord is now rolling out AutoMod, an automated moderation tool built directly into the platform to take the strain off human moderators, to the wider public. Alongside it, the platform will let more servers offer premium membership packages to their users, and will give server owners new features to manage those memberships, including an analytics dashboard, free trials, and custom server emoji.

As for AutoMod itself, the tool is designed to scan servers for a selected set of words and phrases. If it finds a match against its list of prohibited terms, it can automatically respond by blocking the message, alerting admins, or timing out the offender. The tool ships with three pre-built lists of such content, and moderators have the option to add up to three custom lists of their own. According to some reports, future versions of AutoMod are also likely to detect and block harmful links.
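To make the mechanism concrete, here is a minimal conceptual sketch of keyword-list moderation in the spirit described above: rules pair a list of prohibited terms with a set of responses, and a matching message triggers those responses. All names here (`KeywordRule`, `Action`, `moderate`) are hypothetical illustrations, not Discord's actual API.

```python
# Hypothetical sketch of keyword-list moderation; illustrative only,
# not Discord's real implementation or API.
from dataclasses import dataclass
from enum import Enum


class Action(Enum):
    BLOCK_MESSAGE = "block_message"   # stop the message from posting
    ALERT_ADMINS = "alert_admins"     # flag the message for moderators
    TIMEOUT_USER = "timeout_user"     # temporarily mute the offender


@dataclass
class KeywordRule:
    name: str
    keywords: set[str]        # prohibited terms for this rule
    actions: list[Action]     # responses to apply on a match


def moderate(message: str, rules: list[KeywordRule]) -> list[Action]:
    """Return every action triggered by rules whose keywords appear in the message."""
    words = set(message.lower().split())
    triggered: list[Action] = []
    for rule in rules:
        if words & rule.keywords:  # any overlap counts as a match
            triggered.extend(rule.actions)
    return triggered


# A single custom rule, standing in for one of the moderator-defined lists.
custom = KeywordRule(
    name="custom-1",
    keywords={"spamword"},
    actions=[Action.BLOCK_MESSAGE, Action.ALERT_ADMINS],
)
print(moderate("buy spamword now", [custom]))
```

A production system would of course need phrase matching, normalization against obfuscated spellings, and allow-list exceptions; the sketch only shows the rule-to-response shape of the feature.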

The moderation tool is available to moderators who have the “manage server” permission, and, for the moment, only on Community servers.

While explaining the intention that went into creating AutoMod, Discord said:

“Community moderation should be a hobby you celebrate, and stepping away to make time for yourself shouldn’t rack your mind with guilt of what could be happening when you’re gone. Moderating your growing community should feel rewarding and fulfilling, not add constant stress from dealing with bad actors or unruly members.”
