Discord is giving admins and moderators a new tool to keep their servers safe. On Thursday, the company introduced AutoMod, a feature that can automatically detect and block harmful messages before they’re posted. Accessible through Discord’s “Server Settings” menu, the tool allows admins and moderators to create a list of words and phrases they want Discord to look for, along with a set of repercussions for those who use them.
For instance, you can configure AutoMod to prevent a user from sending messages or joining voice channels after triggering the feature’s safeguards. It’s also possible to set up the tool to automatically notify you when someone writes something offensive. Discord has put together three starting lists that cover “certain categories of not-nice words or phrases.” Moderators can add up to three additional custom filter lists to suit the needs of their communities. At launch, AutoMod is only available to Community servers.
Alongside AutoMod, Discord is introducing two new resources to help admins. The first is a dedicated hub with articles penned by experienced community builders. “Whether you’re just getting started or need help onboarding the newest round of moderators to the team, the Community Resources page is here to help your team thrive at any stage of your journey,” Discord says of the hub.
The other new resource is a dedicated admin community server run by Discord staff. Here, the company says moderators can gather to chat and learn from one another. Discord also plans to run educational events and share news through the space.
Discord also announced today that it’s expanding the availability of Premium Memberships, a monetization tool the company debuted at the end of 2021, to more US servers this summer. With the expansion, the company is also adding new features to the suite, including an updated analytics dashboard and the option to offer free trials.