AutoMod could help human moderators clean up sites like Reddit
It's not perfect, but it's a step in the right direction.
AutoMod is exactly what it sounds like: an automated moderator. The software applies configurable rules to flag words and phrases that violate an app's or website's posting policies, saving human moderators time and mental energy. It's a powerful tool used across sites like Reddit, Twitch, and Discord, but it's far from perfect, so humans still need to moderate alongside it.
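To make the idea concrete, here is a minimal sketch of how a rule-based moderator might work, assuming nothing more than a hand-maintained banned-word list. The terms, rules, and function names below are invented for illustration and don't reflect any site's actual configuration.

```python
import re

# A minimal sketch of rule-based moderation. BANNED_WORDS is a hypothetical
# placeholder list; real tools support far richer rule configurations.
BANNED_WORDS = {"spamword", "scamlink"}

def violates_policy(post_text: str) -> bool:
    """Return True if the post contains any banned term as a whole word."""
    words = re.findall(r"[a-z']+", post_text.lower())
    return any(word in BANNED_WORDS for word in words)

posts = [
    "Totally normal discussion about cats",
    "Buy now!! spamword spamword spamword",
]
for post in posts:
    action = "remove" if violates_policy(post) else "approve"
    print(f"{action}: {post!r}")
```

Even this toy version shows the appeal: a few lines of configuration can screen thousands of posts without a human having to read each one.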
AutoMod's shortcomings are well known at this point. It can't understand context, so it often deletes posts that shouldn't have been removed, forcing users to appeal to a human moderator to get them restored. It also can't understand images, so it has to rely on user-added text descriptions to filter pictures.
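The context problem is easy to reproduce. The hypothetical sketch below uses crude substring matching, a classic source of false positives sometimes called the "Scunthorpe problem": innocent words that happen to contain a banned string get removed, and a human has to restore them on appeal.

```python
# A hedged illustration of the context problem: naive substring matching
# flags innocent text. These rules and posts are invented for demonstration.
BANNED_SUBSTRINGS = ["ass"]  # a crude rule a careless config might contain

posts = [
    "Check out my classic assassin movie review",  # innocent, but matches
    "Passing my class this semester!",             # innocent, but matches
]
for post in posts:
    flagged = any(term in post.lower() for term in BANNED_SUBSTRINGS)
    print(f"{'REMOVED' if flagged else 'kept'}: {post}")
# Both innocent posts get removed; a human has to reinstate them on appeal.
```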
Mental health savior — Moderation is a notoriously draining job, especially on hugely popular sites like Reddit, where about 17.4 million posts were removed between March and October of 2018 alone. Because human moderators are tasked with removing offensive or otherwise harmful content, they end up watching videos of murders and reading racist hate speech on a daily basis. This is where AutoMod does its best work: cutting down the mental health burden on moderators.
Toward a clean, automated future — Though AutoMod can only detect and delete certain kinds of rule-breaking posts, it's still beloved by human moderators. Developers keep finding innovative ways to program it, such as teaching it to recognize spam patterns in image posts, and from here it's only set to grow smarter. Perhaps someday soon we'll figure out how to teach computers the intricacies of human language, and then AutoMod will be unstoppable. In the meantime, anything that cuts down on the rate at which moderators develop post-traumatic stress disorder is worth using.