Automated Content Moderation Increases Adherence to Community Guidelines
Online social media platforms use automated moderation systems to remove or reduce the visibility of rule-breaking content. While previous work has documented the importance of manual content moderation, the effects of automated content moderation remain largely unknown, in part due to the technical and ethical challenges of assessing its impact using randomized experiments. Here, in a large study of Facebook comments (n=412M), we used a fuzzy regression discontinuity design to measure the impact of automated content moderation on subsequent rule-breaking behavior (number of comments hidden or deleted) and engagement (number of additional comments posted). We found that comment deletion decreased subsequent rule-breaking behavior in shorter threads (20 or fewer comments), even among other participants, suggesting that the intervention prevented conversations from derailing. Further, the effect of deletion on the affected user's subsequent rule-breaking behavior was longer-lived than its effect on reducing commenting in general, suggesting that users were deterred from rule-breaking but not from continuing to comment. However, hiding (rather than deleting) content had small and statistically insignificant effects. Overall, our results suggest that automated content moderation can increase adherence to community guidelines.
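The core of a fuzzy regression discontinuity design is that treatment probability jumps at a cutoff in a running variable without being fully determined by it, so the causal effect is estimated by scaling the jump in the outcome by the jump in treatment probability. A minimal sketch of this logic on synthetic data, assuming a classifier-score cutoff and a simple local-linear Wald estimator (all variable names and the data-generating process here are illustrative assumptions, not the study's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a classifier score is the running variable, cutoff at 0.
# Moderation (comment deletion) is more likely above the cutoff, but not
# deterministic -- this is what makes the design "fuzzy".
n = 100_000
score = rng.uniform(-1, 1, n)
p_treat = np.where(score >= 0, 0.8, 0.2)
treated = rng.uniform(size=n) < p_treat

# Outcome: subsequent rule-breaking comments. True causal effect of
# deletion is set to -0.5 in this simulation.
y = 2.0 + 1.0 * score - 0.5 * treated + rng.normal(0, 1, n)

def value_at_cutoff(x, v):
    """Local-linear fit of v on x; returns the intercept at the cutoff (0)."""
    slope, intercept = np.polyfit(x, v, 1)
    return intercept

# Restrict to a narrow bandwidth on each side of the cutoff.
h = 0.2
below = (score >= -h) & (score < 0)
above = (score >= 0) & (score < h)

# Wald / IV estimator: jump in outcome divided by jump in treatment rate.
jump_y = value_at_cutoff(score[above], y[above]) - value_at_cutoff(score[below], y[below])
jump_d = treated[above].mean() - treated[below].mean()
late = jump_y / jump_d  # recovers an estimate near the true effect of -0.5
print(f"estimated effect of deletion: {late:.2f}")
```

Scaling by `jump_d` is what distinguishes the fuzzy from the sharp design: because only some comments near the threshold are actually moderated, the raw outcome discontinuity understates the effect on those actually treated, and dividing by the treatment-rate jump corrects for that.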