The Complexity of Social Media Moderation
Moderating social media platforms is complex and expensive. Have we been doing it wrong? Is there a better approach? Possibly.
Jagmeet makes a short video about Bhangra dance moves, in which he repeats a hand gesture a few times. The social media platform’s automated system flags it for moderation. Janil, sitting in her contracted office in a sea of cubicles, watches the video and, based on her training, interprets the gesture as a negative political statement. She has the post deleted as inappropriate. It isn’t. But this is part of the complexity of social media moderation in a hyper-connected world.
For years, most social media platforms’ approach to moderation has rested on a sort of binary code of universal rules and broad brushes. But what if the way platforms have been approaching moderation needs to be viewed through a different lens?
This is becoming an even bigger concern as platforms like X and Facebook pull back on moderation in favour of approaches like “community notes”, abrogating their responsibilities and passing them on to users. Community-based approaches can prove useful and do work in some cases, but not as the only solution.