Trolls clearly enjoy the amplification effect of spreading hate speech to everyone subscribed to a group, but groups are also abused by spammers and can fill up with off-topic posts or replies that probably aren't of general interest. It's hard to tell automatically whether a reply or a new post is likely to contribute to the conversation.
Pre-moderation of posts by volunteer moderators could make groups more useful and less prone to abuse. I've seen this model work well in other social media services, particularly for large groups.
If a group is configured with a designated set of moderators, then it could, for example, boost a post only once one of those moderators has liked it.
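A minimal sketch of that rule, with hypothetical `Group` and `Post` structures standing in for whatever the server actually stores (none of these names come from a real implementation):

```python
from dataclasses import dataclass, field


@dataclass
class Post:
    author: str
    liked_by: set[str] = field(default_factory=set)  # actor IDs who liked it


@dataclass
class Group:
    moderators: set[str]  # actor IDs configured as moderators

    def should_boost(self, post: Post) -> bool:
        # Boost only if at least one configured moderator has liked the post.
        return bool(self.moderators & post.liked_by)


group = Group(moderators={"@alice@example.social"})
post = Post(author="@spammer@example.net", liked_by={"@bob@example.net"})
print(group.should_boost(post))  # no moderator like yet -> False
post.liked_by.add("@alice@example.social")
print(group.should_boost(post))  # moderator has liked it -> True
```

A real server would hook this check into its handling of incoming posts addressed to the group, queuing anything unapproved instead of boosting it immediately.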
Sometimes it feels like there are more posts by scammers asking for money from instances where moderators believe (or even are?) the scammers and don't take any action than real posts.