The Verge reports on the new rules Facebook is adding to slow the spread of misinformation and other harmful content on its Groups feature. From the report: Some of the new policies encourage more active administration of groups. If administrators step down, they can invite members to take their place; if nobody does, Facebook will apparently "suggest" admin roles to members, then archive the group if that fails. Also, if a group member accrues a community standards violation, moderators will have to approve all of that member's posts for 30 days. If the moderators repeatedly approve posts that violate Facebook's guidelines, the group could be removed.
The health guidelines take a broader approach by focusing on an entire category of content, not specific rule-breaking behavior. Facebook says that although groups can "be a positive space for giving and receiving support during difficult life circumstances ... it's crucial that people get their health information from authoritative sources." Facebook also says it's continuing to limit content from militia groups and other organizations linked to violence. Groups that discuss potential violence will be removed, and Facebook will soon down-rank even non-violating content from such groups in the News Feed.