Facebook implements new moderation policies in response to U.S. Capitol violence
What you need to know
- Facebook tonight announced updates to its content moderation policies in response to the violence at the U.S. Capitol this afternoon.
- The company also said it would update the warning label applied to posts about the election to note that Joe Biden has been confirmed as the winner.
- It also confirmed that all moderation policies introduced in the lead-up to the election would remain active.
Facebook tonight announced an update to its content moderation policies following the violent insurrection at the U.S. Capitol this afternoon.
The company said it has been removing content that praises the incident, calls for armed support, or aims to incite a repeat tomorrow or in the coming days. It also acknowledged removing the video President Trump posted following the event, noting that it "contribute[d] to, rather than diminish[ed], the risk of ongoing violence."
Facebook will also be updating the electoral misinformation labels it introduced last year to read "Joe Biden has been elected President with results that were certified by all 50 states. The US has laws, procedures, and established institutions to ensure the peaceful transfer of power after an election."
It will keep the other policies and measures introduced in the lead-up to the election active, and it is adding new ones, including:
- Expanding the requirement for Group admins to review and approve posts before they can go up
- Automatically disabling comments on posts in Groups that start to have a high rate of hate speech or content that incites violence, and
- Using AI to demote content that likely violates its policies.
Facebook's move follows similar action from social media competitor Twitter, which took the more drastic steps of suspending the outgoing President's account and deleting offending tweets.