Community Standards Policies
Facebook's Community Standards on false news state: "There is also a fine line between false news and satire or opinion. For these reasons, we don't remove false news from Facebook but instead, significantly reduce its distribution by showing it lower in the News Feed."
Regarding inauthentic behavior, Facebook states that users must not:
"Engage in or claim to engage in Inauthentic Behavior, which is defined as the use of Facebook or Instagram assets (accounts, pages, groups, or events), to mislead people or Facebook:
- about the identity, purpose, or origin of the entity that they represent,
- about the popularity of Facebook or Instagram content or assets,
- about the purpose of an audience or community,
- about the source or origin of content,
- to evade enforcement under our Community Standards"
Facebook also bans "videos that have been edited or synthesized, beyond adjustments for clarity or quality." This includes videos in which the subject appears to say things they did not actually say and videos that are the product of artificial intelligence or machine learning. The ban does not extend to parody or satire.
Timeline of Changes:
November 10, 2016: Facebook CEO Mark Zuckerberg denies that "fake news" on Facebook influenced the U.S. presidential election.
December 2016: Facebook pledges to enlist third-party fact-checkers to combat misinformation.
April 27, 2017: Facebook acknowledges that governments and non-state actors were using the platform to influence political discourse.
October 2017: Facebook announces that ads linked to Russia's Internet Research Agency were seen by an estimated 10 million people before and after the election. In November, Facebook raises the estimate to 126 million users.
January 4, 2018: Mark Zuckerberg makes it his annual personal challenge to protect Facebook users from interference by nation states.
March 2018: Evidence mounts that AggregateIQ used Facebook data to influence the Brexit vote in the U.K.
April 2018: Zuckerberg testifies before Congress and apologizes for Russian interference and for the harvesting of users' private data by a company affiliated with the Trump campaign.
July 2018: Facebook warns of increased expenses, due in part to strengthening security and hiring more moderators.
September 5, 2018: The CEOs of Facebook and Twitter pledge to better protect their social networks against manipulation and foreign intervention. By this point, both companies were using artificial intelligence to detect manipulation.
November 7, 2018: In the lead-up to the midterm elections, Facebook blocks an unspecified number of accounts suspected of participating in foreign disinformation campaigns. Some of the accounts were believed to be associated with Russia's Internet Research Agency.
October 2019: Facebook announces new security measures, including a special security tool that monitors hacking attempts against candidates and elected officials. Facebook also pledges to label state-controlled media as such, invest more in media literacy, and label fact-checks more clearly. It continues, however, to allow politicians to run advertisements containing misinformation.
December 2019: Facebook bans misleading information about how to participate in the U.S. Census.
February 2020: Facebook removes COVID-19 disinformation that could cause "imminent physical harm" from both Facebook and Instagram, displays pop-ups from the WHO and other public health authorities in relevant searches, and bans ads promoting COVID-19 cures.
March 2020: Facebook commits $2 million in additional funding for fact-checking organizations and $100 million for the news industry. The platform provides free advertising space for the WHO and other health agencies to raise awareness, launches a COVID-19 information center with the WHO, and launches Get Digital, a media literacy initiative targeting youth.
May–June 2020: Facebook declines to remove President Trump's posts suggesting that protestors in Minneapolis could be shot, as well as his misinformation about voting by mail.
June 2020: Facebook adds labels to voting-related posts that direct users to information from state and local election officials. Facebook also launches the Voting Information Center and begins prioritizing original news reporting.
August 2020: Facebook begins to restrict QAnon content but does not ban it.
September 3, 2020: Facebook announces it will curb new political ads in the seven days before the U.S. election.
October 6, 2020: Facebook bans all groups that openly support QAnon.
October 7, 2020: Facebook announces plans to halt all political advertisements after polls close on November 3 and to label posts that cast doubt on the election with links to official information. Facebook also begins banning messages that promote carrying weapons to polling places.
October 12, 2020: Facebook bans content that denies the Holocaust or distorts facts about it.
January 7, 2021: Facebook bans Donald Trump from posting on Facebook or Instagram for the remaining duration of his presidency.
January 27, 2021: Facebook steps up removal of Holocaust denial content from the platform.
February 8, 2021: Facebook begins removing additional vaccine misinformation, such as claims that COVID-19 is man-made or that contracting the disease is safer than getting the vaccine. Despite this, a wealth of COVID-19 disinformation remains on Facebook.