- Facing criticism that it spreads extremism and misinformation, Facebook is taking new steps to respond.
- It will more closely monitor groups that spread fake information and limit the visibility of links that are far more prominent on Facebook than across the rest of the web.
- It’s also adding more expert fact checkers from outside the company.
Facebook says it’s rolling out a wide range of updates aimed at combating the spread of false and harmful information on the social media site. The updates will limit the visibility of links found to be significantly more prominent on Facebook than across the web as a whole. Facebook is calling this a “click-gap” signal.
The company is also expanding its fact-checking program with outside expert sources, including The Associated Press, to vet videos and other material posted on the social network. Facebook Groups will also be more closely monitored to prevent the spread of fake information, including “reducing the reach of Facebook Groups that repeatedly share misinformation.”
The company has been facing criticism over the spread of extremism and misinformation on its flagship site and on Instagram. Members of Congress questioned a company representative Tuesday about how Facebook prevents violent material from being uploaded and shared on the site.
In a post on the Facebook Newsroom page, Guy Rosen, VP of integrity, and Tessa Lyons, head of News Feed integrity, wrote: “Since 2016, we have used a strategy called ‘remove, reduce, and inform’ to manage problematic content across the Facebook family of apps. This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can choose what to click, read or share. This strategy applies not only during critical times like elections, but year-round.”