Back in October, and ahead of a contentious presidential election, Facebook temporarily stopped using algorithms to recommend political and civic groups to its users. Now, as part of broader efforts to curb political friction on the social media platform, Facebook CEO Mark Zuckerberg announced this week that the policy will be a permanent one.
Watchdog and advocacy groups, though, have raised concerns that go far beyond political banter, saying Facebook groups have been used by extremists to spread misinformation online and to organize dangerous attacks. Facebook’s move to permanently stop recommending political groups comes in the aftermath of the Jan. 6 insurrection at the Capitol staged by loyalists to former President Donald Trump. It also aligns with new warnings from the Department of Homeland Security that the country faces a mounting threat from “violent domestic extremists” who were emboldened by the Capitol attack.
While Homeland Security’s domestic terrorism alert didn’t pinpoint any specific groups that could be plotting future attacks, the department warned that their motivations would include anger stemming from “the presidential transition, as well as other perceived grievances fueled by false narratives.”
In a call with analysts to discuss the company’s fourth-quarter earnings, Zuckerberg said he wants to cut back altogether on the amount of political content users see in their Facebook feeds.
“One of the top pieces of feedback that we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services,” Zuckerberg said in the call, according to CNBC.
Facebook Groups are communities on the social media platform that form around shared interests. Those that are public can be searched for and joined by anyone on Facebook.
U.S. Sen. Ed Markey, D-Mass., has lambasted Facebook’s system for recommending political groups, saying they pose grave threats to American democracy and public safety. In a letter he sent to Zuckerberg, he said the groups “are breeding grounds for hate, echo-chambers of misinformation, and venues for coordination of violence, including explicit planning for the insurrection at the US Capitol on January 6, 2021.”
Researchers from the University of Colorado Boulder published a paper last fall in the journal Human Communication Research showing that Facebook tends to be a more fertile breeding ground for fake news than Twitter, and that users at the far ends of the liberal-conservative spectrum are the most likely to hit the share button.
What do you think of Facebook’s strategy to curtail political content in your news feed and to stop recommending political groups?