WASHINGTON — Last March, as false claims about vaccine safety threatened to undermine the world’s response to COVID-19, researchers at Facebook found they could reduce vaccine misinformation by tweaking how vaccine posts appear in users’ news feeds.
“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote, responding to the internal memo about the study, according to The Associated Press.
Yet despite evidence that it worked, Facebook took a full month to implement the changes at a pivotal time in the global vaccine rollout.
The AP also reports that the company shelved other suggestions from the study entirely.
Facebook’s own documents showed that the comment sections on posts are a hotbed for anti-vaccine messages. But when another researcher suggested disabling comments on vaccine posts, the idea was ignored.
“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of internet watchdog Center for Countering Digital Hate, according to the AP. “It drives attention, and attention equals eyeballs, and eyeballs equal ad revenue.”
Those revelations were contained in disclosures made to the Securities and Exchange Commission by former Facebook employee-turned-whistleblower Frances Haugen.
In response to the AP’s report, Facebook said it has made “considerable progress” in downgrading vaccine misinformation in users’ feeds.