In 2021 a former Facebook employee accused the social media network's algorithms of fueling the spread of misinformation.
Prior to the 2020 presidential election, Facebook enacted measures to reduce the spread of misinformation. The whistleblower alleged that the company removed those measures after the election, allowing misinformation to spread rapidly in the lead-up to the insurrection at the U.S. Capitol.
Rebekah Tromble, a researcher at George Washington University, said, "Facebook has an official program for flagging mis- and disinformation; it partners with fact-checkers. But on the other hand, it doesn't do enough to ensure that those fact checks, algorithmically, get in front of the people who actually saw the disinformation in the first place."
Tromble explained that maintaining an algorithm that prevents misinformation would be costly to the social media giant.
She noted that if Facebook were serious about combating misinformation, it would "have to fundamentally change the underlying algorithm which ultimately motivates, incentivizes the sorts of disinformation that are fueled by fear and anger to run rampant on the platform."
— Dr. Rebekah Tromble, associate professor and director of the Institute for Data, Democracy and Politics at George Washington University