Everybody knows it’s happening, but what do we know about it? A Facebook misinformation op-ed
October 19, 2021
Frances Haugen was identified in a “60 Minutes” interview on Sunday, October 3, as
the woman who anonymously filed complaints with federal law enforcement alleging that
Facebook’s own research shows how the platform magnifies hate and misinformation.
“Facebook, over and over again, has shown it chooses profit over safety,” she said. Haugen,
who testified before Congress earlier this month, said she hopes that by coming forward she
will prompt the government to put regulations and laws in place to govern the activities of
Facebook and other social media platforms.
Preparing for the 2020 election, Facebook implemented safeguards and other precautions to
stop the spread of misinformation, but removed them entirely after Biden won the
election, in the weeks leading up to the U.S. Capitol riot on January 6th.
“And as soon as the election was over, they turned them back off or they changed the settings
back to what they were before, to prioritize growth over safety,” said Frances Haugen.
She said Facebook prematurely turned off safeguards designed to stop the spread of
misinformation after Joe Biden defeated Donald Trump in last year’s election, and that this
misinformation contributed to the Jan. 6 invasion of the U.S. Capitol.
After the election, Facebook dissolved its civic integrity unit, where Haugen had been
working. That, she said, was the moment she realized, “I don’t trust that they’re willing to
actually invest what needs to be invested to keep Facebook from being dangerous.”
In 2019, a year after Facebook changed its algorithm to encourage engagement, its own
researchers identified a problem, according to internal company documents.
“The company set up a fake Facebook account, under the name ‘Carol,’ as a test and followed
then-President Trump, first lady Melania Trump and Fox News. Within one day, the algorithm
recommended polarizing content. The next day, it recommended conspiracy theory content,
and in less than a week, the account received a QAnon suggestion,” the internal documents
said.
By the second week, the fake account’s News Feed was filled with misleading or false content. In the
third week, “the account’s News Feed is an intensifying mix of misinformation, misleading and
recycled content, polarizing memes, and conspiracy content, interspersed with occasional
engagement bait,” the internal documents said.
The problem lies in the algorithms that decide what shows up on users’ feeds and in how
they favor hateful content. Haugen said a 2018 change to the content flow “contributed to
more divisiveness and ill will in a network ostensibly created to bring people closer together.”
While speaking to “60 Minutes,” Haugen explained how the misleading and fake content
reaches Facebook users.
“There were a lotta people who were angry, fearful. So, they spread those groups to more
people. And then when they had to choose which content from those groups to put into
people’s News Feed, they picked the content that was most likely to be engaged with, which
happened to be angry, hateful content.”
“So, imagine you’re seein’ in your News Feed every day the election was stolen, the election
was stolen, the election was stolen. At what point would you storm the Capitol, right?” Haugen
said.
“And you can say, ‘How did that happen?’ Right? Like, ‘Why are we taking these incredibly out-
there topics? QAnon, right, crazy conspiracies. Why are these the things that Facebook is
choosing to show you?’ And it’s because those things get the highest engagement,” Haugen
said, comparing it to “gasoline on a fire.”
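To make that mechanism concrete, here is a minimal, purely hypothetical sketch of engagement-based ranking, the kind of scoring Haugen describes. None of this is Facebook’s actual code: the post fields, the weights, and the rank_feed function are invented for illustration. The point is simply that a feed sorted by predicted engagement never asks whether a post is true, only whether people will react to it.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float
    predicted_comments: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    # Hypothetical weights: comments and re-shares count far more than
    # likes, loosely echoing reporting on Facebook's 2018 ranking change.
    # The exact numbers here are invented for illustration only.
    return (1.0 * post.predicted_likes
            + 5.0 * post.predicted_comments
            + 30.0 * post.predicted_shares)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Sort candidate posts by predicted engagement, highest first.
    # Nothing in the score measures accuracy or harm, only reaction.
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Local bake sale this weekend", 40, 2, 1),
        Post("OUTRAGE: they are lying to you!", 25, 60, 45),
    ])
    for post in feed:
        print(f"{engagement_score(post):7.1f}  {post.text}")
```

Run on the two sample posts, the outrage post scores 1,675 against the bake sale’s 80 and lands at the top of the feed: the “gasoline on a fire” dynamic in miniature.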
Haugen testified before the Senate Commerce Committee on Tuesday, October 5th, repeating her
central charge: “Facebook, over and over again, has shown it chooses profit over safety.”