WASHINGTON: Sixteen months before last November's presidential election, a Facebook researcher described an alarming development. Within a week of opening an experimental account, she wrote in an internal report, she was being served content promoting the QAnon conspiracy theory.
On November 5, two days after the election, another Facebook staff member posted a message alerting colleagues that comments containing “combustible election misinformation” were visible under numerous posts. Four days later, a company data scientist wrote in a note to colleagues that 10% of all U.S. views of political material – a surprisingly large share – were of posts alleging the vote was fraudulent.
In each case, Facebook employees sounded the alarm about disinformation and inflammatory content on the site and urged action, but the company either failed or struggled to resolve the issues. New internal Facebook documents provided by former employee-turned-whistleblower Frances Haugen offer insight into how the company appears to have stumbled in its handling of the January 6 riot. It quickly became apparent that even after years of being under the microscope for insufficiently monitoring its platform, the company had failed to grasp how rioters had spent weeks vowing – on Facebook itself – to prevent Congress from certifying Joe Biden’s electoral victory.
Facebook blamed the proliferation of election lies on former President Donald Trump and other social platforms. In January, Sheryl Sandberg, Facebook’s chief operating officer, said the January 6 riot at the Capitol was “largely organized on platforms that don’t have our capacity to stop hate.” Facebook CEO Mark Zuckerberg told lawmakers in March that the company “did our part to ensure the integrity of our election.” But the documents show how aware Facebook was of extremist groups on its site that sought to polarize American voters.
What the documents don’t offer is a full picture of decision-making within Facebook. Some internal studies suggested the company struggled to control the speed at which information spread, while other reports hinted that Facebook was worried about losing engagement. Yet what was undeniable was that Facebook’s own employees believed the social network could have done more.