Facebook is radicalizing its users
In the summer of 2019, a new Facebook user named Carol Smith joined the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith's account indicated an interest in politics, parenting and Christianity, and followed a few of her favorite brands, including Fox News and then-President Donald Trump.
Although Smith had never expressed an interest in conspiracy theories, within just two days Facebook was recommending that she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.
Smith did not follow the recommended QAnon groups, but whatever algorithm Facebook used to determine how she should engage with the platform pushed ahead just the same. Within a week, Smith's feed was full of groups and pages that violated Facebook's own rules, including those against hate speech and misinformation.
Smith was not a real person. A Facebook researcher invented the account, along with other "test user" accounts, in 2019 and 2020 as part of an experiment studying the platform's role in misinforming and polarizing users through its recommendation systems.
That researcher said Smith's Facebook experience was "a barrage of extreme, conspiratorial and graphic content."
The body of research consistently found that Facebook pushed some users into "rabbit holes," increasingly narrow echo chambers where violent conspiracy theories flourished. People radicalized through these rabbit holes make up a small slice of total users, but at Facebook's scale, that can mean millions of individuals.
The findings, detailed in an internal report titled "Carol's Journey to QAnon," were among thousands of pages of documents included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by legal counsel for Frances Haugen, who worked as a Facebook product manager until May. Haugen is now asserting whistleblower status and has filed several specific complaints alleging that Facebook puts profit over public safety. Earlier this month, she testified about her claims before a Senate subcommittee.
Copies of the disclosures, which redacted the names of researchers, including the author of "Carol's Journey to QAnon," were shared digitally and reviewed by a consortium of news organizations, including NBC News. The Wall Street Journal published a series of reports based on several of the documents last month.
“While this was a study of a single virtual user, it is an excellent example of the research the company is doing to improve our systems and helped inform our decision to remove QAnon from the platform,” a Facebook spokesperson said in response to questions via email.
Facebook CEO Mark Zuckerberg has broadly denied Haugen's allegations, defending his company's "industry-leading research program" and its commitment to "identify important issues and work on them." The documents released by Haugen partly support those claims, but they also highlight the frustrations of some of the employees engaged in that research.
Among Haugen's disclosures are research, reports and internal posts that suggest Facebook has long known that its algorithms and recommendation systems push some users to extremes. And while some managers and executives ignored those internal warnings, anti-vaccine groups, conspiracy theory movements and disinformation agents took advantage of that inaction, threatening public health, personal safety and democracy at large.
"These documents effectively confirm what outside researchers have been saying for years, which Facebook has often dismissed, " said Renee DeResta, director of technical research at the Stanford Internet Observatory and one of the first indications of the dangers of Facebook's recommendation algorithms .
Facebook's research shows how easy it is for a relatively small group of users to hijack the platform, and for DiResta, it settles any remaining question about Facebook's role in the growth of conspiracy networks.
"Facebook literally helped facilitate the cult," she said.
A pattern at Facebook
For years, company researchers had been running experiments like the Carol Smith one to gauge the platform's hand in radicalizing users, according to the documents seen by NBC News.
That internal work repeatedly found that recommendation tools pushed users toward extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are part of a sprawling, ever-evolving system widely known as "the algorithm" that pushes content to users. But the research at the time stopped short of inspiring any move to change the groups and pages themselves.
Haugen told reporters this month that this hesitancy reflected "a pattern at Facebook": "They want the shortest path between their current policies and any action."
"There is a great deal of reluctance to proactively solve problems," Haugen added.
A Facebook spokesperson disputed the suggestion that the research did not prompt the company to act and pointed to changes to groups announced in March.
As QAnon followers committed real-world violence in 2019 and 2020, groups and pages related to the conspiracy theory grew rapidly, according to internal documents. The documents also show how teams inside Facebook took concrete steps to understand and address those issues, steps some employees saw as too little, too late.
By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unpublished internal investigation.
A year after the FBI designated QAnon a potential domestic terrorist threat in the wake of standoffs, planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a "violence-inciting conspiracy network" and banned it from the platform, along with militias and other violent social movements. A small team working across several Facebook departments found that its platforms had hosted hundreds of ads on Facebook and Instagram, worth thousands of dollars and millions of views, "praising, supporting or representing" the conspiracy theory.
A Facebook spokesperson said in an email that the company "has taken a more aggressive approach to how we reduce content that potentially violates our policies, as well as not recommending groups, Pages, or people who regularly post content that potentially violates our policies."
For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company's internal message board.
“We've known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher, whose name was redacted, wrote in a post announcing their departure from the company. “This fringe group has grown to national prominence, with QAnon candidates for Congress and QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had spiraled into a dire state.”
We should be concerned
While the Facebook ban seemed effective at first, a problem remained: removing groups and pages did not eliminate the most radical QAnon followers, who continued to organize on the platform.
Believers simply rebranded as anti-child-trafficking groups or migrated to other communities, including those around the anti-vaccine movement, said Marc-André Argentino, a research fellow at the International Centre for the Study of Radicalisation at King's College London, who has extensively studied QAnon.
It was a natural fit. Researchers inside Facebook who studied the platform's niche communities found that violent conspiratorial beliefs were linked to Covid-19 vaccine hesitancy. In one study, researchers found that members of the QAnon community were also highly concentrated in anti-vaccine communities. Anti-vaccine influencers likewise took advantage of the pandemic, using Facebook features like groups and livestreaming to grow their movement.
"We do not know if QAnon created the preconditions for vaccine frequency beliefs," the researchers wrote. “It may not matter either way. We should care about the people affected by both problems.”