
What Facebook knew about how it radicalized users


This illustration, taken on October 4, 2021, shows the Facebook logo and stock symbol through broken glass.

Dado Ruvic | Reuters

In the summer of 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Her account indicated an interest in politics, parenting and Christianity, and she followed a few of her favorite brands, including Fox News and then-President Donald Trump.

Though Smith had never expressed interest in conspiracy theories, within two days Facebook was recommending that she join groups dedicated to QAnon, the sprawling conspiracy theory and movement claiming that Trump was secretly saving the world.

Smith didn’t follow the recommended QAnon pages, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead anyway. Within a week, her feed was full of groups and pages that violated Facebook’s own rules.

Smith wasn’t a real person. Facebook researchers created the account, along with others, in 2019 and 2020 to test how effectively the platform’s recommendation systems misinformed and polarized users.

The researcher behind the experiment described Smith’s experience on Facebook as a barrage of “extreme, conspiratorial, graphic content.”

The research showed that Facebook played a major role in pushing some users into so-called “rabbit holes”: increasingly narrow echo chambers that promoted conspiracy theories and violence. Those users make up only a fraction of Facebook’s total user base, but at Facebook’s scale that fraction can translate to millions of people.

The findings are contained in a report titled “Carol’s Journey to QAnon,” one of thousands of pages of documents disclosed to the Securities and Exchange Commission and provided in redacted form by legal counsel for Frances Haugen, who worked as a Facebook product manager until May. Haugen has claimed whistleblower status and filed numerous complaints alleging that Facebook puts profit above public safety. She testified about her claims before a Senate subcommittee earlier this month.

Versions of the disclosures — which redacted the names of researchers, including the author of “Carol’s Journey to QAnon” — were shared digitally and reviewed by a consortium of news organizations, including NBC News. The Wall Street Journal published several reports that were based on the information last month.

A Facebook spokesperson said that while the study focused on a single hypothetical user, it illustrates the breadth of research the company does to improve its systems, and that the work helped inform the decision to remove QAnon from the platform.

Facebook CEO Mark Zuckerberg has broadly denied Haugen’s claims, defending the company’s “industry-leading” research program and its commitment to “identify and address important issues.” The documents Haugen released partially support those claims, but they also reveal the frustrations of some of the employees involved in that research.

Among Haugen’s disclosures are research, reports and posts from internal Facebook groups suggesting the company has long known that its recommendation algorithms push some users toward extremes. While some managers and executives ignored the internal warnings, conspiratorial movements, anti-vaccine activists and disinformation agents took advantage of that permissiveness, threatening public health, personal safety and democracy at large.

“These documents effectively confirm what outside researchers were saying for years prior, which was often dismissed by Facebook,” said Renée DiResta, technical research manager at the Stanford Internet Observatory and one of the earliest harbingers of the risks of Facebook’s recommendation algorithms.

Facebook’s own research shows how a relatively small contingent of users has been able to hijack the platform. For DiResta, it settles any remaining questions about Facebook’s role in the growth of conspiracy networks.

“Facebook helped facilitate the formation of a cult,” she said.

“A pattern at Facebook”

Company researchers had been running experiments like Carol Smith’s for years to gauge the platform’s role in radicalizing users, according to the documents obtained by NBC News.

That internal work repeatedly found that recommendation tools pushed users into extremist groups. The findings helped inform policy changes and tweaks to recommendations and news feed rankings, the complex, constantly evolving system, widely known as “the algorithm,” that decides which content gets pushed to which users. But the research stopped short of inspiring any movement to change the groups and pages themselves.
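The documents describe that ranking system only at a high level. As a rough, hypothetical illustration of why engagement-optimized ranking can end up favoring provocative material, a toy scorer might look like the sketch below; the Post fields, weights and example posts are invented for illustration and are not drawn from Facebook’s actual systems.

```python
# Illustrative sketch only: a toy engagement-weighted ranker, not Facebook's
# actual system. All names, weights and example posts are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float    # model's estimate of click probability
    predicted_comments: float  # model's estimate of comment probability
    predicted_reshares: float  # model's estimate of reshare probability

# Hypothetical weights: signals that read as "engagement" are rewarded,
# regardless of whether the underlying content is reliable.
WEIGHTS = {"clicks": 1.0, "comments": 5.0, "reshares": 10.0}

def score(post: Post) -> float:
    """Collapse predicted engagement signals into a single ranking score."""
    return (WEIGHTS["clicks"] * post.predicted_clicks
            + WEIGHTS["comments"] * post.predicted_comments
            + WEIGHTS["reshares"] * post.predicted_reshares)

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts so the highest-scoring content is shown first."""
    return sorted(candidates, key=score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("benign-recipe", 0.20, 0.01, 0.01),
        Post("outrage-bait", 0.30, 0.25, 0.40),  # provocative posts often draw more engagement
    ])
    print([p.post_id for p in feed])  # ['outrage-bait', 'benign-recipe']
```

In a scorer like this, content that provokes comments and reshares wins the ranking even when it is misleading, which is the dynamic the internal research describes.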

Haugen said this month that the reluctance was indicative of “a pattern at Facebook”: the company wants to find the shortest path between its existing policies and any action.

There is a great deal of hesitancy, Haugen added, about solving problems proactively.

A Facebook spokesperson disputed the idea that the research had not pushed the company to act, pointing to changes made to groups in March.

While QAnon followers committed real-world violence in 2019 and 2020, groups and pages related to the conspiracy theory skyrocketed, according to internal documents. The documents also show how teams inside Facebook took concrete steps to understand and address those issues, some of which employees saw as too little, too late.

By summer 2020, Facebook was hosting thousands of private QAnon groups and pages with millions of members and followers, according to an unreleased internal investigation.

A year after the FBI designated QAnon a potential domestic terrorist threat in the wake of standoffs, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and removed it from the platform, along with other violent social movements. A small team working across several of Facebook’s departments found that Facebook and Instagram had hosted hundreds of ads, worth thousands of dollars and millions of views, “praising,” supporting or representing the conspiracy theory.

A Facebook spokesperson said in an email that the company has “taken more aggressive approaches in how we reduce content that is likely to violate our policies, in addition to not suggesting Groups, Pages and people who frequently post content that violates our policies.”

For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher wrote in a post announcing her departure from the company. “This fringe group has grown to national prominence, with QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had spiraled into a dire state.”

“We need to be worried”

While Facebook’s ban initially appeared effective, a problem remained: removing pages and groups didn’t wipe out QAnon’s most extreme followers, who continued to organize on the platform.

“There was enough evidence to raise red flags in the expert community that Facebook and other platforms failed to address QAnon’s violent extremist dimension,” said Marc-André Argentino, a research fellow at King’s College London’s International Centre for the Study of Radicalisation, who has extensively studied QAnon.

Believers simply rebranded as anti-child-trafficking groups or migrated to other communities, including those around the anti-vaccine movement.

It was a natural fit. Facebook researchers had found that violent conspiratorial beliefs were connected to Covid-19 vaccine hesitancy, and in one study they found that QAnon community members were also heavily concentrated in anti-vaccine communities. Anti-vaccine activists, meanwhile, had seized on the opportunity of the pandemic and used Facebook’s groups and livestreaming features to grow their movements.

The researchers wrote that they did not know whether QAnon created vaccine hesitancy beliefs, and that it might not matter either way: both problems affect the same people.

QAnon followers also jumped to groups promoting former President Donald Trump’s false claim that the 2020 election was stolen, groups that trafficked in a hodgepodge of conspiracy theories alleging that voters, Democrats and election officials were cheating Trump out of a second term. This new coalition, largely organized on Facebook, ultimately stormed the U.S. Capitol on Jan. 6, according to an internal report first covered by BuzzFeed News in April.

The report found that these conspiracy groups had become some of the fastest growing on Facebook, but that Facebook wasn’t able to stop their “meteoric rise,” as the researchers wrote, because it looked at each entity individually rather than as part of a cohesive movement. A Facebook spokesperson told BuzzFeed News that the company took many steps to limit election misinformation but that it was unable to catch everything.

Facebook’s enforcement was “piecemeal,” the researchers wrote, adding that the team was building protocols and holding policy discussions to help it do better next time.

‘A head-heavy problem’

The Capitol attack prompted frank self-reflection from employees.

One team invoked the lessons learned during QAnon’s rise to warn about permissiveness toward anti-vaccine groups and content, which researchers found made up nearly half of all vaccine content impressions on the platform.

“In rapidly developing situations, we have often taken little action at first due to a combination of policy and product limits making it extremely difficult to design, obtain approval for and roll out new interventions fast,” the report said. It offered QAnon as an example of a crisis on which Facebook initially took “limited or zero action” and was later prompted by societal concern over the resulting harms to carry out entity takedowns.

Beyond the response to the effort to overturn the election, the riot also prompted more proactive efforts to clean up the platform.

Facebook’s “Dangerous Content” team formed a working group in early 2021 to figure out how to deal with the kinds of users who had long been a challenge for the company: communities including QAnon, Covid deniers and the misogynist incel movement, as well as other hate and terrorist groups that pose a danger to individuals and society.

The goal wasn’t to eradicate these communities but to curb the growth of the newly branded “harmful topic communities” using the same algorithmic tools that had allowed them to grow out of control.

The team wrote that it knew how to detect and remove harmful content and adversarial actors, but that it still did not understand how to manage the communities themselves.

In a February report, the integrity team got creative: it described an internal system, built on Facebook’s existing recommendation infrastructure, intended to keep users from being exposed to societal harms such as radicalization, polarization and discrimination. The program built on an earlier research effort called “Project Rabbithole” and was named Drebbel, after Cornelis Drebbel, the 17th-century Dutch engineer famed for building the first navigable submarine and the first thermostat.

Drebbel’s role was to identify, and ultimately stop, the harmful paths users take on Facebook and Instagram toward destinations such as anti-vaccine and QAnon communities.

In a post, the Drebbel research team praised the earlier work on test users and said it believed Drebbel could scale that work up significantly.

Group joins can serve as a signal that people are moving toward disruptive and harmful communities, the group stated in a post to Workplace, Facebook’s internal message board, and disrupting that path can prevent harm from occurring.
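As a loose sketch of how such a “group joins” signal might be used in practice, the snippet below flags a user whose recent joins include several known gateway groups; the group names, the threshold and the should_flag helper are hypothetical and are not taken from the documents.

```python
# Hypothetical sketch of the "group joins as a signal" idea described above.
# The gateway-group list, threshold and data shapes are invented for illustration.

# Assumed: a curated set of group IDs already judged to feed harmful communities.
GATEWAY_GROUPS = {"g_qanon_adjacent", "g_antivax_gateway", "g_stopthesteal"}

def should_flag(recent_joins: list[str], threshold: int = 2) -> bool:
    """Flag a user if enough of their recent group joins fall in the gateway set,
    so that softer interventions (e.g., pausing related recommendations) can kick in."""
    hits = sum(1 for group_id in recent_joins if group_id in GATEWAY_GROUPS)
    return hits >= threshold

# Example: two of this user's last five joins are gateway groups, so they are flagged.
recent = ["g_gardening", "g_qanon_adjacent", "g_local_news", "g_antivax_gateway", "g_parenting"]
print(should_flag(recent))  # True
```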

The Drebbel group features prominently in Facebook’s “Deamplification Roadmap,” a multistep plan posted to Workplace on Jan. 6 that includes an extensive audit of recommendation algorithms.

In March, the Drebbel group posted an update on its study and suggested a way forward: if researchers could systematically identify the “gateway” groups that feed into anti-vaccination and QAnon communities, they wrote, Facebook might be able to put up roadblocks to keep people from falling down the rabbit hole.

Drebbel’s “Gateway Groups” study looked at a collection of QAnon and anti-vaccine groups that had been removed for violating policies around misinformation, violence and incitement, and used the membership of those purged groups to study how users had been pulled in. Drebbel identified 5,931 QAnon groups with 2.2 million total members, half of whom joined through so-called gateway groups. For 913 anti-vaccination groups with 1.7 million members, the study identified 1 million gateway groups. Facebook has acknowledged that it needs to do more.
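To make the arithmetic behind that “half joined through gateway groups” finding concrete, here is a minimal sketch of how such a share could be computed from join-path records; the share_via_gateways function, data shapes and toy numbers are assumptions for illustration, not Drebbel’s actual method.

```python
# Illustrative reconstruction of the kind of aggregate the study reports: for the
# members of a removed community, what share arrived via a known "gateway" group?
# The data shapes and toy numbers are invented; only the arithmetic mirrors the text.

def share_via_gateways(join_paths: dict[str, list[str]], gateways: set[str]) -> float:
    """join_paths maps each member to the groups they joined before entering the
    removed community; a member counts as 'via gateway' if any of those groups
    is a known gateway group."""
    if not join_paths:
        return 0.0
    via_gateway = sum(1 for path in join_paths.values()
                      if any(group in gateways for group in path))
    return via_gateway / len(join_paths)

# Toy example: two of four members arrived through a gateway group (share = 0.5),
# echoing the report's finding that half of QAnon group members joined that way.
paths = {
    "user_a": ["g_wellness", "g_antivax_gateway"],
    "user_b": ["g_local_buy_sell"],
    "user_c": ["g_qanon_adjacent"],
    "user_d": ["g_parenting"],
}
print(share_via_gateways(paths, {"g_qanon_adjacent", "g_antivax_gateway"}))  # 0.5
```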

In an earlier report, Facebook integrity workers had warned that anti-vaccine movements could grow more extreme.

“Expect to find a bridge between the online and offline worlds,” the report said. Motivated users might form sub-communities and collaborate with others to take action against vaccination.

A separate cross-department group reported this year that vaccine hesitancy in the U.S. closely resembled the QAnon and Stop the Steal movements, “primarily driven by authentic actors” and community building.

The team wrote that, like many problems at Facebook, this was a head-heavy one: a relatively small number of actors create a large share of the content and growth.

A Facebook spokesperson said the company has “focused on outcomes” in relation to Covid-19 and that vaccine hesitancy has declined by half, according to a survey it conducted with Carnegie Mellon University.

Whether Facebook’s integrity initiatives can keep the next conspiracy theory movement from forming, or prevent the violent organization of existing movements, remains to be seen. But their policy recommendations may carry more weight now that the violence of Jan. 6 has laid bare the outsize influence and dangers of even small extremist communities.

“The power of communities, when they are based on harmful topics and ideologies, can pose a greater threat than any individual piece of content, adversarial actor or malicious network,” a 2021 report concluded.

A Facebook spokesperson said the recommendations in the “Deamplification Roadmap” are on track. She called it important work and wrote, “We have a long history of using our research for changes to our applications.” Drebbel is consistent with that approach, she said, and helped inform the company’s decision this year to permanently stop recommending news, civic or political groups on its platforms, work Facebook expects to remain an important part of product and policy decision-making going forward.

Frances Haugen, a former Facebook employee, arrives to testify during the Senate Commerce, Science and Transportation Subcommittee on Consumer Protection, Product Safety, and Data Security hearing titled Children's Online Safety-Facebook Whistleblower, in Russell Building on Tuesday, October 5, 2021.

Watch Facebook whistleblower Frances Haugen’s full testimony before the Senate
