A small subset of Facebook users is reportedly responsible for the majority of content expressing or encouraging skepticism about Covid-19 vaccines, according to early results from an internal Facebook study.

The study, first reported by the Washington Post, confirms what researchers have long argued about how the echo chamber effect can amplify certain beliefs within social media communities. It also shows how speech that falls short of outright misinformation about vaccines, which is banned on Facebook, can still contribute to vaccine hesitancy.

A document outlining the study – which has not been publicly released – was obtained by the Washington Post. Researchers at Facebook divided users, groups and pages into 638 “population segments” and studied them for “vaccine hesitant beliefs”, according to the Post. This could include language such as “I am worried about getting the vaccine because it is so new”, or “I don’t know if a vaccine is safe”, rather than outright misinformation.

Each “segment” could contain as many as 3m people, meaning the study could examine the activity of more than 1bn people, less than half of Facebook’s roughly 2.8bn monthly active users, the Post reported. The massive study also underscores how much information can be gleaned from Facebook’s user base, and how the company is using this trove of data to examine public health outcomes.

The Post reported that, in the population segment with the highest incidence of vaccine hesitancy, just 111 users were responsible for half of all the content flagged within that segment. The study also found that just 10 of the 638 population segments contained 50% of all the vaccine hesitancy content on the platform.

Facebook’s research into vaccine hesitancy is part of an ongoing effort to aid public health campaigns during the pandemic, said spokeswoman Dani Lever, and is one of a number of studies being carried out by Facebook.

“We routinely study things like voting, bias, hate speech, nudity, and Covid – to understand emerging trends so we can build, refine, and measure our products,” Lever said.

Meanwhile, Facebook has, in the past year, partnered with more than 60 global health experts to provide accurate information about Covid-19 and vaccines. It announced in December 2020 that it would ban all vaccine misinformation, suspending users who break the rules and ultimately removing them if they continue to violate its policies.

The study is the latest to illustrate the outsized effect a small number of actors can have on the online information ecosystem. It comes on the heels of a report from the Election Integrity Partnership that found a handful of rightwing “super-spreaders” on social media were responsible for the bulk of election misinformation in the run-up to the Capitol attack. In that report, experts outlined a number of recommendations, including removing “super-spreader” accounts entirely.

The Facebook study also found there may be significant overlap between users who exhibit anti-vaccination behavior on Facebook and supporters of QAnon, a baseless conspiracy theory surrounding a “deep state” cabal of Democrats and Hollywood celebrities engaging in pedophilia and sex trafficking.

The overlap shows another long-term effect of the rise of QAnon, which has also been tied to the insurrection at the Capitol in January. Many far-right actors, including followers and proponents of QAnon, understand how to manipulate social media algorithms to reach broader audiences, said Sophie Bjork-James, a professor of anthropology at Vanderbilt University who researches the white nationalist movement in the US.

“QAnon is now a public health threat,” Bjork-James said. “Over the last year, QAnon has spread widely within the online anti-vaccination community and by extension the alternative health community. The Facebook study shows we will likely be facing the consequences of this for some time to come.”


