Massive Facebook study on users’ doubt in vaccines finds a small group appears to play a big role in pushing the skepticism

While Facebook has banned outright false and misleading statements about coronavirus vaccines since December, a huge realm of expression about vaccines sits in a gray area. One example could be comments by someone expressing concern about side effects that are more severe than expected. Such comments could both foster meaningful conversation and surface unknown side-effect information to health authorities — but at the same time they may contribute to vaccine hesitancy by playing on people’s fears.

The research explores how to address that tension by studying these types of comments, which the company’s software algorithms tag “VH” (for vaccine hesitancy), as well as the nature of the communities that spread them, according to the documents. Its early findings suggest that a large amount of content that does not break the rules may be causing harm in certain communities, where it has an echo chamber effect.

The company’s data scientists divided U.S. users, groups, and pages into 638 population segments to explore which types of communities hold vaccine-hesitant beliefs. The document did not specify how Facebook defined a segment or grouped different communities, but it noted that a segment could contain at least 3 million people.

Some of the early findings are notable: Just 10 of the 638 population segments contained 50 percent of all vaccine hesitancy content on the platform. And in the population segment with the most vaccine hesitancy, just 111 users contributed half of all vaccine-hesitant content.

Facebook could use the findings to inform discussions of its policies for addressing problematic content or to direct more authoritative information to specific groups, but the company was still developing its approach, said spokeswoman Dani Lever.

The research effort also discovered early evidence of significant overlap between communities that are skeptical of vaccines and those affiliated with QAnon, a sprawling set of baseless claims that has radicalized its followers and been associated with violent crimes, according to the documents.

Facebook has partnered with more than 60 health experts around the globe and routinely studies a wide variety of content to inform its policies, Lever said of the study in an emailed statement.

“Public health experts have made it clear that tackling vaccine hesitancy is a top priority in the COVID response, which is why we’ve launched a global campaign that has already connected 2 billion people to reliable information from health experts and removed false claims about COVID and vaccines,” she said. “This ongoing work will help to inform our efforts.”

Nearly 30 percent of Americans – and half of all Republican men – say they do not intend to get one of the three federally approved vaccines, according to a March poll by PBS NewsHour, Marist, and NPR. An Associated Press/NORC study from late January found that the top reasons for concern about the vaccines were fear of side effects, distrust of vaccines, and a desire to wait and possibly get vaccinated later.

Covid-19-related misinformation has flooded the company’s platforms, including false claims that covid is less deadly than the flu, that it is somehow tied to a population-control plot by the philanthropist Bill Gates, and that vaccines are associated with the antichrist. The company’s content decisions, potentially anticompetitive behavior, and its use by extremist groups to foment violence have drawn the attention of regulators, leading to congressional hearings and a major antitrust lawsuit by the Federal Trade Commission.

Facebook, which owns the WhatsApp messaging service and Instagram, collects reams of data on its more than 3.3 billion users worldwide and has broad reach into those users’ devices. Public health experts believe that puts the company in a unique position to examine attitudes toward vaccines, testing, and other behaviors, and to push information to people.

But given its history of misusing people’s data, the company has a steep hill to climb in proving that its research efforts serve the public. It allowed a political research firm to exploit the profiles of tens of millions of Americans, resulting in the Cambridge Analytica privacy scandal, and at one time manipulated people’s emotions for an internal study.

Since last April, the social network has partnered publicly with Carnegie Mellon University researchers to conduct the COVID-19 Symptom Survey, a daily survey of Facebook users that asks people about their symptoms, testing, mental health, attitudes about masks and, more recently, their intent to get vaccinated. A related project has used smartphone data to track whether people are complying with social distancing orders and lockdowns. At least 16 million people have been surveyed, making it one of the largest public health data collection efforts of the pandemic, the researchers have said.

Facebook has also banned a wide range of baseless or misleading claims about vaccines and about covid — removing more than 12 million pieces of content — and connects people to authoritative information with labels on posts and with a banner atop the Facebook site, according to the company.

Facebook’s research into vaccine hesitancy will force the company to walk a fine line if it decides to police such content further, since much of it consists of expressions of concern and doubt rather than outright misinformation.

“Vaccine conversations are nuanced, so content can’t always be clearly divided into helpful and harmful,” wrote Kang-Xing Jin, Facebook’s head of health, in an op-ed last week in the San Francisco Chronicle. “It’s hard to draw the line on posts that contain people’s personal experiences with vaccines.”

But according to the documents, Facebook worries that content that doesn’t outright break its rules could still be problematic. “While research is very early, we’re concerned that harm from non-violating content may be substantial,” the documents said.

That risk of harm appears to fall disproportionately on a few communities, Facebook’s engineers found.

The results from Facebook’s early research track with findings from disinformation researchers, who have pointed out that a small minority of people, particularly influencers and those who post frequently or use underhanded tactics to spread their message, can have an outsized impact on the conversation and act as superspreaders of misleading information.

The researchers noted that while vaccine-hesitant comments might be countered when posted in communities with a diverse range of opinions, the concentration of such comments in small groups suggests those groups are becoming echo chambers where people simply reinforce one another’s preexisting ideas.

In segments affiliated with QAnon, the study found, skepticism of vaccination was widespread. “It’s possible QAnon is causally connected to vaccine hesitancy,” the document said. In QAnon communities, skepticism of vaccines was tied to a distrust of elites and institutions.

Last year, external researchers found that QAnon groups on Facebook were influential in fueling the spread of “Plandemic,” a misinformation-filled documentary, across social platforms.

The internal Facebook study found that comments that could contribute to vaccine hesitancy overlap with QAnon but also extend well beyond it, into many other types of communities. While QAnon groups appeared to focus on distrust of authority as a reason for doubting the vaccines, other groups expressed their doubts and worries in different ways. The finding suggests that public health experts will need to develop nuanced messages to address vaccine hesitancy across the population.

Source: WP