Facebook and Twitter must do more to fight anti-vaccine misinformation, a dozen state attorneys general demand

The attorneys general say the companies have not cracked down hard enough on prominent anti-vaccine accounts that repeatedly violate the companies’ terms of service. They also say that falsehoods about the safety of coronavirus vaccines from a small pool of individuals have reached more than 59 million followers on Facebook, YouTube, Instagram and Twitter, citing data from the Center for Countering Digital Hate, which studies online misinformation and disinformation.

They sent the letter the day before Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey, and Alphabet and Google CEO Sundar Pichai were expected to testify before the House Energy and Commerce Committee. The hearing is broadly focused on disinformation, and lawmakers and their staff have been in communication with leaders of Anti-Vax Watch, a coalition of people and organizations concerned about vaccine disinformation.

Facebook spokesperson Dani Lever said the company has worked with health organizations to update its policies and has removed 2 million pieces of covid-19 and vaccine misinformation from Facebook and Instagram since February.

“We’ve also labeled more than 167 million pieces of COVID-19 content rated false by our fact checking partners, and now are rolling out labels to any post that discusses vaccines,” she said in a statement. “But since research shows that the best way to combat vaccine hesitancy is to connect people to reliable information from health experts, we’ve connected over 2 billion people to resources from health authorities.”

Twitter spokesperson Trenton Kennedy said the company has removed more than 22,400 tweets for violating its policies against covid-19 misinformation since the beginning of the pandemic.

“Making certain that reliable, authoritative health information is easily accessible on Twitter has been a priority long before we were in the midst of a global pandemic,” he said in a statement.

Connecticut Attorney General William Tong, who led the letter, argues that lives depend on the companies’ ability to properly enforce their rules. He said online falsehoods are undermining public confidence in vaccination, and he raised concerns about anti-vaccine activists targeting Black Americans and other minority communities.

“Coronavirus vaccines only work if people actually get them. Pseudoscience coronavirus conspiracy theories peddled by a small number of uninformed anti-vaxxers have reached tens of millions of social media followers,” Tong said in a statement. “These posts are in flagrant violation of Facebook and Twitter policies. Facebook and Twitter must fully and immediately enforce their own policies, or risk prolonging this pandemic.”

The attorneys general of Delaware, Iowa, Massachusetts, Michigan, Minnesota, North Carolina, New York, Oregon, Pennsylvania, Rhode Island and Virginia also signed the letter.

Vaccine misinformation has plagued social media companies for years as they try to strike what they see as a balance between allowing free speech and removing harmful material from their sites. In 2019, several social media companies, including Facebook, took steps to slow the spread of vaccine misinformation. Facebook created policies to reject ads containing false claims and to stop recommending groups that spread misinformation.

In December, Facebook banned false and misleading statements about coronavirus vaccines.

But vaccine hesitancy, meaning delaying or refusing vaccination, can be a tricky area to police because much of the online content comes from people expressing genuine concern rather than purposefully spreading false information.

“Vaccine conversations are nuanced, so content can’t always be clearly divided into helpful and harmful,” wrote Kang-Xing Jin, Facebook’s head of health, in an op-ed in the San Francisco Chronicle this month. “It’s hard to draw the line on posts that contain people’s personal experiences with vaccines.”

Facebook is conducting its own large study of U.S. users’ vaccine doubts, The Washington Post reported this month. Early results show that a lot of content that does not break the company’s rules could still be causing harm in some communities, where the information bounces around in an echo chamber.

Twitter said in December it would remove some tweets that included false claims about adverse effects of vaccines, or that claimed vaccines are unnecessary because covid-19 is not serious. It expanded that policy earlier this month, saying it would label tweets containing misleading information about vaccines even if they don’t rise to the level of removal, and would lock people out of their accounts for escalating periods under a strike system.

Despite the social media companies’ efforts, vaccine misinformation is still readily found online.

Some evangelical Christians and Christian ministries have spread false information about vaccines online, baselessly claiming that the vaccines contain microchips or fetal tissue.

And vaccine misinformation has caught the attention of adherents of the QAnon conspiracy theory on Telegram and other smaller social media sites, where they flocked after the mainstream platforms cracked down on them.

Source: WP