Far-right sources on Facebook get more engagement than any other partisan content, study finds

Researchers from New York University's Center for Cybersecurity found that far-right content generated the highest engagement (clicking, sharing and commenting) per follower across partisan categories.

The findings are part of a larger study that will be submitted to a peer-reviewed journal, but the timeliness of the report motivated the team to release some information early, according to lead researcher and PhD candidate Laura Edelson.

Edelson and her colleagues determined that engagement with both far-right and far-left news sources peaked around Election Day and again around Jan. 6, when Congress counted President-elect Joe Biden's electoral votes. Even at those peaks, though, engagement with far-right content outstripped engagement with other news sources.

Researchers categorized media sources from the far left to the far right using ratings devised by NewsGuard and Media Bias/Fact Check, independent data providers that survey the news ecosystem and rate the political leanings and quality of media outlets.

“It’s a finding I find concerning,” Edelson said in an interview with The Washington Post. “What that is saying is even at those crucial moments, content from sources that spread misinformation was more engaging than content from reliable sources.”

The conclusions of the study are troubling yet unsurprising.

The spread of false information has risen with the growth of social media platforms. The country is emerging from a presidential term marred by "alternative facts" while contending with social networks that, scholars say, shirk responsibility for what is shared on their platforms.

The study shows the extent of the misinformation problem and just how polarizing social media can be, according to Kurt Braddock, an assistant professor of public communication in the School of Communication at American University.

The results of the analysis echo how people seek information, he said: people generally do not want to spend much effort thinking things through, and they avoid engaging with ideas that run contrary to their beliefs.

"We tend to seek out information that affirms our ideas," he said, adding that social media algorithms, which track what a person likes and push more of it in front of them, can worsen those natural tendencies. "We can get into patterns and cycles where these beliefs become so ingrained."

Researchers looked at Facebook pages with more than 100 followers associated with more than 2,900 news and information sources rated by NewsGuard and Media Bias/Fact Check.

After collecting data on 8.6 million public posts made between Aug. 10 and Jan. 11, the team concluded that far-right sources designated as spreaders of misinformation averaged 426 interactions per thousand followers per week, compared with an average of 259 weekly interactions per thousand followers for "non-misinformation sources."
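As a rough illustration, the normalization the study describes (interactions per thousand followers per week) could be computed along the following lines. This is a minimal sketch: the record layout, field names and figures are hypothetical, not the researchers' actual data or pipeline.

```python
from dataclasses import dataclass

@dataclass
class PageWeek:
    """One page's engagement totals for one week (hypothetical record layout)."""
    page_name: str
    followers: int      # the page's follower count
    interactions: int   # total public interactions (shares, comments,
                        # reactions) on the page's posts that week

def interactions_per_thousand(week: PageWeek) -> float:
    """Normalize weekly interactions by audience size, per 1,000 followers."""
    return 1000 * week.interactions / week.followers

# Illustrative numbers only: a page with 50,000 followers whose posts drew
# 21,300 interactions in a week works out to 426 per thousand followers,
# matching the far-right weekly average the study reports.
example = PageWeek("example-page", followers=50_000, interactions=21_300)
print(f"{interactions_per_thousand(example):.0f} interactions per 1,000 followers")
# -> 426 interactions per 1,000 followers
```

Normalizing by follower count is what lets the study compare pages of very different sizes on one scale, rather than letting the largest pages dominate the averages.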

Sources identified as center or left typically saw lower engagement when they were rated unreliable, an effect the study calls a "misinformation penalty."

Joe Osborne, a Facebook representative, told The Post in a statement that the report looks mostly at how people engage with content, which shouldn’t be confused with how many people actually see it on Facebook.

“When you look at the content that gets the most reach across Facebook, it’s not at all as partisan as this study suggests,” he said.

Edelson said she and her team would welcome the chance to verify the company's claim, but Facebook has not made the relevant data available.

She and her colleagues wrote that Facebook's CrowdTangle tool gives researchers data on engagement, such as reactions, shares and comments, but not on how many people actually saw a piece of content or how long they spent reading it.
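To make that limitation concrete, the sketch below shows the shape of a CrowdTangle-style post record: engagement counts are present, but there is no field for impressions or reach. The field names are modeled on CrowdTangle's public documentation and should be read as assumptions, not a verified schema.

```python
# Shape of a CrowdTangle-style post record (field names assumed from
# CrowdTangle's public docs; the values are invented for illustration):
post = {
    "statistics": {
        "actual": {
            "likeCount": 1200,
            "shareCount": 340,
            "commentCount": 95,
            "angryCount": 60,
        }
    }
    # Notably absent: no "impressions" or "reach" field, so researchers can
    # measure engagement but not how many people actually saw the post.
}

# Total public interactions for this post:
engagement = sum(post["statistics"]["actual"].values())
print(engagement)  # 1695, with no denominator for how many users saw it
```

This is why the study can speak precisely about engagement per follower while Facebook's rebuttal invokes reach, a quantity only the company can observe.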

Separately, research suggests that thinking styles and politics are linked: a 2014 study found that liberals think more analytically than moderates or conservatives, and that briefly training people to think analytically caused them to develop more liberal opinions.

"It means you can't discern fact from fiction and that you make your decisions based on emotions rather than rationality," said Heidi Julien, a professor in the Department of Information Science at the University at Buffalo.

Edelson and her team said that more research is needed to study other social platforms and their algorithms. They also noted that higher relative engagement for far-right sites doesn’t imply that there is more far-right content on Facebook, or that Facebook users prefer far-right content.

To repair the damage caused by misinformation, the issue must be placed at the forefront of policy agendas and addressed in schools so the next generation develops greater news literacy, Julien said.

Braddock and other experts at the Polarization and Extremism Research and Innovation Lab are studying whether memes that rebut misinformation are effective, another step scholars need in determining how to curtail the spread of fabrications.

“We’re all vulnerable to [false beliefs] to some extent,” he said. “The idea of dissonance, that’s a human trait — it’s not a left or right trait.”

Source: The Washington Post