Facebook moves to scale down political content

“It’s important to note that we’re not removing political content from Facebook altogether,” Aastha Gupta, Facebook’s product management director, said in a statement. “Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person’s appetite for it at the top of their News Feed.”

Facebook has taken heat over the role its services have played in organizing far-right events since at least 2017, when the platform was used to promote a rally by white nationalists and neo-Nazis in Charlottesville at which a woman was killed. But last month, Chief Operating Officer Sheryl Sandberg tried to deflect blame for the Capitol riot, pointing to the role of smaller, right-leaning services such as Parler and Gab.

“I think these events were largely organized on platforms that don’t have our abilities to stop hate, don’t have our standards and don’t have our transparency,” Sandberg said in a January interview that was live-streamed by Reuters.

But researchers found that supporters of then-president Donald Trump heavily promoted the Capitol rally on the platform and on Facebook-owned Instagram, and used the services to organize bus trips to Washington. In the days leading up to the attack, more than 100,000 users posted hashtags affiliated with the movement prompted by Trump’s baseless claims that voter fraud cost him the presidential race, including #StopTheSteal and #FightForTrump.

The Tech Transparency Project, a watchdog research initiative, found that “numerous” Facebook users openly billed the Jan. 6 rally as a chance to “#OccupyCongress.” The group also found the platform had been showing ads for weapons accessories and body armor alongside election misinformation in “patriot” and militia groups after the riot. (In response, Facebook paused the ads until after the inauguration.)

According to Facebook’s analysis, political content currently makes up 6 percent of what users see. But in the company’s last earnings call, chief executive Mark Zuckerberg said that one of the most common pieces of feedback Facebook gets is that users want politics to take up less space in their feeds. That feedback is part of what prompted the tests to reduce political content.

“This is a continuation of work we’ve been doing for a while to turn down the temperature and discourage divisive conversations and communities,” Zuckerberg said.

Lauren Svensson, a Facebook spokesperson, said the countries included in the first wave of testing were among those with the highest complaints about seeing too much political content.

“For the purposes of this initial set of tests in Canada, Brazil, Indonesia, and the United States, we’ll be reducing the distribution of political content in News Feed for a small percentage of users by using a machine learning model that is trained to look for signals of political content and predict whether a post is related to politics,” Svensson said in an email to The Post. “We’ll be refining this model during the test period to better identify political content, and may or may not end up using this method longer-term.”
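Facebook has not published details of this system, but a minimal sketch of the general approach Svensson describes, scoring each post for political content and demoting, rather than removing, posts that score above a threshold for users in the test cohort, might look something like the Python below. The model choice (TF-IDF features with logistic regression), the threshold, the demotion factor, and all field and function names are illustrative assumptions, not Facebook's actual system.

```python
# Illustrative sketch only: a text classifier estimates whether a post is
# political, and likely-political posts are demoted (not removed) in the
# feed ranking for users in a test cohort. Model, threshold, demotion
# factor, and data fields are assumptions made for this example.
from dataclasses import dataclass

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


@dataclass
class Post:
    text: str
    base_score: float  # relevance score from the usual ranking stack


# Toy labeled data standing in for a far larger training set.
TRAIN_TEXTS = [
    "Rally at the capitol to contest the election results",
    "Vote for our candidate in the upcoming senate race",
    "My dog learned a new trick this weekend",
    "Best pancake recipe I have ever tried",
]
TRAIN_LABELS = [1, 1, 0, 0]  # 1 = political, 0 = not political

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(TRAIN_TEXTS, TRAIN_LABELS)

POLITICAL_THRESHOLD = 0.5  # assumed cutoff for "related to politics"
DEMOTION_FACTOR = 0.3      # assumed multiplier; content is demoted, not removed


def rank_feed(posts: list[Post], in_test_cohort: bool) -> list[Post]:
    """Re-rank a feed, demoting likely-political posts for test-cohort users."""
    def adjusted_score(post: Post) -> float:
        if not in_test_cohort:
            return post.base_score
        p_political = classifier.predict_proba([post.text])[0][1]
        if p_political >= POLITICAL_THRESHOLD:
            return post.base_score * DEMOTION_FACTOR
        return post.base_score

    return sorted(posts, key=adjusted_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("Protest planned over the disputed election", 0.9),
        Post("Photos from my hiking trip", 0.6),
    ]
    for post in rank_feed(feed, in_test_cohort=True):
        print(post.text)
```

In a production ranking system a signal like this would be one of many; the point of the sketch is only that classification drives down-ranking rather than removal, consistent with Gupta's statement that political content is not being removed from Facebook altogether.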

She also said the company is still working through its standards to determine what constitutes political content.

Information about covid-19 from major health organizations such as the U.S. Centers for Disease Control and Prevention, the World Health Organization, and national and regional health agencies in affected countries will be exempt from the tests, Facebook said. Content from official government agencies also will be exempt.
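In simplified form, that exemption could amount to an allowlist check applied before any demotion. The sketch below assumes exempt sources can be matched by some identifier; Facebook has not said how it identifies them, so the allowlist contents and matching approach are assumptions.

```python
# Sketch of the exemption Facebook describes: posts from authoritative health
# organizations and official government agencies bypass the political-content
# demotion. The identifiers and thresholds below are illustrative assumptions.
EXEMPT_SOURCES = {
    "us_cdc",  # U.S. Centers for Disease Control and Prevention
    "who",     # World Health Organization
    # ... national and regional health agencies, official government pages
}

POLITICAL_THRESHOLD = 0.5
DEMOTION_FACTOR = 0.3


def adjusted_score(source_id: str, base_score: float, p_political: float) -> float:
    """Demote likely-political posts unless they come from an exempt source."""
    if source_id in EXEMPT_SOURCES:
        return base_score
    if p_political >= POLITICAL_THRESHOLD:
        return base_score * DEMOTION_FACTOR
    return base_score


print(adjusted_score("us_cdc", 0.8, 0.95))   # 0.8  (exempt, no demotion)
print(adjusted_score("user123", 0.8, 0.95))  # 0.24 (demoted)
```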

Source: WP