Covid-19 is one of Wikipedia’s biggest challenges ever. Here’s how the site is handling it.

Moore generally seeks out topics he’s interested in, such as historic buildings or public art, and either expands an existing page or creates one from scratch. Although medicine isn’t his forte, when the mainstream media began reporting on the coronavirus in China, he thought, “This was going to be much, much bigger than an isolated medical outbreak.”

He, of course, was right. As the pandemic overtook the globe, it sparked one of Wikipedia’s largest challenges: to chronicle a massive news event in real time as information constantly shifted and misinformation constantly spread, putting the site’s tried-and-true process to the test.

As of the end of July, more than 67,000 editors had collaborated to create more than 5,000 Wikipedia articles in 175 languages about covid-19 and its various impacts, according to a Wikipedia spokeswoman. Some of these, including the disease’s main English-language article, are sensitive pages restricted to certain trusted users, a decision made by other Wikipedia volunteers.

Frequent editors, Moore said, generally fulfill two primary roles. The first is actually writing or editing a specific page. The second “is kind of like community organizing, is how I often think of it”: helping manage WikiProjects, which act as organizing spaces for topics that stretch across many pages, such as “medicine” and “disaster management.”

Seeing that one didn’t exist for covid-19 as it picked up steam in mid-March, Moore created one. A WikiProject, among other things, includes a page of reliable sources for editors to pull from. It also — like every page on Wikipedia — contains a “talk” page where editors can discuss how to approach certain articles, which ones are needed and what information isn’t up to their standards.

By the end of July, the main English-language covid-19 article had been edited 22,000 times by more than 4,000 editors. Among them is Netha Hussain, a 30-year-old doctor from Kozhikode in Kerala, India, who has a Ph.D. in clinical neuroscience and is a researcher at Sweden’s University of Gothenburg. She began editing Wikipedia a decade ago, when she was earning her medical degree. Covid-19 proved more challenging to chronicle than anything else in her 10 years editing the site, as information about the virus, even from reputable sources, constantly shifted. Unlike in the past, she said, “I have to work fast and act fast to ensure it remains reliable and updated.”

Editors will ask each other to create or expand articles on certain subjects. Hussain, for example, was asked to write a break-off page on covid-19 and its effects on pregnancy.

The impacts of the pandemic spread beyond medicine, however, touching on everything from “human rights” to “national politics” to “world economies,” Hussain said. Other editors rushed to fill those gaps.

Rosie Stephenson-Goodknight, 66, a visiting scholar at Northeastern University and veteran Wikipedia editor, has been working to expand Wikipedia’s coverage of important women from history — only 18 percent of Wikipedia’s biography entries are about women, according to The Lily. (Diversity, particularly along gender lines, is one of the areas where Wikipedia’s process notoriously breaks down.) Like so many other editors, she was captured by the gravitational pull of the pandemic.

“What we did is what we do about everything: We just jump in and do it. We just start writing,” she said.

As the main page grew supersaturated with information, editors created pages for various countries, then states. Some are hyperspecific, such as the impact on Walt Disney Co. or disc golf. Stephenson-Goodknight noticed that pages existed for the disease’s impact on the performing arts, sports, music, retail, tourism and oil prices — but not one on the upended fashion industry.

“No one had written that article, and we needed it,” she said. “Things like people wearing masks now and people aren’t going to the mall to buy clothes.”

Of course, all these pages are useless — in fact, harmful — if they’re not accurate. And most anyone who attended school in the first two decades of the millennium is familiar with a common refrain: “Wikipedia is not considered a credible source.” That would be a problem, given that the English-language covid-19 page had received more than 73 million page views as of July 30.

Jevin West, a professor in the Information School at the University of Washington, said not to worry: Wikipedia has handled the virus “overall, exceptionally well.”

“It’s not only what people go to and read,” West said. “It’s what feeds a lot of the major search engines, too. So it sort of has double impact.”

“As someone who studies misinformation and disinformation, it’s kind of a ray of hope in a sea of pollution,” West added. “It’s almost like people’s passion to get things right and to be these curators of human knowledge makes them even more careful.”

He also cited Wikipedia’s transparency. Certain discredited sources aren’t allowed, and the entire website’s edit history is readily available to the user. Finally, every fact is plainly sourced. “That level of transparency provides trust,” he said.

As a result, Wikipedia stands in stark contrast with social media sites like Facebook and Twitter, which are often slow to remove misinformation. Even when they do, it has usually already spread. Both companies banned the conspiracy-laden “Plandemic” video in early May — after it racked up millions of views.

“Twitter and Facebook are beholden to stick the user to the platform at all costs so that they can sell ads. And I think that creates these distorted objectives,” West said. “Sometimes they pull things and sometimes they don’t … while Wikipedia’s processes for slowing the spread of misinformation are much clearer.”

With so many Wikipedia editors (and bots) constantly monitoring the mountain of information, misinformation tends to be quickly weeded out. If a troll comes along or “if there are anonymous editors who are putting in vulgarities or clearly operating outside the rules of Wikipedia, we have no problem blocking those people,” Moore said. “We will ban you and your IP address if we think you’re not here for the right reasons.” Plus, the editors will quickly fix errors — often within seconds.

“We have this sense of responsibility to write these articles and ensure that what we’re writing, we’re getting right,” Stephenson-Goodknight said. “I feel like I’m just one small hand in the middle of this. I’m just one small person who has done one small bit.”

West remains a true believer, convinced the world could benefit from the philosophy held by the site’s many editors.

“Maybe more people can buy into the ethos of getting it right,” West said. “Maybe that’s one of the antidotes to misinformation.”

Source: The Washington Post