The Staggering Scope of the Global Coronavirus Misinformation Epidemic

Scientists identified more than 2,000 false claims about Covid-19 in circulation


At the end of March, shortly after the coronavirus outbreak was declared a pandemic, the United Nations warned of another looming global threat. Humanity’s common enemy, said UN Secretary-General António Guterres, is the “growing surge of misinformation.” Rumors and conspiracy theories about the coronavirus have become so widespread that they’ve occasionally surfaced on the global stage. President Donald Trump, for example, infamously floated the dangerous idea that injecting oneself with disinfectants could cure Covid-19.

In a new paper published today in the American Journal of Tropical Medicine and Hygiene, an international team of researchers attempts to quantify how rampant misinformation has become during the pandemic. Poring over the platforms where misinformation surfaces or is reported, including Facebook, Twitter, online newspapers, and fact-checking websites, they identified 2,311 reports of misinformation that circulated between December 31, 2019, and April 5, 2020. Most of the misinformation came from India, the United States, China, Spain, Indonesia, and Brazil, all countries that have been hit hard by the coronavirus.

The researchers say misinformation comes in three flavors: rumors, like the false idea that drinking bleach could cure Covid-19; stigma, like the unjustified belief that people from China carry the coronavirus; and conspiracy theories, like the idea that the coronavirus is intentionally spread from 5G cellphone towers. Most of the claims had to do with the nature of Covid-19’s spread and death rates, control measures, cures, and the cause or origin of the disease.

Rumors were by far the most prevalent, making up 89% of the reports. We’ve addressed a few at the Coronavirus Blog, including the false ideas that drinking strong alcohol or so-called Miracle Mineral Solution, a dangerous solution of chlorine dioxide, can kill the coronavirus. Interestingly, the study authors also flagged region-specific rumors: One circulating in Saudi Arabia suggested camel urine mixed with lime as a cure; another in India suggested drinking tea with cow urine or dung. It bears repeating that there is currently no cure or vaccine for the coronavirus.

Misinformation related to stigma, which made up 3.5% of the reports, stoked discrimination against people from China. The president’s repeated use of the terms “Wuhan virus” or “China virus,” for example, falls into this category. Scientists have clearly established, however, that the virus affects all people, not just people from China. Other instances of stigma promoted discrimination against health care workers and people in quarantine, raising fears that they were more likely to spread Covid-19.

Conspiracy theories made up 7.8% of the reports. Many speculated on the origin of the virus, suggesting in some cases that it was deliberately made or released from a lab, or that it was engineered by international agencies as a bioweapon. (Scientists recently ruled out the idea that the virus was engineered, presenting strong evidence that it has a natural origin, most likely in bats.) Some stuck out to me as especially wild: that the Bill and Melinda Gates Foundation created it as a bioweapon to boost vaccine sales, that the coronavirus is a population-control scheme, or that there actually is a vaccine for Covid-19 but pharmaceutical companies are withholding it on purpose. There’s no factual basis for any of these ideas, and yet they circulate.

Part of the reason misinformation is so rampant right now, the authors write, is that information in general is so widely available that people are overwhelmed by the sheer volume and unable to sort out what is real and what is not. Such a glut of information, accurate or not, is called an “infodemic.” And a person’s inclination to share information, as a study recently co-authored by researchers from MIT suggested, can cloud their ability to assess its accuracy.

Curbing the spread of misinformation is hard. Social media companies like Twitter and Facebook are experimenting with different ways to label accurate information and remove misinformation, but they keep bumping up against free-speech concerns. Science journalists debunk misinformation about the coronavirus and replace it with correct information, but convincing people of what is true is not always as simple as providing scientific proof.

In an interview with CNN last week, Anthony Fauci, MD, director of the National Institute of Allergy and Infectious Diseases and a member of the White House’s coronavirus task force, said, “There is a degree of anti-science feeling in this country,” adding that “it’s almost related to authority and a mistrust in authority that spills over, because in some respects, scientists, because they’re trying to present data, may be looked at… as being an authoritative figure.”

Despite the immense challenges, stopping misinformation must remain a priority, because as the new study illustrates, it has real and dangerous consequences: There was an uptick in disinfectant misuse after the president made his comments. People have burned down 5G towers and threatened the engineers who work on them. Asian people, especially those who are ethnically Chinese, have experienced racist abuse and violence as a result of stigma. Misinformation about vaccine safety is fueling anti-vaccine sentiment, which threatens public health: If people refuse to get a future coronavirus vaccine, how can it protect the population?

While governments, public health organizations, and media platforms must do their part in stopping the spread of misinformation, everyday people also have a role to play. In the MIT study mentioned earlier, the researchers found that people were generally able to correctly assess the accuracy of a claim if they were asked to do so. “People are much more discerning when you ask them to judge the accuracy, compared to when you ask them whether they would share something or not,” said co-author and MIT professor David Rand, PhD. Getting into the habit of taking a beat to ask yourself whether what you’re reading and sharing is credible — and doing your own sleuthing to verify what’s true — can go a long way in battling the two pandemics that have the world in their grip.

Editor, Medium Coronavirus Blog. Senior editor at Future Human by OneZero. Previously: science at Inverse, genetics at NYU.
