Since the COVID-19 pandemic began, misinformation has permeated all aspects of public knowledge, from early rumours about the disease being generated in a lab, to false information on preventing transmission or miracle drug treatments, to inaccuracies about how and why governments make the decisions they do. It has spread through many channels: over social media, from family and friends, and sometimes even from voices of power.
While the pandemic demands huge amounts of new information to effectively understand how the virus is transmitted, how people are affected, and how to limit its spread, this flood of new information also creates a perfect environment for misinformation to thrive. The consequences of this can be dire. Misinformation can undermine public health recommendations, threaten physical and mental health and safety, create deep social and political divides, enhance stigmas, and reduce the ability of communities to effectively respond to COVID-19. Misinformation during a pandemic truly can cost lives.
As we continue to wage war on the COVID-19 pandemic, how do we also battle against the accompanying “infodemic”?
The draw of misinformation: Why are we so susceptible?
The trouble is that misinformation can be difficult to detect. While “fake news” cyber hoaxes may be easily recognizable, often misinformation is subtle, like information that is partially true or misrepresented, which can make it harder to discern fact from fiction.
This is further complicated by pre-existing biases in our brains. As humans, our confirmation bias makes us quicker to believe information that aligns with our pre-existing beliefs, or that comes from people we trust who share similar views. Trying to make informed choices in situations of high emotion, like the stress and anxiety of a pandemic, can make us more vulnerable to false information, especially when we're seeking to alleviate our anxiety, or "doomscrolling" through our social media feeds without activating our analytical thinking. Misinformation itself can often spark intense emotions, making it challenging to look critically at what we're seeing.
Even more complicated is that the science around COVID-19 is developing rapidly. Good evidence-informed decision-making requires constantly updating and adjusting our perspectives as new information becomes available. However, this can be hard to cope with and can lead to distrust in science or public officials, as previous recommendations become outdated.
Even with the best intentions, we are all susceptible to misinformation, especially in a rapidly developing pandemic.
Taking action on misinformation
In response to the infodemic, initiatives and programs designed to combat misinformation have sprung up around the world. Fact-checking websites have emerged to help the public navigate the COVID-19 pandemic and easily access robust information when they're confused about a potential claim. These include the Taiwan Fact Checking Centre, the WHO's Mythbusters site, Infogation (a collaborative effort compiling government advice), and the Poynter Institute's #CoronaVirusFacts Alliance.
Social media companies like Facebook and Twitter have also taken action by adding new features that detect potentially misleading information and flag it to the readers, with the aim of identifying misinformation before it spreads.
Providing public access to experts can also help close the gap between science and the public. For example, throughout the pandemic, the CBC has given the public opportunities to write in their COVID-19 questions to be answered by experts, sometimes live on air. As well, many scientists have been using creative digital tools like Instagram and TikTok to make robust science more accessible and shareable in a public forum (Science Sam is a great example of this!). These approaches bring science to wider audiences and inspire trust in experts and public health advice.
Your choices have an impact
Effective response to COVID-19 misinformation also requires individual action. Even with the best information available, navigating an infodemic is difficult, and we all have a role to play in combating the threats of misinformation in our own communities.
It’s time for us all to take clear steps to reduce the spread of misinformation in our own lives. This means searching for the original source, checking whether information is corroborated elsewhere, visiting fact-checking sites, or consulting with an expert before sharing information. As well, given the fundamental role of our emotions and analytical thinking in believing false information, it is worth pausing and considering: Am I reacting emotionally to this information? Am I thinking rationally? Taking these steps can help us limit our own role in spreading misinformation.
The conversations we have with our community can also play a fundamental role. While it can be tempting to lash out at someone sharing something untrue, approaching conversations with patience and tact is key. By challenging the information, rather than the person, we avoid talking down to them or making them feel stupid. Being patient and trusting that most people are actively seeking the truth can lead to more productive discourse. As well, rather than simply rebutting false information after the fact, experts recommend the "inoculation method", which helps prime people to recognize misinformation: warning the public that they may encounter false information on a particular topic and providing them with good information ahead of time.
While misinformation around COVID-19 isn’t going anywhere, we can all take action in de-cluttering the information landscape, and contributing to the battle against misinformation.
Evidence for Democracy is a non-partisan, non-profit organization mandated to promote the transparent use of evidence in government decision making. If you’re interested in learning more about how you can take action on combating misinformation, visit E4D’s training portal or sign on to the Truth Pledge, committing to take action to reduce the spread of misinformation in your own community.