I recently appeared in a video made by the American Association for the Advancement of Science (AAAS). The video, in which I talk about climate science communication, is part of a series on “Countering Science Misinformation.” While recording it, I realized that although misinformation is the most common type of scientific falsehood around us, disinformation is also common, and unlike misinformation, it is deliberate and can have far more serious consequences.
Misinformation is “false information that is spread, regardless of whether there is intent to mislead.” While some misinformation is spread intentionally, in many instances it is accidental, because the person or channel relaying it actually believes it to be true.
According to the University of Southern California, people may misunderstand what they are reading or watching, even when it comes from legitimate sources. People not only seek information according to their own biases (based on their values, beliefs, environment, and political inclination), but they also misunderstand what they read when they scan news on social media. USC researchers state that the spread of fake news and misinformation across social media communities, including the misunderstanding of real news, is a significant problem.
A 2019 paper found that, as people scan news to choose what to read without knowing the accuracy of the stories they are offered, they tend to choose those that align with their beliefs in an effort to reduce their uncertainty about a subject. The more choices they are offered, the stronger the polarization, and the slower the actual learning. The authors conclude that, to combat misinformation, it is better to reduce the amount of information people are exposed to at a time than to censor content. The underlying implication is that many platforms simply offer too much information, and this overabundance contributes to the spread of misinformation.
While looking for information that aligns with their beliefs, people also come across disinformation, which is “deliberately misleading or biased information; manipulated narrative or facts; propaganda.” Oxford Reference defines disinformation as “a form of propaganda involving the dissemination of false information with the deliberate intent to deceive or mislead.” Propaganda, and specifically inaccurate information supplied by governments, is thus part of the definition of disinformation. Its main difference from misinformation is that the person or group conveying it knows it to be untrue.
A lot of what we see today from special interest groups (such as oil companies) and many elected officials is disinformation, often in the form of organized campaigns. In the regulatory sphere, some companies and trade associations use tactics such as actively burying scientific evidence, manipulating facts, and playing up uncertainty in an effort to evade regulatory scrutiny of products or activities found to be harmful, simply because the facts do not serve their interests, values, or bottom line.
Another recent example comes from the media, which knowingly carried falsehoods about meat consumption simply because they appealed to its viewership’s values; disinformation can cut in more than one direction. In contrast, misinformation is often unknowingly spread by the general public, and even by some elected officials, because they believe their sources of “information.”
The Union of Concerned Scientists (UCS) has been bringing these disinformation campaigns to light and actively fighting against them. Our Disinformation Playbook is a helpful resource for learning to recognize the tactics used by special interest groups. We also work with other organizations, such as the Citizens’ Platform on Climate Change, a partnership with UNESCO whose goals include fighting disinformation globally; I also speak there about disinformation dangers, practices, and tactics.
The role of digital media outlets
The main channels through which fossil fuel companies, political special interests, and other bad actors purvey misinformation and disinformation are news outlets and social media. These platforms must do more to prevent the viral spread of false content on the internet. Further, according to the Brookings Institution, there should be initiatives to encourage news literacy: people need to learn how to evaluate digital news sources rather than assume everything on social media or digital news sites is true. Many people do not read a whole article but only the headlines, which can also be misleading. There are some useful resources to help people recognize misinformation online, such as this one.
The news outlets people seek out are generally the ones used by their peers, people with whom they share values and beliefs. This is a well-known concept in the social sciences called confirmation bias: the tendency to seek information that confirms one’s pre-existing ideas or beliefs. The bias is pervasive and comes in many forms. People look for news that plays to their biases, but scientists note that they can also misunderstand real, good-quality, unbiased information.
Consequences of misinformation and disinformation
Misinformation and disinformation are a major problem, especially in the communication of scientific results. Science underpins much of what keeps us safe and healthy in our everyday lives, and getting it wrong, or deliberately misstating it, is incredibly irresponsible, particularly for policymakers. Misleading data or commentary related to science can have serious negative impacts on people, the economy, society, and government.
The spread of mis- and disinformation related to COVID-19 led people to refuse preventative measures to protect themselves and, in some cases, to treat their symptoms with unsafe and even deadly methods. Brookings notes that disinformation is dangerous because of its ability to shape public opinion and electoral discourse based on falsehoods, as we have seen repeatedly in U.S. presidential elections and in the storming of the Capitol on January 6th. Disinformation in the electoral process is a critical concern both in the U.S. and abroad.
Government decisions in particular should always be informed by the best available data and analyses, not by special interests and related disinformation. The wellbeing of people should be the guiding principle. Unfortunately, we have seen a lot of disinformation and dismissal of science in decision making in the United States and other countries. Attacks on science were not uncommon during the last administration, and we at the Union of Concerned Scientists have actively engaged in bringing science to its rightful place in policymaking and social justice.
Disinformation and misinformation are an ethical and moral issue, and one of justice. By misleading people or simply withholding accurate information, industry actors and politicians deny them the most basic rights: the right to the truth and the right to live healthy lives.
Add to this the inequities of climate change impacts, in which disadvantaged and disenfranchised communities bear most of the burden, and the situation becomes deeply concerning. We all need to work actively to stop the spread of disinformation in our networks. Click here to learn more about how to stop the disinformation playbook.
Have a tip for CleanTechnica, want to advertise, or want to suggest a guest for our CleanTech Talk podcast? Contact us here.