
There are a rapidly growing number of governance arrangements that seek to counter disinformation. Battles have ensued as opposing stakeholders vie to influence public discourse. The application of an overlapping four-stage analytical framework reveals major challenges in the fight against disinformation. While an evaluation of the effectiveness of these governance arrangements shows mixed results, best practices are emerging which offer a promising roadmap.
The problem of (climate) disinformation and social media
Disinformation is as old as civilization and has been a problem since the advent of written language. However, the rise of social media has exponentially accelerated its reach. This is a serious problem that has prompted widespread concern (Levush, 2019). These concerns include the erosion of trust in state institutions, the impairment of public policy (Burgos and Silva, 2007), and deleterious impacts on the institution of democracy.
Democracies around the world are threatened by the seemingly unstoppable onslaught of false information (Henley, 2020). Disinformation is contributing to the erosion of public trust in government and traditional media (West, 2017). It has been shown to be harmful to democratic societies (McGeehan, 2018), as it affects both public opinion and electoral discourse (West, 2017). It also disrupts governance (Weedon et al, 2017) which threatens national security and democratic systems of government (Levush, 2019). Disinformation is driving an increasing number of people to eschew facts altogether, adversely impacting the “overall intellectual well-being of a society” (Lewandowsky et al, 2017).
Climate disinformation has been around for decades (Krugman, 2018) and it has succeeded in slowing climate action (Cook et al, 2018). Such disinformation sows doubt about the credentials of climate scientists (Harvey et al, 2018) and the need for urgent action (Moser, 2010). Evidence shows that disinformation has succeeded in breeding confusion (Van der Linden et al, 2017), as well as suspicion, fear, worry, and anger (Karlova and Fisher, 2013). Belief in the veracity of climate change has been eroded (Begley, n.d.), resulting in reduced support for climate mitigation (Budak et al, 2011).
Disinformation is being amplified by social media. The use of bots, big data, trolling, and deep fakes is driving division on many fronts, including climate change. The seemingly unstoppable flow of online disinformation (Anderson et al, 2021) can destabilize both governments and society (Lonstein, 2020). Social media also contributes to and reinforces biases, including both confirmation bias and motivated reasoning, and this is contributing to ever more polarized social norms (Treen et al, 2020) that are compromising the functioning of government (Sepaha, 2007). The low cost and accessibility of social media make it easy to spread disinformation online (U.S. Department of Homeland Security, 2019), and this constitutes an existential threat to democratic institutions (Anderson et al, 2021).
Definitions and agendas
The words and definitions we use to counter climate disinformation have a profound impact on governance arrangements. To illustrate, consider the difference between four similar terms: misinformation, disinformation, malinformation, and fake news (see glossary of disinformation-related terminology). These four words are often used interchangeably; however, they are not synonymous (U.S. Department of Homeland Security, 2019). Each of these words changes the way we frame the issue as well as the policy actions we consider.
Scholars are increasingly calling attention to the issue of disinformation (Treen et al, 2020), and it is receiving attention in policy circles and finding its way onto the agendas of countries, companies, and organizations around the world. However, there is considerable debate about how to deal with it (West, 2017). Competing definitions and positions make crafting policy difficult. The lack of clarity hinders governments’ ability to accomplish “anything effective” (Funke and Flamini, 2021) and makes consensus-building difficult.
The positions of key actors
Governments realize that allowing the unregulated uptake of disinformation online is not a policy option (Marsden et al, 2020). The position of actors who support combating climate disinformation is premised on the scientific consensus that supports the need to rapidly draw down greenhouse gas (GHG) emissions. This position is opposed by those who engage in climate change denial or skepticism (see glossary of disinformation-related terminology). They commonly use social media to disseminate disinformation (Strudwicke and Grant, 2020) that is intended to confuse the public and stall support for climate action (Cook et al, 2018). The public position of purveyors of climate disinformation is that reducing GHG emissions will undermine economic growth (Belamy, 2020; Strudwicke and Grant, 2020).
The fossil fuel industry is the key actor supporting climate disinformation (UCS, n.d.). Their position is that climate action is a threat to their business activities (Kolmes, 2011) so they advance an agenda that seeks to preserve their social license to operate. In support of this agenda, they foment confusion and undermine the legitimacy of climate science (Oreskes and Conway, 2010).
These efforts are being countered by nonstate actors who have published reports that expose the fossil fuel industry’s malfeasance. This includes publications from organizations like DeSmog Blog (DeSmog, n.d.), the Union of Concerned Scientists (UCS, 2017), Climate Reality (Climate Reality, 2019), and Carbon Brief (Treen et al, 2020). These publications hold fossil fuel companies accountable by exposing their leadership in the global warming denial industry (Demelle, n.d.). This includes showing how fossil fuel company leaders knew that their products drive climate change, yet actively supported disinformation campaigns designed to deceive. They are also being countered by the global divestment movement.
Because climate change affects everyone, there is broad support from governments, NGOs, environmental groups, and members of civil society for evidence-based approaches to neutralizing disinforming content (Cook, 2019). Despite the wide range of approaches that combat disinformation (see Top 100 Techniques to Combat Disinformation), it may not be possible to reconcile the positions of the opposing parties given the diametrically opposed interests of the key stakeholders.
Current practices
Governments are acknowledging their responsibility to protect citizens and the integrity of their democratic societies from the malicious effects of disinformation (Marsden et al, 2020). This has prompted state actors to deploy disinformation countermeasures (Levush, 2019). The fossil fuel industry is the principal supporter of the climate change denial machine (see glossary). They seek to conceal their causal role in climate change and undermine efforts to address it (Meadowcroft, 2013). They use their tremendous wealth to engage PR firms and lobbyists, deploy pseudoscience (MacKay and Munro, 2012), and fund arm’s-length NGOs (Belamy, 2020) to skew public opinion by casting aspersions on climate science and exaggerating uncertainties (Cook et al, 2019). In the four years after the signing of the 2015 Paris Agreement, the world’s five largest oil and gas companies spent more than $1 billion on lobbying to prevent climate change regulations (Influence Map, n.d.). The result is that fossil fuel companies continue to drive climate change.
Technology companies continue to protect their business models and resist policy actions designed to combat disinformation. Despite the general lack of cooperation from technology companies (Lonstein, 2020), there are signs of movement. Some digital companies have signed agreements such as the European Commission’s code of practice against disinformation, which increases transparency and engages third-party fact-checkers to identify misleading content and remove fake accounts (European Commission, 2018; Mackintosh, 2019; Funke and Flamini, 2021). Both Twitter and Facebook have added information labels to questionable posts and they direct viewers to science-based information (Idowu et al, 2013). We have also seen collaborative agreements signed in Mexico and Sweden (Levush, 2019).
Evaluation and effectiveness
While the importance of stakeholder cooperation and collaboration to combat disinformation is widely recognized (Cook, 2019; West, 2017), government efforts to foster networking between stakeholders have not achieved the stated objectives. This may be due to the difficulty of securing voluntary agreements that threaten the financial interests of key actors.
Fact-checking has been identified as a potential solution to disinformation, but keeping up with the proliferation of ever-evolving social media channels is challenging (Babaker and Moy, 2016). Some attempts at fact-checking have backfired, further entrenching people in their biases (Lewandowsky et al., 2013). When Facebook began labeling “fake news”, it increased misinformation sharing (Levin, 2017; Zollo et al., 2017). There is also a risk that providing general warnings about fake news will increase cynicism, leading to a decrease in belief in the news (Pennycook and Rand, 2017; van Duyn and Collier, 2017). Facebook has not been able to diminish fake news (Levin, 2017) and fact-checking has been shown to be insufficient in efforts to combat disinformation (Shu et al., 2020).
There is growing support for government regulation to counter disinformation (Marsden et al, 2020), but there are also concerns about the dangers of “disproportional measures” (Levush, 2019). These concerns include providing a pretext for governments to harass journalists (UN High Commissioner for Human Rights, 2017). There is concern that overly restrictive regulation could set a dangerous precedent and encourage authoritarian regimes to continue and/or expand censorship (West, 2017). More generally, these regulatory efforts can pose a threat to the principle of free speech and the administration of the rule of law (Levush, 2019). The outpouring of false news also overwhelms fact-checkers (West, 2017), and fact-checking is costly, making this solution less than ideal (Marsden et al, 2020) and perhaps even unrealistic (Lonstein, 2020).
Finland’s successful education efforts
Combatting disinformation has been described as a “paramount education priority” (Anderson et al, 2021). Finnish ministries and agencies (Education and Culture, Justice, and the Competition and Consumer Agency) have made media literacy a key policy focus, and their educational efforts show that people can be taught to be resilient to disinformation (KennyBirch, 2019). Finland’s society-wide, multi-pronged, cross-sector approach to combating disinformation is premised on teaching critical thinking (Mackintosh, 2019) that enables people to identify disinformation. Finland’s cross-curricular anti-disinformation education starts in grade school and continues through to college.
The Finnish government has also passed legislation and allocated resources that fund advertising campaigns (KennyBirch, 2019). Nonstate Finnish organizations like Faktabaari support activities designed to improve media literacy and Finnish NGOs have partnered with the state to help with continuing education (Henley, 2020).
Finland tops the charts for media trust (Mackintosh, 2019; KennyBirch, 2019) and the country is rated as Europe’s most resistant nation to fake news (Henley, 2020). Finland’s approach is being followed by other countries, which further corroborates the effectiveness of its educational focus. Finland’s success has been attributed to cross-departmental engagement and a bottom-up approach, as well as the willingness of government to learn from best practices around the world (KennyBirch, 2019). Digital literacy has also been shown to be effective in combating climate disinformation (Belamy, 2020).
Conclusion
Despite the success of the Finnish approach, disinformation continues to spread around the world (Shu et al., 2020). While education is increasingly regarded as the most effective approach to combating disinformation (Henley, 2020), we also need policy, diplomacy, and interventions from civil society (Vogels et al, 2020). Social media companies also need to do more to stop the spread of disinformation (Mackintosh, 2019). Tools, techniques, and strategies that fight disinformation (see Top 100 Techniques to Combat Disinformation) are useful, however, no single approach addresses all concerns, and all have limitations (Treen et al, 2020).
There is still no clear consensus about what can and should be done once malicious content is detected (Fernandez and Alani, 2018). Assisted by human-moderated AI (Belamy, 2020), co-regulation involving technology companies and the state may be the best way forward (Marsden et al, 2020). In addition to technological fixes and media literacy (particularly those approaches that teach people how to think critically, how to fact check, and how to assess the veracity of sources), we need to look at ways of removing the financial incentives of platforms that post disinformation (West, 2017). Another approach that holds promise involves developing an inclusive national narrative, rooted in science, human rights, and the rule of law (Mackintosh, 2019).
Disinformation is complex and multifaceted. It is intimately connected to cultural values, individual cognition, societal trends, developing technology, and a changing media landscape (Belamy, 2020). Therefore, disinformation requires a comprehensive, robust, interdisciplinary approach. Such approaches must synthesize the conclusions of a broad spectrum of academic disciplines. Perhaps most importantly, the scope of the threat posed by disinformation requires holistic solutions involving all sectors of society (U.S. Department of Homeland Security, 2019).
For references, see “Resources to Combat Disinformation”. See also “Glossary of Disinformation Related Terminology”.
Related
- Disinformation: Deception that Delayed Climate Action
- Disinformation is the Most Global Sustainability Issue
- Fossil Fuel Pollution: Disinformation and Political Corruption
- Fueling Disinformation: How Big Oil Obstructs Climate Education
- Top 100 Approaches to Bust Disinformation
- Conservative Climate Disinformation and the False Gods of Capitalism
- America’s Most Popular Purveyor of Climate Disinformation is Dead
- The Sources of Disinformation and the Cognitive Biases that Fuel Climate Denial
- Polls Suggest the GOP’s Climate and Environment Disinformation Efforts are Faltering
- API’s Long History of Climate Denial and Disinformation
- Heartland Institute Targets Kids with Climate Disinformation
- Proof of Disinformation from Fossil Fuel Companies (Video)
- Resources to Combat Disinformation
- Glossary of Disinformation Related Terminology
Footnotes
- Four-stage analytical framework comprising: 1) problem definitions and formal agenda; 2) actors, groups, and their positions; 3) institutions, norms, and current practices; and 4) objectives and effectiveness of governance arrangements.
- Actors and institutions can often be used interchangeably, and the positions of these actors and institutions can both inform current practices (for a more detailed explanation, see the glossary of disinformation-related terminology).
- Social license to operate is defined as “The perceptions of local stakeholders that a project, a company, or an industry that operates in a given area or region is socially acceptable or legitimate”.
- The divestment movement focuses on investment funds like pensions or big institutions like universities and churches.
- Facebook is now debunking common myths about climate change, relying on experts from George Mason University, the Yale Program on Climate Change Communication, and the University of Cambridge to identify and debunk climate change myths (DW, n.d.).
- An informed citizenry may be the single best method of combating the mercurial manifestations of disinformation (Saurwein and Spencer-Smith, 2020). However, improving media literacy requires a long-term, system-wide strategy (KennyBirch, 2019).
- This includes social, political, information, computer, and psychological sciences (Cook, 2019).
- Government, businesses, NGOs, organizations, groups, and civil society.