“Disinformation” is a clever term to relegate the views we don’t like or care to hear to the realm of intentional untruth. Those on the other side of the political spectrum refer to it as “fake news”. But what is the motivation of using such terms and should it be tolerated in a world in desperate need of engagement, dialogue and bridge-building across ideological divides?
Trustbusters is a three-part series on how experts and authorities are failing to manage public trust in research and technology. The demise of trust (in authorities, science, institutions, expertise…) is leading to a breakdown in societal and institutional structures, political deadlock and a decline in scientific literacy. Part 1 looked at how a policy tool (the EU’s use of the precautionary principle) has destroyed trust in science. The second part looks at how our misunderstanding of the difference between misinformation and disinformation mirrors the mistrust/distrust distinction and how our reaction to suspected disinformation is further affecting public trust. The last part simply asks if the arrogance portrayed by certain scientists and science communicators is to blame for a public distrust of technological innovations.
Once the “disinformation” label is slapped upon a participant in a discussion, the next step in the process is to deplatform any person, website or social media account, effectively censoring the views and activities of organisations or individuals. Deplatforming is the modern-day excommunication and anyone who has experienced it knows it is not pleasant. This deplatforming is clearly intentional (and quite abominable) but it assumes the disinformation is always intentional. Is this really the case?
Outside of cases of prankster teenagers and Russian bot factories (which I’m told apparently exist to spam millions with random untruths), I believe most “disinformation” is unintentional (i.e., the views are strongly believed by the messenger to be factual). In other words, it is misinformation (information that is not factually correct or that fails to align with evidence) held by vulnerable individuals, rather than disinformation. When someone has been misinformed, we assume that it should be easy to correct them – just give them the facts and we can all get on our way.
This “autocorrection” may have worked in the 1970s, when our experts determined our information and any disagreements played out outside of earshot or would evolve over years of journal articles finally filtering down to the general public. There was trust and humility then towards those who had devoted their lives to studying a specific field. Now, though, everyone is running around with a PhD from Google University thinking they know everything, no longer feeling the need to read and even less inclined to listen.
The nature of social media tribes and echo-chambers is such that individuals can find and associate freely with others, share views, build on each other’s ideas, validate their fears and create believable arguments known as “the truth”. If you really want to believe something, it isn’t hard to find others who feel the same and lend credence to your concerns. My information must be right; ergo, you, with your different views, are trying to spread disinformation.
Why is the distinction between misinformation and disinformation important? If, as is too often the case, we label everything we disagree with as disinformation (as intentionally deceptive), then we have set hate as the baseline for interactions and censorship as the best option. But if we recognise non-factual statements as misinformation, then the reaction should be to try to understand how that person was misinformed and find a way to reach a better state of understanding. We open ourselves up to engage rather than enrage, create the conditions for dialogue rather than demagogue, foster inclusion rather than seclusion. Sadly, though, our tribal impulse (among all opposing tribes) is to go in full-on, guns blazing, against those disinformation fascists.
Who Paid off The Risk-Monger?
“It is hard to believe that others can be so stupid as to actually believe certain things. The only reason they would say something like that is if someone paid them.”
This is what my trolls often say about what I write. At one point it was so widely believed that I was a Monsanto shill that Google’s autocomplete would suggest “Monsanto” from the first letter of my last name. I was stunned by how much time and effort activist journalists had spent trying to create this link. Spoiler alert: they didn’t find one (conclusion: they lied).
Being an industry shill was the only reasonable explanation they could come up with for how The Risk-Monger’s views and perception of reality were so different … from our “truth”. I was deplatformed from my original BlogActiv page when I exposed how Le Monde’s Stéphane Foucart was working on behalf of IARC to attack EFSA’s position on glyphosate, and I have no doubt, with the amount of ink this activist has since spilt trying to discredit me as an industry shill, that he firmly believes in his own virtue and my vice.
I have referred to this as the Age of Stupid … but it does not refer to how I am right and others are stupid. When we surround ourselves with those who always agree with us and repeat the same ideas, when we block out anyone who may disagree with us or provide challenging views, how would I know, for example, that I am not the stupid one? Those in my tribe have robo-banned or coordinated a deplatforming assault on any sites or people who may think differently (a sort of docilian empowerment). My filters and my algorithms are protecting me from any contrarian infection. I am then free to believe whatever I want to … but is this active disinformation or systemic misinformation?
As the Risk-Monger, I firmly believe that science and innovation are improving humanity and the environment and can solve almost any problem we may face. Having worked 15 years for industry (until 2006), I have seen how researchers have developed solutions efficiently and how the capitalist system has spread these technologies and raised up economies and opportunities globally. Look, for example, at how fast the COVID-19 vaccine was developed, produced and distributed. I believe this and have worked with people who share this view fervently. I also feel certain that the European Commission’s blind reliance on a manipulative interpretation of the precautionary principle has handcuffed such innovations and suppressed our risk management skills. In expressing such views, I have also been accused of being a captain of disinformation, evidently shilling for an industry that must be paying me to allow polluters free rein to poison the planet. These accusers cannot understand how I can sleep at night.
But is it my intention to disinform or is it the efforts of those who don’t like my views to use this handle to discredit me? Truth is in the eye of the beholder and we are itching to shout down the liars. If not me, then whom? There must be someone out there with a clear disinformation strategy.
The Disinformation Playbook?
One group that thinks I am a disinforming industry shill is the Union of Concerned Scientists. They seem to play the disinformation card against anyone who does not donate to them, and they have even given industry’s lobbying activities a name: The Disinformation Playbook. In a recent viewpoint, five authors from this activist NGO tried, once again, to expose industry (Which ones? All of them!) as liars, thieves and cheats. The authors of this attack piece, led by Genna Reed, go back 60-70 years to the tobacco and asbestos industries to try to prove their thesis that industry seems to exist only to disinform. When industry lobbies, it is pure disinformation (but I wonder what the 27 paid lobbyists at the Union of Concerned Scientists’ DC branch office then do).
Their article makes many claims that can only be considered as intentional (libellous and unsubstantiated) disinformation. For example, they accuse J&J of intentionally selling asbestos-riddled talcum powder for more than six decades knowing that mothers were poisoning their babies. Why? Because that is what industry does. They accused Monsanto of harassing scientists working on the IARC glyphosate monograph. (Correction! I did that because these activist scientists were secretly paid by US tort law firms to produce a document that would disinform juries, ban safe, valuable agricultural tools and enrich lawyers at a cost to consumers.)
The Union of Pot, Kettle, Black.
What makes their fear-driven campaign so rife with hypocrisy is that the Union of Concerned Scientists has a track record of repugnant disinformation. In 2012, this activist NGO presented a strategy at a conference they had organised in La Jolla to persuade scientists to “produce” evidence for tort lawyers for the sole purpose of suing industry out of existence. In SlimeGate, I called this strategy the La Jolla Playbook, and it is perhaps the most unscientific, unethical strategy to mislead the public and bypass the democratic regulatory process ever concocted. No wonder it was so easy for this group to continually write about disinformation … it’s the only thing they seem to know how to do.
How can people who claim to be scientists actually believe this? How can a publication like the Journal of Public Health Research actually publish a piece like this and keep a straight face? (Oh yeah, they got paid to print it.) How can these authors have been so misinformed? The five authors from the Union of Concerned Scientists even claimed they had no conflict of interest (somehow forgetting who pays their rent).
There is only one reliable fact to come out of such a group of haters: none of them have ever worked for industry. To be so badly misinformed, I suspect they have never even had a meaningful conversation with anyone outside of their narrow circles.
Hey Genna, give me a call … let’s talk about ethical research practices.
How Can we be so Stupid?
Information within our present communications tools and communities excites us, triggers us, mobilises us. If we really want to believe something, a seemingly credible source will certainly appear. We each now have an emotional microphone nearby and are encouraged to use it. In such a 24/7 emotional fact funnel, we can’t help but irrationally favour certain points and exclude or discredit threatening information. And in that wanting to believe, bias is born: a child of raw emotion with a lineage of agreed-upon stories and benefits. In the noisy intersection of these beliefs, our algorithms sort us into vocal mobs that comfort us, promote our ideas and push us to accept ideas far beyond what we would have, had we been in a rational state of mind. That little voice in our heads that should have warned us to think twice before posting or give this some more consideration is quickly silenced in the free-flow of affirmation. Suddenly we find ourselves in Charlottesville with a Tiki torch and no hood.
Indeed no one is paying The Risk-Monger to bark at the moon but the moon is in his way and needs to move.
At the extreme, people who are afraid are able to convince themselves of many things that can defy reason and logic. Likewise for people who are angry. Or feeling victimised. Bring them together with others who share similar views or concerns and this is catnip for radicalising thoughts. To simply blame others and call it disinformation is not responsible. But there is a bigger issue at play here – a lack of trust.
Mistrust or Distrust
The title of this article plays with an idea – are we misinformed (accidentally receiving incorrect facts and information) or disinformed (someone intentionally is trying to deceive me)? If I am misinformed, I have placed my trust in the wrong sources (mistrust). If I am disinformed, then there is no trust in the sources of information (distrust). This is an interesting dichotomy of intention.
We need to trust in order to make any decision. If I want to stand up, take a step or cross the street, I need to trust my motor skills and the way others around me operate. If I get in a car, I need to trust the driver, the car, the road and others near my vehicle. If I buy food to consume, there is an entire (often complex) chain to trust (which is why labels have become integral to this trust relationship and why the “buy local” campaign is so attractive).
Fear is a signal that trust is weak. But there are two types of lack of trust here. Distrust is when I fear or do not believe the source of information, the product or the process. Many distrust our authorities or the political process, institutions, industry or experts – they believe there is active disinformation to mislead them on everything from the safety of the food chain, vaccines and medicines to public health and safety. The strong emotion here is fear of the unknown (a Heideggerian dread). Mistrust, on the other hand, arises when people are given the wrong information (misinformation) – when they have trusted the wrong sources or have had a bad experience with trusting authorities. This can come from a wide range of misplaced trust experiences (like using a bad map or believing an erroneous weather forecast). The guiding emotion here is FUD – fear, uncertainty, doubt.
Fear and deception often work together on risk-based decisions (which may explain why misinformation and disinformation are often confused). Take for example the decision to get a COVID-19 vaccine. I may distrust the experts, health authorities, doctors, pharmaceutical companies and communicators telling me to get the jab – I fear that the information they give me is motivated by money, control, politics… I consider their message as disinformation and stand against vaccines. I am an anti-vaxxer opting to put my personal freedom ahead of the objective of obtaining herd immunity.
I could also not know whom to trust, not being convinced of the personal benefits and not believing that there are no serious side-effects. I am then vaccine hesitant but I am not spreading disinformation – I need more reasons to trust.
It is no surprise that science communicators conflate mistrust and distrust, but it is regrettable that they are not observant enough to understand the consequences. The disinforming anti-vaxxer did not attract the vaccine hesitant; the obstructive vaccine vigilante pushed them into their distrusting arms.
Deniers or Sceptics?
Today’s Western media narrative is dominated by two issues: climate change and COVID-19 vaccines. In both cases a consensus has been determined, and any questioning or straying from the orthodoxy is deemed heresy worthy of excommunication. Those who don’t submit to the truths of the science are labelled as “deniers” – horrible creatures trying to disinform others, who must not be allowed to express their views.
But any consensus they hide behind is political, not scientific. A scientist, following Karl Popper, must by nature be a sceptic. We must engage with sceptics in a scientific process, and if there is misinformation, it is corrected in a dialogue process that strengthens both sides. The more a scientific theory can resist falsification, the more robust it becomes – the more it can be trusted. But if all misinformation is treated as a disinformation attempt, we are no longer capable of having such exchanges. The sceptic is treated as a denier. Truth then rests with the mob, who can intimidate, ban, fire or deplatform those who challenge their more popular consensus.
Take the case of vaccine hesitancy – a scepticism that requires engagement.
Pax Vaxiana: Turning the Vaccine Hesitant into Anti-Vaxxers
If I am vaccine hesitant, I do not know whom to believe. I don’t instinctively distrust like the anti-vaxxer, but I have heard stories of friends and family who have had terrible side-effects and there is, in the search for trust, a tough credibility standard that needs to be met. There is scepticism.
The authorities charged with communicating the message on getting the jab confuse the distrusters with the mistrusters and lump them both together as spreading disinformation on vaccines (apparently this is being treated as today’s moral equivalent of terrorism). The vaccine hesitant need better messages from sources they can trust; they should not be condemned as opportunistic liars, ostracised for their concerns or have their intellect insulted.
Overwhelmed by different messages, I may become vulnerable and seek information that may be more emotionally reassuring than factually correct. I need to trust and will listen to all sides: one side telling me it is my choice to get a vaccine and there are other alternatives, while the other is threatening me with consequences, forced mandates and telling me to shut up and take the jab. In such situations, I am highly likely to trust certain organisations who argue against forced vaccines (who are growing their tribe and becoming more confident). I may be misinformed, I may have misplaced my trust, but this decision is not disinformation – it is a result of a lack of trust.
Is this intentional? The only thing intentional is how our authorities have failed to engage and listen. Their message of vaccines being 100% safe goes against experience and has led to vaccine hesitancy and distrust. Their message of forced mandates on personal choices has been idiotic and capitulatory. Their poor communication and lack of engagement have then led to serious long-term institutional distrust. Seriously, how can ostracising people you have failed to communicate with properly actually rebuild public trust?
It doesn’t. Rather, it cements this uncertainty in echo-chambers that breed distrust of authorities, technology and science. Pushed and bullied enough, the vaccine hesitant will more likely harden their views. The anti-vaxxer is a creation of poor health and science communications. And then accusing them of spreading disinformation and trying to censor them? Pure trust-building gold.
Dialogue or Demagogue?
Anyone who plays the denier card or quickly claims “the” science as their political consensus will likely refer to any threat or opposing view as disinformation. An unwillingness to listen or engage is a guaranteed trustbuster.
We need to restore dialogue and openness for trust to return. So why then are Western authorities forcing vaccine mandates on their populations? Why are they rushing a painful energy transition without consultation? Why are they imposing precautionary measures without viable alternatives?
We need to be humble with sceptics rather than arrogantly enforcing a politically-driven consensus view. Authorities unwilling to engage or listen are rarely trusted. That is the subject for the third part of this series.