Proponents of the precautionary principle often claim that it is necessary to secure public trust in research and technology. Without measures in place to ensure certainty (eradicating all hazards) and public safety, they argue, public acceptance of innovative technologies will diminish and fear will proliferate. Without meeting the demands of the precautionary principle, there can be no trust in science and emerging technology.
Nothing could be further from the truth.
Trustbusters is a three-part series on how experts and authorities are failing to manage public trust in research and technology. The demise of trust (in authorities, science, institutions, expertise…) is leading to a breakdown in societal and institutional structures, political deadlock and a decline in scientific literacy. Part 1 looks at how a policy tool (the EU’s use of the precautionary principle) has destroyed trust in science. The second part looks at how our misunderstanding of the difference between misinformation and disinformation mirrors the mistrust/distrust distinction and how our reaction to suspected disinformation is further affecting public trust. The last part simply asks if the arrogance portrayed by certain scientists and science communicators is to blame for a public distrust of technological innovations.
Guilty until proven innocent
While there are many versions of the precautionary principle, the one used most widely (and by the European Commission in legislation like the REACH Regulation and the Sustainable Use of Pesticides Directive) is the European Environment Agency (EEA) interpretation commonly known as the reversal of the burden of proof. Rather than testing whether a substance or process may cause harm or be of concern, this version demands that everything be proven safe before being allowed onto the market. Of course, “safe” and “certain” are emotional concepts open to wide personal interpretation (and scientists would never use this language, opting instead for a continuous process of attaining “safer” and “more reliable”).
The EEA definition leaves researchers in a “guilty until proven innocent” situation where the demand for innocence is impossible to meet. This has spawned a cottage industry of “forever doubt” activist fear campaigns (endocrine disruption, GMOs, the insect apocalypse) in which campaigners use old (poor) data to dismiss better research and carry on with their “just not good enough” strategy. So long as the benefits of such technologies can be explained away, precaution remains the (lucrative) activist weapon of choice. Other longstanding campaigns (like acrylamide, dioxins or EMFs) did not meet the “forever doubt” criteria because the benefits of the technologies or processes were too high or the exposures too banal, highlighting the foolishness of the reversal-of-the-burden-of-proof approach.
The precaution game is rigged as there is no such thing as 100% safe. Scientists have been trained to develop risk management methods to ensure that products and processes are safer (a continuous process of developing ever better results while lowering exposure – risk – to as low as reasonably achievable – ALARA). Safe is an ideal (an emotional state) that we should work towards but will never reach. There are always eventualities or unforeseen possibilities, however remote. When the precautionary principle demands that we prove a technology or product is safe before it is allowed onto the market, the answer to any additional research data will always be “Sorry, but this is still not enough”.
Any application of law is based on certain values. Precaution as “guilty until proven innocent” could easily be applied to ban coffee, automobiles, solar panels and organically-grown produce. But the judge and jury are very selective (ie, political) in their use of this principle, basing their application on a rather arbitrary distinction of natural versus synthetic. Natural foods that are known endocrine disruptors are tolerated and even fêted for their health properties while a plastic or synthetic pesticide that cannot prove (with certainty) that it does not have trace endocrine disrupting properties must face an immediate removal from markets.
Precaution is a political construct: activist-led, at ease with its internal hypocrisies and not at all scientific. So why is the European Environment Agency claiming that its interpretation of the precautionary principle is the only way to ensure trust in emerging technologies?
The Denial Trap
One of the clever tricks of these forever doubt activists is the denial trap. So long as researchers are being led on the “prove to me it’s safe” wild-goose chase, they are not framing the narrative, not demonstrating the benefits and not promoting the values of their innovation. They are caught up in a denial trap, continuously trying to prove that their technology is safe rather than demonstrating what it can do or developing even safer iterations. And if you are spending all of your time trying to prove that something is not harmful (ie, safe), and you cannot, what are you actually saying that is positive about your innovation?
GM technology is a prime example of the denial trap – a hostile NGO hoodwink. When these innovative seed technologies emerged in the public eye in the 1990s, the promise and the benefits were enormous. But the innovators got caught up defending their innovations in a forever doubt vortex.
- Prove to me that your seeds won’t harm monarch butterflies.
After two years of tests, the results looked very good (but not certain).
- Prove to me that GMOs won’t cause cancer.
More tests, more evidence to the negative (but one deceptive little snake in France had a press conference with some rats with visible tumours).
- Prove to me that GMOs won’t be the cause of endocrine disruption in frogs!
First of all, was there endocrine disruption in frogs, and secondly, Huh??? Time to finance more studies.
And after five years of the denial trap, the skittish European Commission decided that we did not need this technology, declared an effective moratorium, and European research and innovation were set back a generation. With new plant breeding technologies, the anti-tech, anti-industry activists are pulling out the same playbook, and guess what? The scientific community has fallen right back into their trap. Trust was destroyed 25 years ago and it is not coming back any time soon.
Proving that a technology or substance is safe is a game scientists will never win. Safe enough is just never enough. The only way out of the denial trap is to focus on the benefits. At the beginning of the millennium, there was concern about the health risks of mobile phones (radiation exposures could cause brain tumours, leukaemia, other cancers… especially among young people). There were viral videos of three phones ringing at the same time being able to pop a popcorn kernel. We did not really need this new technology (then) and there were strong voices of concern about other EMF risks (EMF frequencies, microwaves, phone masts, 3G…). Rather than falling into the denial trap, the mobile phone industry continued to improve their technologies (make them safer) and continued to stress the benefits. The EMF activists lost the narrative and ran out of cash.
The Taint of Precaution
Precaution acts like a plague on any innovative technology. If activists paint a product or substance as uncertain or potentially unsafe to human health or the environment, the stench spreads via the media and the innovation is forever tainted. Once a substance is blacklisted, once the public has doubt forever etched on its mind, once scientists fall into the denial trap, an innovation or technology is as good as gone.
Perception of a risk is nine tenths of the law in the court of precaution, and facts matter little in such situations. If people think chemicals leaching out of plastics in food packaging will make them sterile, they will spend the extra money for glass, demand plastic-free alternatives and then, in predictable zealot fashion, demand a total ban so others have to pay more to justify their irrational fear. No studies will ever replicate the original purpose-built studies, but replication was never necessary in a precaution-based docilian world where the public has come to expect 100% safe (without actually understanding what this implies).
Tainting a substance or technology as potentially unsafe has created opportunities not only for the legions of activist zealots and NGO fundraisers but also for the competition. Building a market and surviving in a competitive environment is difficult work for many companies. It could be made easier if your competition is facing the forces of forever doubt and caught in the precautionary denial trap. If a substance is under suspicion, the supply chain will quickly retool with alternatives so that they will not get left behind once (when) the precautionary principle is applied. And once these downstream users have an alternative to the substance stuck with the stench of precaution, they may just play their card and promote the forces of forever doubt. Fait accompli.
Br-Br-Br-Bromine! Brominated flame retardants are a good case study in how the stench of precaution, when allowed to waft through the supply chain, can lead actors in competitive industries to promote forever doubt for their own advantage. Bromine-based flame retardants worked well to ensure plastics (a petroleum-based material) would not burn. This was a valuable safety feature added to plastics across a wide range of electronics, including computers (which heat up and, if not protected, burn spectacularly … as countless YouTube videos attest). An alternative to brominated flame retardants was aluminium, which was not flammable but very expensive and energy intensive. In the 1990s and early 2000s, a series of studies started to raise doubts about the safety of brominated flame retardants and the industry fell into the denial trap. More studies found the substance in the environment (it was persistent, which was its value as a flame retardant) and NGOs like Greenpeace started to run (very expensive) biomonitoring tests showing bromine accumulating in blood and tissue samples (at very small, non-hazardous levels). Fearing being blacklisted with a tainted substance, the supply chain started to look for alternative materials that would not burn. Greenpeace launched the famous Green my Apple campaign along with its annual green electronics guides condemning any OEMs that dared to use PVC or brominated flame retardants. Sure enough, in the mid-2000s, computer companies, led by Apple, started coming out with (much more expensive) aluminium computers. Bromine (and many plastics) are now becoming legacy substances, condemned not by science and facts but by the taint of precaution, brought about by some ruthless, stealth lobbyists from the aluminium industry who bought off some gullible activist scientists and NGOs.
We should not overlook that the original campaigns, regulatory pressure and doubt-driven research against bromine came from Norway and Washington State. What do these two areas have in common? They are home to the largest aluminium companies and smelters. The aluminium lobbyists tried the same trick when attempting to legislate out polyethylene-based automotive fuel tanks (another deception I observed first hand), but the car industry was too strong and not as stupid.
Uncertainty – The Handmaiden of Distrust
When there is doubt, distrust is not far behind. Precaution does not build trust but rather preys on the lack of trust populations may have in their governments, scientists and industries. For activists working for a cause (or an alternative technology), raising alarm bells and sowing distrust is an easy process: find a fear, raise doubt and communicate the hell out of it. Then inform a frightened public that the technology is not necessary or that alternatives exist (in the case of nuclear, Greenpeace’s alternative was two more decades of coal … sweet!). Our trust perceptions are emotional and, like any emotion, they do not pay much heed to facts or reason.
Precaution cannot succeed in situations where the public trusts the substance or process or appreciates the benefits. There are certain elements of trust (covered before) that can precaution-proof any substances or processes:
Familiarity. One of the key elements of trust is familiarity. If we have been using something for generations (like toasting bread), then any precautionary fear campaign (like the one against acrylamide) will be met with derision and laughter. If I am not familiar with a scary-sounding chemical name, and I am told it is not necessary, then any whiff of doubt will have policy-makers reaching for their precautionary pillbox.
Agency. Another element of trust is agency. If I have the ability to take control (if I am empowered with a decision), I trust the process more. This is why I feel safer driving a car than sitting in an airplane, even though the statistics prove that belief irrational. With many technologies today, the public feels helpless – that they are being exposed to substances in their environment without their consent. Even if the exposure is harmless, taking precaution is equivalent to taking control (and is an easy political win for skittish government officials).
Kinship. Trust is often based on a kinship – an identification with some population, region or cultural practice. Trust is relational. We trust those like us and the rise of social media communities (tribes, echo-chambers) has been like catnip to precaution-mongers. When enough people in my community scream “We don’t want this!” then a precautionary voice is born. There are far fewer outspoken scientists, farmers or energy experts so the voice of the mob, with their precautionary torches, will easily win out over the voice of facts and reason.
Precaution does not build trust. So when an activist like David Gee, the architect of the European Environment Agency’s formulation of the precautionary principle (and previously a director at Friends of the Earth), claimed that precaution is necessary to rebuild trust in science, we can easily see how he was, well, full of shit. Gee’s objective was just the opposite – to put scientific innovations in impossible situations, pitted against populations demanding safety and certainty while rejecting anything that was not familiar, within their control or like them. After the risk crises of the 1990s (GMOs, MMR, EDCs, acrylamide, dioxins…), the precautionary principle was the coffin that sealed away any hope for improved trust in research and technology.
And after two decades of this miserable policy tool, we are reaping the rewards of precaution – the public rejection of beneficial technologies like vaccines, biotech and medicines.
Precaution’s Monster – The Anti-Vaxxer
Those in Brussels who support the precautionary principle are the proud parents of the anti-vaxxer. They created the conditions for this monster to flourish and are now reaping their rewards.
Vaccine hesitancy has thrived on distrust of science, governments and industry. With the new mRNA vaccines developed to help battle the spread and reduce the impact of COVID-19 comes an unfamiliar technology with fear of uncertain consequences. Anti-vaxxers would rather take control (agency) of preventing potential harm from the virus (by whatever often useless means those in their communities propose) rather than trusting a new technology.
And as more people question the safety of vaccines, officials and science communicators find themselves, that’s right, falling into the denial trap. Vaccines don’t cause miscarriages; they don’t cause heart attacks; they don’t cause blood clotting … and so the game goes on. Those who don’t engage with the anti-vax forever doubt formulations then commit the sin of arrogance: calling out the vaccine hesitant as lunatics and a threat to society. Hardly a means to build trust in research and technology. The science communications solution – create a vaccine mandate to ostracise those who are afraid and uncertain. How do you spell “stupid”?
The unprecautionary EU precautionary principle formulation. A vaccine is precautionary by its very nature – under the triple-negative interpretation, vaccines are safer than the alternative (a virus or a disease), which has a higher likelihood of causing serious harm. How did this get turned on its head? The “safer with technology” approach is not the precautionary principle that the European Environment Agency has successfully campaigned for (which instead demands certainty that a substance like a COVID-19 vaccine is 100% safe). Vaccines can never be 100% safe, so we are using the worst interpretation of precaution at the worst possible time. When there was a question of a few potential cases of blood clotting from the AstraZeneca COVID-19 vaccine, the European Commissioner for the Economy, Paolo Gentiloni, stressed the need for certainty and praised the EU’s application of the precautionary principle to suspend use (until the European Medicines Agency came in and restored scientific logic). It was too late though and, predictably, public trust in the AstraZeneca jab cratered. The impossible conditions created by precaution as the reversal of the burden of proof have led to a public health crisis far greater than any potential unknown hazards. This has once again become a fruitful activist campaign tool to sow distrust in science and technology. Without the activist, politicised version of precaution, the voice of the anti-vaxxer would fall silent, with the benefits of the vaccine promising a safer outcome.
Someone like David Gee, who spent his entire life campaigning against chemicals and mobile phones, created a policy tool monster to help him fix the game to win in Brussels but that is now causing severe harm to a population that bought into his fear-driven nonsense. So David, 20 years on, within this carnage of distrust, fear and needless death, where is the trust that you promised that your precious little principle would deliver?
9 Comments
GMOs being not too bad does not have much relevance to COVID vaccines. There are plenty of reports of the vaccines causing harm. The only thing to do is to properly test them, not rush them into universal use on a relative-harm basis that relies on data yet to be produced.
Oh, and I don’t trust the authorities for another reason, unmentioned above. Because they lie.
Another excellent post! I really appreciated your take on “trust,” and that you included “agency.” In the US, the growth of anti-social behavior – everything from illegal U-turns to road rage to rioting – has grown seemingly in lockstep with the growth of government. People who don’t feel they have control of their lives – agency – too often will act irrationally. Trust is the ultimate act of agency – I give my trust to you; you can’t take it from me. If I have no agency, how can I trust?
Comment sent via mail from Giovanni Molteni Tagliabue
This interesting article contains an incorrect idea – which, however, does not invalidate its main content. It is maintained that distrust in science is widespread, even on the increase. Fortunately, this oft-heard concept is empirically refuted by troves of data.
According to a vast international survey, in 2016 (data collected in October–November) we were witnessing the largest-ever drop in trust in government (41%, but leaders 29%), business (52%, but CEOs 37%), media (37%–50%) and NGOs (53%); yet people were still confident in certain categories: “a person like you” (60%) was as credible as a technical expert (60%, from 67% in 2015) or an academic expert (60%, from 65% in 2015) (Edelman 2017, data summaries at pp. 11 and 14). Two years later, the same updated report shows a minor but significant rise in trust for certain categories, especially experts: 65% (+2% on the previous year) for a “company technical expert”, 63% (+2%) for “academic experts”, 61% (+7%) for “a person like yourself”; journalists and government officials (approx. 35%) are at the bottom of the rankings (Edelman 2019, data summary at p. 32). In 2019, trust in company technical experts (+3%) and academic experts (+3%) was still growing; trust in scientists among the international public stood at a remarkable 80% – the highest among all social categories (Edelman 2020, pp. 63 and 17).
More: “a new international survey finds scientists and their research are widely viewed in a positive light across global publics, and large majorities believe government investments in scientific research yield benefits for society” (Funk et al. 2020, p. 6; survey across 20 publics, October 2019 – March 2020: Europe, Russia, the Americas, the Asia-Pacific region). The State of Science Index Survey 2020 (3M 2020) reports an important change in public attitudes: comparing the results of polls from several countries (Brazil, Canada, China, Germany, India, Japan, Mexico, Poland, Singapore, South Africa, South Korea, Spain, the UK and the USA) carried out in 2019 with surveys conducted in roughly the same group of countries six months into the COVID pandemic (July–August 2020), “appreciation for science and trust in scientists has increased significantly”.
Similarly, many people in a number of countries with diverse socio-political systems, mixed cultural backgrounds and different levels of economic development, when asked about their “Confidence in Universities”, gave varied answers: in democracies, respondents who declared “A great deal” or “Quite a lot” range from 16.8%+63.1% (Australia) to 12.6%+47.9% (Chile); that is, positive opinions regularly constitute a robust majority of 3, even 4, out of 5. In the same international survey, people were asked their opinion about “Political system: Having experts make decisions”: in democracies, generally, respondents who declare the idea “Very good” or “Fairly good” are around 50% (World Values Survey 2010–2014).
In the USA, in 2017, 21% of those surveyed declared “a great deal” of trust in scientists, 55% a “fair amount”, 18% “not too much” and 4% “no confidence” (Funk 2017); a four-year comparison (2016–2019) shows growing confidence in scientists (from 76% to 86%) and medical scientists (84%–87%), even higher than in the military (79%–82%), a category traditionally held in great esteem by the American public – while elected officials (27%-25%-37%-35%) trudge behind (Funk et al. 2019).
In Sweden, confidence in universities and research is high and rising (84%), while many citizens believe that “science has too weak influence on politics (43 percent)” (Bergman, Bohlin 2018).
In the UK, teachers, professors and scientists are among the most trusted professions, with scores of 89–85% – while “government ministers” and “politicians generally” rank almost at the bottom with 22% and 19%; differences in opinion between Conservative and Labour supporters (1–4%) are scarcely relevant (Ipsos MORI 2018). Again in the UK, approx. 85% believe it “important” that “when making difficult decisions, politicians”: 1. “consult a wide range of professionals and experts”; 2. “demonstrate that the decision is based on objective evidence”. So pollsters can remark that “people have not ‘had enough of experts’; they still want them involved in decision making” (Institute for Government 2016, pp. 4 and 1). The words in inverted commas in the latter quotation refer to the infamous phrase uttered by Michael Gove MP, who campaigned for Leave during the Brexit referendum. Interestingly, the percentage of respondents who voted Leave and endorse the positive role of expertise is almost equivalent to that of those who voted Remain.
Therefore, contrary to a diffuse perception, this context “seems to suggest that the so-called populist backlash against science and expertise as a general claim is a figment of the imagination, itself in the land of opinion and post-truth.” (Grundmann 2018, p. 3)
3M  State of Science Index Survey 2020. http://www.3m.com/3M/en_US/state-of-science-index-survey
Bergman, Martin; Bohlin, Gustav  VA Barometer 2018/19 – VA report 2018:6. Stockholm: Vetenskap & Allmänhet, 2018. https://v-a.se/2019/02/va-barometer-2018-2019-in-english
Edelman  Trust Barometer 2017: Executive Summary. New York: Edelman, 2017. http://www.edelman.com/trust2017
Edelman  Trust Barometer 2019. New York: Edelman, 2019. http://www.edelman.com/research/2019-edelman-trust-barometer
Edelman  Trust Barometer 2020: Global Report. New York: Edelman, 2020. http://www.edelman.com/trust-barometer
Funk, Cary et al. (Pew Research Center)  Science and Scientists Held in High Esteem Across Global Publics. http://www.pewresearch.org/science/2020/09/29/science-and-scientists-held-in-high-esteem-across-global-publics
Grundmann, Reiner  The Rightful Place of Expertise. Social Epistemology, November 2018, 32(6):372-38
Institute for Government  Trust in Government Is Growing – but It Needs to Deliver. London: Institute for Government, 19 September 2016. http://www.instituteforgovernment.org.uk/publications/trust-government-growing—it-needs-deliver
Ipsos MORI  Veracity Index 2018. http://www.ipsos.com/sites/default/files/ct/news/documents/2018-11/veracity_index_2018_v1_161118_public.pdf
World Values Survey [2010-2014] Confidence: Universities. http://www.worldvaluessurvey.org/WVSOnline.jsp
Such a shame to use the term Anti-Vaxxer.
With respect your idea that vaccination is good or necessary is misplaced.
The vaccine manufacturers have no liability. You reckon that’s fair?
You trust Gates?
If drug companies willfully choose to put harmful products on the market when they can be sued, why would we trust any product where they have NO liability?
Never mind how they are now being exposed for crooked deals with government. That alone is enough to put anyone off a product where government protects them and passes responsibility for damages onto taxpayers.
Happy new year David.
Deanna McLeod: More Harm Than Good: An evidence-based analysis of the Pfizer 6 month trial.
This is an EYEOPENER. It seems that it is acceptable for a corporation to lie and get away with it. Do you see why the term Anti-Vaxxer is actually an insult to people’s intelligence? It is an ad hominem. https://worldcouncilforhealth.org/multimedia/deanna-mcleod-evidence-analysis-pfizer/