Paul D. Wilke

In Defense of Expertise


I finished my last post by writing about how to stay intellectually engaged and honest in a world full of white noise. This post continues that thought in a slightly different direction, turning to the gap between scientific and public opinion. How can we form more accurate opinions about the physical world around us when we're saturated with contradictory information? Can we even hope to do so?

Public opinion and mainstream science are uncomfortably out of sync on some issues. When it comes to climate change, for example, the Pew Research Center found that about 90% of active research scientists believe that the earth is warming and man is the primary cause, compared to just 50% of the general population. [1] That figure goes up to 97% when narrowed down to just climate scientists. [2]


Opinions on the safety of genetically modified (GM) foods show an even greater disparity. Around 91% of active research scientists believe GM foods are safe to eat, compared to only 39% of the population. [3] Similarly, 98% of active scientists accept that humans evolved (with or without God), compared to 66% of the population. Put another way, a staggering one in three Americans rejects evolution outright. [4]

My basic premise is that scientists opining in their areas of expertise are the ones best positioned to give us the most accurate evidence. Scientific opinions are therefore invaluable tools for informing our debates about public policy. Unfortunately, a process of cognitive self-deception can occur, resulting in beliefs widely divergent from consensus science. Those divergent beliefs then create knowledge illusions and produce sloppy thinking.

For the most part, however, people accept science without any fuss. Just stop for a minute and think about how much science impacts our daily lives. Vaccines have saved tens of millions of lives over the last half-century. Every time we go to the doctor or take a pill to lower blood pressure or cholesterol, we are trusting in science. We trust NASA to get the math right when they send probes to explore other planets and are no longer surprised when they execute such complicated missions.


Hurricane forecasters are now good enough to track storms accurately from start to finish. Few people, except perhaps Rush Limbaugh, will argue that there are ulterior motives in hurricane forecasting. An abundance of products stocks the shelves of our grocery stores, the result of incredible advances in agriculture over the last century. No big deal. Nothing controversial.

In the examples cited above, experts using science created technologies that benefit society. Science, combined with the free market, does a lot to make our lives better. We lead comfortable, well-fed, healthy lives today thanks to science and its practical applications. In today's world, it takes experts, not a village, to make such exquisitely complicated technologies look like magic. We villagers don't have time for it and only see the end products as consumers.

Our highly technical civilization means we are a society of specialists. Outside our areas of expertise, however, we are all non-specialists or laymen. An expert in one sub-discipline will be ignorant of the fields of experts in other disciplines. A molecular biologist will not have the knowledge to discuss particle physics, nor will an aerospace engineer normally have an expert grasp of climate science. As non-specialists opining on areas outside our expertise, we are susceptible to cognitive traps that tug us in the direction of our biases. In truth, even those who practice good, sound science in their own field can easily fool themselves into believing they know more about other specialties than they do.

Now, by science, I mean systematic efforts to discover truths about nature using the tools of reason, observation, and experiment. [5] Of course, science is not perfect - nothing people do ever is - but it is by far the best way we have of understanding the world. Dismissing what the scientific evidence shows us should not happen lightly when there are policy implications. And yet, policy implications and competing interests end up clouding some of these issues, making them controversial.

We give science the benefit of the doubt when it keeps the store shelves stocked and the planes from crashing. However, this peaceful coexistence runs into trouble when ideology enters the equation. Over the last few decades, some areas of science have driven people to hold irrational opinions. I call them irrational because they do not come from any reasoned, objective look at the evidence, but stem from ideological biases combined with knowledge illusions.

In other words, when there's an ideological or economic angle (the two are often intertwined) to a scientific issue - for example, a mistrust of "Big Government" or "Evil Corporations" - we often go with our gut first and then work backward to build a rational case to justify the conclusion we already reached intuitively at the start. This type of thinking is not about intelligence or the lack thereof, since highly intelligent people are just as susceptible, often more so, to overestimating their knowledge and building beliefs on flawed intuitions.

Many things shape intuitions: our social environment, our upbringing, the information we consume, and who we perceive as authority figures, just to name a few. We rely on our intuitions as starting points for most of our beliefs simply because there is not enough time to methodically deliberate each thing that comes our way. We are not biologically equipped to deal with all of the world's complexities. So, we develop mental shortcuts, or heuristics, to help bridge the gap.

While this usually works well enough for most routine day-to-day activities, we sometimes get in trouble by thinking we know more about the world than we do. Steven Sloman and Philip Fernbach wrote a book explaining just how little we know and yet how hard we work to make ourselves believe otherwise. "We ignore complexity by overestimating how much we know about how things work, by living life in the belief that we know more than we actually do. We tell ourselves our opinions are justified by our knowledge, and that our actions are grounded in justified beliefs even though they are not. We tolerate complexity by failing to recognize it. That's the illusion of understanding." [6]

Daniel Kahneman argues that our thinking operates at two cognitive levels: System 1 is where most of the action is, running on quick and easy snap decisions that help us function smoothly in the world. It handles the routine decision-making that we face every day. System 1 is all about quick impressions, feelings, emotions, and intuitions. This is where everyday thinking takes place for most people the majority of the time. System 1 is really in charge.

System 2 is the more deliberative side of our mental operating system. We are using System 2 when we are paying attention, reasoning, working things out, reading this article, and sifting through System 1's flood of inputs. When System 2 endorses System 1's inputs, those inputs become beliefs. But remember, those beliefs started out as subjective emotions or intuitions. Where there's a conflict, System 2 adjudicates and interprets. Most of the time, however, we get along in the world with our System 1 impressions on autopilot. System 2 is slow and deliberative, able to get involved only on a limited basis. [7]

All of us do this kind of heuristic thinking more than we would like to admit - it's how our brains are wired. For most of our day-to-day activities, however, System 1's intuitive approach gets us by just fine. The danger comes when those System 1 impressions and intuitions do not get the sort of rigorous vetting that only our System 2 mind can provide. As a result, you get people with strong opinions on topics in which they have no expertise. For the purposes of this discussion, those strong opinions concern the sciences. These views are usually less about the science itself than about the ideologically threatening (emotional) implications of its conclusions for someone's beliefs.

People end up triggered by certain scientific theories while indifferent to others. In this way, string theory is not a source of cultural controversy, though climate change is. The general public is not bitterly divided about the deeper mysteries of gravity, and yet many have strong opinions that genetically modified products are dangerous, despite compelling evidence to the contrary. To many, evolution is not a scientific fact confirmed and reconfirmed over the past 150 years, but "only a theory." Usually, science only gets embroiled in society's culture wars when there are real-world policy ramifications. Then things get interesting.

A firmly held belief that diverges from mainstream science often has underlying causes. For climate change, the issue can be less about the science itself than the fear of Big Government. The average person who doubts the science behind climate change usually knows very little about it but instead has a fear of government intervention in the economy.


This economic bias, itself questionable, then filters the way the issue is perceived, leading people to dismiss the scientific evidence outright or minimize its impact. Call it, if you will, a fundamental fear of watermelons - policies that are green (pro-environment) on the outside but red (socialist) on the inside - that motivates climate doubters. Sophisticated media outlets and think tanks work to obfuscate the issue, stoking fears and pressing ideological buttons to elicit the desired responses.

Let me offer a personal example. As someone who reads quite a few left-leaning news sources, I'm regularly exposed to well-written anti-GM articles and slick food documentaries that appeal to my pro-environment and personal fitness biases. Not surprisingly, for a long time, my beliefs on this issue closely mirrored the media I consumed, which was against GMs. Did I ever actually stop to take a look at the strongest scientific evidence that challenged my view? No, of course not.


Almost no one does that, and I was no different. I was already emotionally and ideologically set against GMs and was not moved by the dim awareness that there was strong scientific evidence contrary to my opinion. Everything I didn't agree with was merely the biased media colluding with profit-driven Evil Corporations like Monsanto to increase their bottom lines, all coming at the expense of the environment. Meanwhile, Mother Earth wept. Of course, I had no compelling evidence to back up my view, just a feeling, an intuition, that what I believed was true.

In other words, I started with my emotionally driven belief, formed some intuitions (System 1), and then worked backward to make it all appear reasonable (System 2). I was operating under an illusion of understanding, one fostered by the biased media sources I used to shape my beliefs. I confirmed my bias by reading only literature that supported my beliefs, then studiously avoided everything else that might threaten them. To defend this bias, I had a clever grab bag of talking points I could deploy when challenged to make myself sound more knowledgeable than I was.

In retrospect, however, I was engaging in sloppy thinking. The issue had been framed for me beforehand in a way that hit the right buttons in my mind: GMs = Frankenfoods = unnatural = bad; Organic = wholesome = natural = good - now repeat a thousand times and the truth will set you free! Around that belief, I constructed a sort of identity sustained by a community of like-minded individuals, mostly found online.


I found myself stuck in an echo chamber, one that constantly reinforced my cherry-picked reasoning. After all, if everyone agrees with me, it must be true. I'll just ignore or dismiss everyone else. Does that sound familiar? I see aspects of my own sloppy thinking about GMs in those who insist the world is 6,000 years old, or who think climate change is just some liberal conspiracy to implement worldwide socialism.

Only after sitting down and studying the merits of the arguments on both sides - which was difficult and time-consuming, by the way - did I gradually shift my opinion to become more in line with mainstream science on GMs. The unfortunate fact is that changing an established opinion is not easy, especially if you have parts of your identity wrapped around it.


First, you have to acknowledge that as a non-expert you may be dead wrong. In fact, you are probably dead wrong. There may be mountains of compelling evidence you have not considered. Even then, you have to be motivated enough to do something about it. Finally, if you do have that motivation, you have to roll up your sleeves and objectively look at the strongest evidence. This can be an uncomfortable and humbling process, and most do not think it worth the effort. After all, the padded cell of our echo chamber is the most comfortable place to remain.

The main takeaway is to get into the habit of scrutinizing your opinions. Doing this will make you better equipped intellectually to detect all the bullshit that's out there trying to manipulate your beliefs to win your votes and dollars. Fear sells. Identity politics sells. And the ignorant are buying. We can't know everything about everything; we can't be experts in all fields. Don't be afraid to cautiously trust the experts in those areas where they have the training and education to know what they are talking about.

There's nothing wrong with deferring to expertise, though I can sense many readers recoiling at this idea as a submission to outside authority, and therefore somehow hostile to free thought. Don't confuse deference to expertise with submission to authority. Authority often does not come with any expertise, as any reasonably astute follower of our current political situation will immediately recognize. Experts are subject matter experts because they have spent a great deal of time and effort mastering their chosen fields.

Respect that distinction, even if you are still disposed to disagree. Some people simply know a lot more about specific topics than we ever will. To dismiss the experts out of hand because they are telling you something you don't want to hear smacks of the intellectual arrogance so often attributed to scientists and academics. If you are honest with yourself, you should hear alarms going off in your mind every time you strongly disagree with an expert in any field. That disagreement is fine, but pull the string a bit and see where it goes. Read something. Study some more. Or, just admit that you don't have the time and interest to do so and leave it at that. Then, humbly defer to the experts.

1. http://www.pewinternet.org/2016/10/04/the-politics-of-climate/

2. https://climate.nasa.gov/scientific-consensus/

3. http://www.pewinternet.org/2015/07/23/an-elaboration-of-aaas-scientists-views/

4. http://www.pewresearch.org/fact-tank/2017/02/10/darwin-day/

5. Jerry A. Coyne. Faith versus Fact: Why Science and Religion Are Incompatible. Viking, 2015, p. 39.

6. Steven A. Sloman and Philip Fernbach. "What We Know." The Knowledge Illusion: Why We Never Think Alone. Riverhead Books, 2017, p. 35.

7. Daniel Kahneman. "Part I: Two Systems." Thinking, Fast and Slow. Farrar, Straus and Giroux, 2015, pp. 48-68.

#indefenseofexpertise #experts
