It is evident that something is going terribly wrong in
communication between science and the public. The consequences are clear and
severe; yet scientists seem unsure how to address this. There are some great
science communicators out there, and I urge you to find them (reading Ben
Goldacre’s “Bad Science” is a great place to start). But my purpose here is to
demystify the scientific process, and to explain its necessity. Science
communicators go to great lengths to describe what makes “good science”, which
research is reliable, which findings we can trust. But what they are all missing
is the “why”. When a scientist says that we need to follow the
scientific method to ensure a fair test, Jo Public asks “why?” When we tell the
public that anecdotes are not evidence, they ask “why?”
To the public, our rules seem arbitrary, overly complicated
and set up to exclude. We make science the preserve of an educated few. We
exclude the public through our language, our rituals and our practices. We turn the
argument away from “good science/bad science” to “us and them”. In
psychological terms, we set up “in-group/out-group” thinking. This type of
thinking does not foster the sharing of knowledge. The groups become increasingly
closed, until anything presented by one group is dismissed by the other. We
need to go back to basics, start over and explain “why”.
The first step in this process is to explain to Jo Public
that s/he is not a rational person; that the brain takes shortcuts in
processing information. For the public to understand why we need the scientific method, we
need psychology in our science communication. Our brains are bombarded with
information every waking moment. I don’t mean this in the digital sense, but
literally. We receive immense amounts of visual input through our eyes; our
ears constantly pick up sounds in our environment; our sense of touch provides
information about temperature, air movement etc. We receive all of this
information simultaneously and constantly. This is a huge amount of data for
our brains to process, and yet we haven’t even begun to consider the social
information we receive from other people in our environment. The sheer amount
of information would require supercomputer processing capacity; however,
evolution found another way. Our brains come pre-programmed with shortcuts in
information processing. The brain filters incoming information so that it
only needs to fully process the important (or “salient”) information. These
shortcuts (or heuristics) are VERY effective, and since they are subconscious, we are unaware of their influence. We cannot
control them, nor can we prevent their influence on our thinking. All of our
knowledge of the world is filtered and biased by these heuristics.
Firstly, there is the “Confirmation Bias”. This heuristic is
the most dangerous for anyone who considers themselves a “rational person”. The
brain is wired to seek out evidence that SUPPORTS our beliefs and to IGNORE
information that refutes them. This is useful, because it means that we are not
constantly re-evaluating our belief systems. It is dangerous because you cannot
EVER assume that you have assessed the evidence and come to a reasoned
conclusion.
The second bias that we need to educate the public about is
the “Availability Bias”. Basically, the
easier something is to remember, the more likely it is to be considered in
decision-making. Let’s take the MMR example: sad and desperate parents were
paraded in front of us on the TV, in the papers, and therefore always in our
mind. The emotive content of the stories, the crying families and the impassioned parents convinced that the vaccine had changed their child, all make those stories more memorable. When considering whether to give their child the MMR, parents think about
the “evidence”, and what they remember is those sad families. What we remember is what we consider to be the evidence. So, they don’t vaccinate.
Furthermore, the human brain is wired to see cause and
effect in EVERYTHING! This is the default position for the mind: when one event
follows another, the brain decides that the first event CAUSED the second. This
is what happened to the parents in the MMR press. This assumption that two
events are related, when not all the other variables are known, is called an
“illusory correlation”. To highlight this, consider the “fact” that when ice
cream sales in New York increase, so does the murder rate. So, ice cream causes
murder? No, there is another variable that links them: heat. As the temperature
rises, more people buy ice cream and more murders are committed. When you
control for heat, the relationship between ice cream sales and murders disappears. In the MMR and autism
case, you need to control for the age of diagnosis: autism symptoms typically
become apparent at around the same age the MMR vaccine is given, so the two
events coincide without one causing the other.
Scientists use the "scientific method" to ensure that their conclusions are not influenced by these kinds of biases. We insist on controlled tests, so that we can limit the possibility of "illusory correlations". We use large samples and statistical significance testing to ensure that we are not blinded by the availability bias. And, as scientists, we set out to DISPROVE our theories. Yes, I said that right, we test to see if we can falsify our claims. This helps us to avoid the confirmation bias.
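To see what significance testing buys us, here is a minimal sketch of one common approach, a permutation test, run on made-up data (the numbers and the choice of test are my illustration, not from this article). Two groups are drawn from the same population, so any difference between them is noise; the test asks how often shuffled group labels produce a difference at least as large as the one observed.

```python
import random

random.seed(1)

# Hypothetical experiment: two groups of 200 drawn from the SAME population,
# so any difference in their means is pure chance.
control = [random.gauss(100, 15) for _ in range(200)]
treated = [random.gauss(100, 15) for _ in range(200)]

def mean(xs):
    return sum(xs) / len(xs)

observed = abs(mean(treated) - mean(control))

# Permutation test: if the group labels are meaningless, shuffling them
# should produce a difference this large quite often.
pooled = control + treated
extreme = 0
trials = 2000
for _ in range(trials):
    random.shuffle(pooled)
    if abs(mean(pooled[:200]) - mean(pooled[200:])) >= observed:
        extreme += 1

p_value = extreme / trials
print(f"observed difference in means: {observed:.2f}")
print(f"p-value: {p_value:.3f}")
```

A large p-value here means the observed difference is unremarkable under pure chance, which is the disciplined answer to our brain's instinct to declare a cause whenever two numbers differ.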
Our strict rules, the language we use and the practices we follow are all set up to make sure that we can be confident in our conclusions. They are not there to exclude people; they are not there to make things difficult or to act as a gatekeeper. They are important. As scientists, we need to get better at explaining that.
If the public can be educated about how the mind works, then
they can take the first step to understanding scientific discourse. They can
begin to see that the rules of science are not arbitrary. The key to better
science communication is to remember that you are communicating with humans.
Therefore, an understanding of human thinking, in other words psychology, is essential.