We would like to believe that if we just give people the right information, they will grasp the scope of the climate problem. That’s naive thinking.
Climate researchers generally agree that most global warming is man-made and that if we continue to emit greenhouse gases at today’s rate, the probability of flooding, heat waves, storm surges, food shortages and disease will increase. But ask most people, and you will find that they do not all agree with the researchers. According to a survey conducted before the summer by CICERO and the market research company Synovate, only six out of ten Norwegians believe that climate change is caused by humans, and international polls conducted by YouGov show that half the Norwegian population thinks the threat of climate change is exaggerated. Polls in other countries show similar results. Why is the general public more divided in its view of climate change than the researchers are?
It is widely thought that if people are just given the right information about climate change, their beliefs will come to match the general scientific consensus. But this is not necessarily the case. A group of researchers in the Cultural Cognition Project, headed by Yale professor Dan Kahan, asserts that our perceptions of how a good society is organized determine what we choose to believe is true or false.
In other words, our cultural values bias the way we assess a potential risk or new scientific data. As a result, some research dissemination can actually lead to greater dissension within the population because people interpret the same information differently.
Along with his colleagues, Kahan has developed what he calls “cultural cognition theory” in which he attempts to explain why the US population is divided on issues such as nuclear power, climate change, nanotechnology and gun control.
Individualism versus collectivism
Cultural cognition theory builds on the work of British anthropologist Mary Douglas, the founder of the cultural theory of risk. In 1982, Douglas and US political scientist Aaron Wildavsky published the book “Risk and Culture”, in which they propose that our world view (that is, what we believe constitutes a good society) determines how we perceive environmental risks.
Douglas and Wildavsky theorized that an individual’s world view could be defined along two dimensions: individualism versus collectivism and hierarchical versus egalitarian. People on the collectivist side believe that society as a whole is more important than the individual and that the group is responsible for ensuring that individuals have a good quality of life. On the other hand, individualists believe that each person is responsible for his or her own well-being and that the state should not interfere in people’s lives. Hierarchical individuals believe that access to resources and opportunities should be distributed according to a person’s gender, class, race and family background, while egalitarian individuals believe that these distinctions are irrelevant and that everyone should have equal opportunities.
Some societal risks fit a particular world view better than others, according to Douglas and Wildavsky. Individualists are disposed towards dismissing environmental risks, because acknowledging that industry harms the environment will lead to restrictions on industrial operations – an activity they view as positive. For collectivists, environmental risks have a silver lining because they can lead to new regulations that curb egoism, a defining feature of individualism and unregulated markets. At least this is the theory. But does it correspond to reality?
For over 20 years researchers have tried to prove these connections empirically, but it has been difficult to find a way to group people along these two axes. The empirical studies conducted by the Cultural Cognition Project show that Douglas and Wildavsky are mostly correct in their assumptions. But why are we culturally disposed to assessing the same risk so differently?
Dan Kahan and the other researchers in the project have identified four mechanisms that are anchored in established psychological processes. Kahan’s group has helped to show how these psychological processes are linked to the cultural values that create different perceptions of the same risk.
First of all, people want to protect their own group identity. If the people you identify with are egalitarian like you, they will probably support an increase in immigration. In this case it will be especially difficult for you to view immigration as a threat to society. Not only do your egalitarian values dispose you towards supporting a more open immigration policy, but they also make it particularly hard for you to take a position that contradicts the beliefs of the group you identify with.
Group identification gives us everything from material goods to higher status to a feeling of self-worth. As a result, if you take a position that contradicts the beliefs of the group you identify with, your self-image may be threatened, especially if the group means a lot to you.
Seek out support
As a general rule, people want to defend their own convictions, so they give more weight to the arguments that reinforce their existing beliefs, and they are sceptical towards the arguments that challenge these beliefs.
When people receive balanced information – for example when the media presents both sides of an issue – their existing beliefs are actually reinforced. So when a diverse group of people are given balanced information about a risk to society, they do not come closer to agreement. On the contrary, they disagree even more about the extent of the risk.
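The polarization effect described above can be sketched as a toy simulation. This is my own illustration, not a model from the article or the Cultural Cognition Project, and all names and weights in it are invented for the example: two agents receive identical balanced evidence, but each weights arguments that confirm its existing belief more heavily than arguments that challenge it, so the two drift further apart.

```python
# Toy model of biased assimilation (illustrative only, not from the article).
# Belief is a number: negative = sceptic, positive = believer.

def update(belief, evidence, w_confirm=1.0, w_challenge=0.3):
    """Shift belief toward evidence, but weight confirming evidence
    (same sign as the belief) more heavily than challenging evidence."""
    agrees = (evidence > 0) == (belief > 0)
    weight = w_confirm if agrees else w_challenge
    return belief + weight * evidence * 0.1

# Balanced information: equal numbers of pro (+1) and con (-1) arguments.
balanced_evidence = [+1, -1] * 10

sceptic, believer = -0.5, +0.5
for e in balanced_evidence:
    sceptic = update(sceptic, e)
    believer = update(believer, e)

# After identical balanced input, the gap between the agents has grown.
print(sceptic, believer)
```

With these (arbitrary) weights, each agent moves further toward its starting position after every pro/con pair, so the same balanced input widens the disagreement rather than narrowing it.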
To test this theory, the researchers gave a group of people information about nanotechnology and then asked them what risks the technology poses. In the control group, which received no information, perceptions of the risks were fairly uniform. This is not surprising, since 80 percent of Americans have heard little or nothing about nanotechnology. The group that read a short article presenting balanced information, however, was highly polarized. People seem to acquire knowledge selectively, in a way that reinforces their own world view and their general attitudes towards environmental and technological risks. Quite simply, we hear what we want to hear.
Who do you trust?
Very few of us have the ability to accurately assess the extent of risk. When people assess risk and the arguments related to it, they listen more to people with a high degree of credibility. And how do people judge credibility? The hypothesis used by the researchers behind cultural cognition theory states that people are more likely to believe those who have the same cultural values as themselves. Studies show that people are more influenced by experts whose world view corresponds with their own.
A heated debate is underway in the US about whether 12-year-old girls should be vaccinated against the human papillomaviruses (HPV) that cause cervical cancer in women. Some people believe that all girls should be vaccinated, while others believe that a large-scale vaccination programme will lead to more sexual activity and an earlier sexual debut.
The researchers in the project tested the attitudes of 1,500 Americans, who were divided into three groups. The first group received no information, the second group received information containing arguments from both sides, and the third group received information from people who made their own cultural values clear. When the project participants received information from people they perceived to have the same cultural values as themselves, they were more likely to believe the information, even though the message being conveyed contradicted their own values.
The fourth and final mechanism is based on the affirmation of a person’s own cultural identity, and in many ways it mirrors the first two mechanisms: defending one’s own group identity and convictions. When research findings about a societal risk are disseminated, it is possible to do so in a way that confirms rather than threatens a particular world view.
Researchers asked two groups to read different articles on climate change. Both articles discussed a study conducted by researchers from leading universities who asserted that the average global temperature is rising, that this is due to human activity and that unchecked global warming will be disastrous for the environment as well as the economy. However, the articles had different conclusions. In one article the experts concluded that emissions must be reduced, which is a measure that is threatening to individualists. In the other article the experts recommended the use of nuclear power. According to the cultural researchers, nuclear power confirms the cultural identity of individualists because it does not limit industrial activity.
Obviously, there is no logical connection between the two different solutions proposed and the question of whether global warming is occurring. Nonetheless, the researchers found that the individualists who read the nuclear power version had a more positive view of the data that support man-made climate change. In contrast, the individualists who read the article that recommended reducing emissions were more sceptical towards the data that support man-made climate change than the individualists in the control group who were not given any information.
Solutions do exist
These four mechanisms have implications for people who disseminate scientific information about risks to society. Bombarding people with facts does not raise awareness about climate change; instead, it may cause a backlash that compels certain groups to protect their identity and reject the research findings as unreliable and untrue.
According to the researchers in the Cultural Cognition Project, one way to bypass these mechanisms is to provide a balance of cultural values. If more people with different values convey the same message, the cultural signals will be ambiguous and people will not be able to uncritically accept the opinion of experts who they believe share their values. In the best case people will give more thought to the information they receive, and in the worst case they will revert to the other mechanisms that help them to form an opinion.
This article was originally published in Magasinet Klima, no. 4, 2010.