Will you change your mind?
Buddhini Samarasinghe wrote about http://edge.org‘s annual question:
What Have You Changed Your Mind About? Why?
https://plus.google.com/+BuddhiniSamarasinghe/posts/JKSBovEfBxF
I thought about some of the hot-button topics and how you often can’t change people’s minds, even with overwhelming evidence. Three things work against changing someone’s mind.
✿Confirmation Bias – the tendency to favor information that supports your existing view.
✿Cognitive Dissonance – the discomfort of holding beliefs that contradict your actions, e.g., continuing to smoke while agreeing it is unhealthy.
✿Motivated Reasoning – accepting information that supports what you already believe while giving extra scrutiny to information that contradicts it.
Here’s a good blog post summarizing these three concepts.
Psychology’s Treacherous Trio: Confirmation Bias, Cognitive Dissonance, and Motivated Reasoning
By Sam McNerney 2011
There is a biological component too. Chris Mooney writes that emotion kicks in before you have a chance to reason. Evolution has led animals to react quickly for survival. It makes sense that when we react quickly, we fall back on what we already believe.
The Science of Why We Don’t Believe Science
How our brains fool us on climate, creationism, and the vaccine-autism link.
By Chris Mooney 2011 h/t Marjolein Caniels
The Mooney piece has many examples of the treacherous trio. I would suggest reading the McNerney blog first. This isn’t my area of expertise so I’m hoping Zuleyka Zevallos or Chris Robinson can chime in.
Image via Reddit
#ScienceSunday #Anti_anti_intellectualism
January 12, 2014
Should it be edge.org (typo)?
Very interesting post. Unfortunately, even if you “hit people with science” there are those whose opinions are made up, especially on hot button topics of evolution and GMO. If you show them data, they cry conspiracy theory and data manipulation.
January 12, 2014
Good post Chad Haney. And coming hot on the heels of my argument with someone last night who believed that adultery is illegal in NY.
January 12, 2014
Thanks for catching the typo Rajini Rao. On a climate change post one denier said all he wants is proof that high CO2 causes the earth to warm. When someone pointed him to studies using CO2 in the lab, he said that doesn’t prove it’s true for the planet. Some people say they will accept scientific facts but don’t really know what that means. You can’t do a controlled experiment on the whole planet; you have to accept surrogates in research, just as you have to devise surrogates when certain in vivo experiments would be unethical. Of course I’m preaching to the choir.
January 12, 2014
I can understand what he is saying just by looking at his mouth….
January 12, 2014
Great post! It’s hard to admit to changing your mind, so this was a very thought provoking question for me. There are lots of reasons for sticking to a pre-formed opinion or idea, and science cannot advance when people do that in spite of the evidence to the contrary.
January 12, 2014
But can’t dissenting opinions provoke advancement in science Buddhini Samarasinghe?
January 12, 2014
Tiffany Henry it can, but dissent must come from evidence, not ideology.
January 12, 2014
They can Tiffany Henry, as long as they accept the process of working out alternative ideas, i.e., the scientific method.
January 12, 2014
I find that I rarely change my mind radically (can’t say I remember having done so), but that I rather hone the opinion, making it more blurred, less certain. Also I may keep some opinions, but be more pragmatic about asserting them. Well, I think I probably believed in God as a kid, but by the time I left elementary school, information and education had left no room for believing in deities. I like to think that I can be convinced by the proper arguments, but I rarely find that anyone can provide them 😮
January 12, 2014
You are welcome Mz Maau
January 12, 2014
For some reason, shares were disabled. That’s fixed now.
January 12, 2014
I was wondering about that too Chad Haney lol
January 12, 2014
Excellent post Chad Haney ~
January 12, 2014
Nop ~.~
January 12, 2014
Thanks Mara Rose
January 12, 2014
It can also take time to replace a deeply held belief, I think, because of all the old connections to other beliefs that must be broken and reformed with a new belief.
I’ve become absolutely convinced that “systems thinking” is critical to understanding pretty much everything these days. I realized systems were important during my doctorate (in the early 90s). It’s taken till only a few years ago to get to the point where I basically “see in systems.”
This is why education is so important. We need the skills to be open to this kind of… Bayesianism? in our beliefs. But that’s really what science and so-called evidentialism are all about.
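The Bayesian belief updating mentioned above can be made concrete with a toy calculation. This is purely illustrative; the prior and likelihood numbers are made up for the example, not drawn from any study:

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Apply Bayes' rule: return posterior P(claim | evidence)."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.2  # start fairly skeptical of some claim
# Suppose each new piece of evidence is 3x more likely
# to be observed if the claim is true than if it is false.
for _ in range(4):
    belief = update(belief, 0.6, 0.2)

print(round(belief, 3))  # → 0.953
```

The point of the sketch: no single observation forces a radical reversal, but an open reasoner who weighs each piece of evidence honestly drifts from 20% to 95% confidence; confirmation bias, in this picture, is refusing to let the evidence move the prior at all.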
January 12, 2014
I agree Filippo Salustri it’s a process that requires educating people. That’s why I take science outreach so seriously.
January 12, 2014
have you seen this article from yesterday?
http://www.theguardian.com/science/2014/jan/12/what-scientific-idea-is-ready-for-retirement-edge-org
the points made by Alba and Dawkins seem to follow the same line of thinking, that the universe rarely has any black and whites, its all grey areas
January 12, 2014
in meeting someone because i lied to that person
January 12, 2014
Wil Wheaton has an interesting post at https://plus.google.com/u/0/+WilWheaton/posts/Nh8hmGxN9BZ about the scientific consensus on climate change.
It’s pertinent here for the comments and how few minds are being changed there, and how most of them are missing the crucial difference between whether climate change is true or not, and whether a modern scientific consensus can be used as the basis for determining fact.
January 13, 2014
Short but sweet: “you can’t win an argument with an ignorant person”…..
January 13, 2014
Thanks Wesley Yeoh, I marked that to read later. Buddhini Samarasinghe had mentioned the “idea is ready for retirement” question on edge.org.
Filippo Salustri that’s definitely applicable here.
January 13, 2014
I was just going to bring up that thread Filippo Salustri, most apposite indeed.
Great post Chad Haney
January 13, 2014
Thanks Sonny Williamson. I ended up re-sharing Wil Wheaton’s post, pointing it back to this one as they are indeed nicely tied together.
January 13, 2014
Your identity is built from your ideas. If I attack your ideas by trying to convince you that they are wrong, I am attacking you, so you will start defending yourself by defending your ideas. Instead I should help you relate reality to your identity. I should point out what you knew but were not aware of.
January 13, 2014
Joost Ringoot, psychology isn’t my area but I was drawn to this topic and felt like posting it for the reason you just mentioned. Scientists often think in terms of hard facts. If a scale reads 154 g then you don’t argue that the sample has more or less than 154 g. When it comes to ideas and beliefs, we have to remember that we can’t just show people the facts for the reason you mentioned and for the reasons that are in the article.
January 13, 2014
Wesley Yeoh the full responses to this year’s Edge question will be available after the press embargo lifts on the 14th. I’m pleased to say that an essay I submitted was also accepted so look out for that one 🙂
January 13, 2014
me neither
January 13, 2014
what the hell is this
January 13, 2014
Buddhini Samarasinghe I didn’t want to spill the beans so I didn’t mention your essay.
Abdallah Altareb Samantha Gutierrez and rantej boparai what questions do you have about the post? What part doesn’t make sense?
January 13, 2014
Chad Haney I know, thank you 🙂 I’ll write a post for G+ tomorrow when the essays go live. I think you will like it! I’ll also pose the question for others to answer, but I am worried that crazy conspiracy theorists will show up with answers like “Evolution” and “Man-made Global Warming” to “What Scientific Idea is Ready for Retirement?” haha
January 13, 2014
Buddhini Samarasinghe let’s hope that trolls don’t see that as bait. I have one denier on my 0.04% post. He clearly didn’t look at this post, which is linked.
January 14, 2014
Chad Haney your “denier” on the 0.04% post appears to have blocked me. Or did he delete his comments? Wondering if I should feel special 🙂
January 14, 2014
Here, Rajini Rao, have a tissue. Looking at his profile, I’m guessing he’s afraid of smart women like you.
January 14, 2014
Thanks for the laugh, Chad Haney !
January 14, 2014
I found all of your references regarding cognitive dissonance, motivated reasoning and confirmation bias extremely interesting, so thank you for the post. However, as I read and enjoyed the read, the very concepts explored compelled me to wonder… why was I enjoying the read? As in, am I just another rat in the cage with a confirmation bias!? Aagh. (LOL, probably)
January 14, 2014
Karen Katz the fact that you ask that question of yourself suggests you are not.
January 18, 2014
Hi Chad Haney Sorry it’s taken me so long to respond. [Insert Aesop fable reference to slow and steady etc.] Thanks for sharing such a fantastic post! I started writing you back and it turned into a long blog post, which you can read here if you’re interested in more info (http://goo.gl/mE0d4H). In summary, the concepts you’ve discussed are useful for understanding how individuals refute evidence, but I think only to a point. They’re useful for understanding individual resistance to scientific evidence, but these concepts alone don’t tell us why this happens at the group or societal level.
To get to the bottom of this, we have to tease apart the social science differences between belief, values and attitudes, and how these influence how people assess, take on, or discard scientific evidence.
In a nutshell, at the societal level, it’s useful to think about how cultural beliefs about trust and risk influence the extent to which people accept scientific evidence.
In social science, a belief is a statement that people hold to be either true or false. Beliefs are deeply rooted because they evolve from early socialisation. There’s a strong motivation to protect what we believe because beliefs are tied to personal identities, culture and lifestyle. The key to understanding why beliefs are hard to shift comes down to one question: Who benefits?
When people don’t believe scientists on climate change or GM foods or vaccinations or gender inequality and so on, it comes down to their assessment of: What does this mean for me? What life changes are required of me? How does this scientific knowledge undermine my place in the world? Changing one belief means reassessing all other beliefs and changing the social order. This is scary for people.
Values are linked to social morality. Values relate to the standards of what individuals perceive to be “good” or “bad” in direct reaction to what our society deems to be “good” or “bad.” Values are shaped by cultural institutions like education, religion and the law. In my blog post I give examples of how the social context and histories in Sweden, Peru and the USA all give rise to very different public views as to whether or not GM foods are “good.” Scientists make the case that some GM food technologies represent a safe, relatively inexpensive way to address hunger. In order to accept this argument, the public needs to be able to trust that the science isn’t governed by commercial entities. There are cultural reasons why some Americans and other developed nations don’t trust scientists.
Attitudes are a relatively stable system of ideas that allow people to evaluate their experiences. This is how we make judgements about how we feel or think about particular issues, objects, people and situations. While attitudes are relatively stable, they are more superficial than beliefs and less normative than values, so attitudes can be changed more easily than beliefs. Sometimes people will say one thing, especially if it’s socially desirable to do so, but in private they may not adhere to that attitude. Often, however, people are not aware that their attitudes are contradictory.
Attitudes about science are shaped by many societal processes, such as education, wealth, ethnicity and so on. Yet the social science literature has overwhelmingly shown that these attitudes are connected to:
1) Whether or not people are willing to accept the risks associated with a particular scientific issue; and
2) Whether or not people trust scientists in general.
Research shows that public trust in federal government, universities, medical institutions and the media is generally at half the rate it was in the mid-1960s. In fact, people who live in relatively affluent, technologically advanced societies are more likely to distrust science than people in developing regions (http://goo.gl/eTLPdI).
We might think that distrust in science is about lack of knowledge. Not necessarily. This is where cognitive dissonance, confirmation bias and motivated reasoning come in. People who are highly knowledgeable in a particular area are more likely to spend a lot of time taking in and responding to world views that contradict their own (http://goo.gl/RaumX9). These are the people we see who come to our science communities and argue against science very loudly. You will notice in many cases, they say they are scientists or that they’re interested in debating science. These people have an information bias that they do not readily recognise. This is because their beliefs, values and attitudes align in a way that compels them to argue against evidence that contradicts their place in the world.
They don’t see it this way. Studies on cognitive dissonance show that all people think they’re very nice, rational and impartial (http://goo.gl/d4C9lW). People who argue against an oppositional view see themselves as sceptics ready to weigh up evidence as it comes to hand. In fact, people spend more time and emotional energy arguing against conflicting information precisely because their attitudes align with their beliefs and values. You discussed the emotional element of anti-science types. This isn’t some primal emotional response. It is about socio-cultural power and protecting the status quo.
Paradoxically, people who know little about a topic are more likely to accept new information to be true or unbiased, but they show a weak commitment to defending this new information. These people are less likely to act on information. Think about what this means to our efforts in the science of climate change, social justice and so on. It’s not good.
More information on a topic alone is not enough to improve how the public engages in science. I argue that our collective efforts on Google+ to teach the public how to read and critique the science they see (especially in the media) is valuable. Changing beliefs and values is always going to be hard, but shifting attitudes will be more effective when people become more scientifically literate. I note this isn’t about making the public into a mass of scientists. (Of course you need formal training to qualify as a scientist.) Instead, I refer to the notion of reflexive critical thinking which is more than just questioning ideas, but also reflecting on our biases and being committed to change in light of new interactions with people and information.
Thanks for helping me consolidate my thinking on public outreach. I’ve long been pondering questions like: Is this worth it? Does it make a difference? How do we do it better?
January 18, 2014
Wow, thanks Zuleyka Zevallos for adding so much insight to my attempt at social science. I’m really glad that my post helped motivate your in-depth blog post. I will re-share it, as I feel many people should read it and consider what it means for them.
January 18, 2014
I tried to comment on the blog post but WordPress wouldn’t let me. Grrr
January 19, 2014
Chad Haney Thanks very much for your piece! Your words are much appreciated.
January 19, 2014
Buddhini Samarasinghe Oh no – but thanks for trying! I had a play with WordPress comments and I did have some problems logging in with my Twitter, but my Facebook and Google+ worked. What problem did you experience?
January 19, 2014
My Google+ logged me in but wouldn’t post the comment when I clicked Add Comment. I’ll try again later when I get to a computer! But yes, to echo Chad, it’s an excellent article 🙂
January 19, 2014
Thanks for the feedback Buddhini Samarasinghe!
January 19, 2014
Just remembered another case of my changing my mind.
I had been swayed by things I’d read claiming that Republicans weren’t as smart as Democrats. This rubbed me the right way (i.e., motivated reasoning) based on my personal experience (anecdotal; confirmation bias). Eventually I found some very reliable sources that called this conclusion into question on methodological and analytic grounds, leading to cognitive dissonance. I had to re-read those articles several times to understand them well enough to see that the conclusion I preferred could not be drawn from the analyses I’d read about originally.
I wish I had all the links, but I’m on my tablet and just haven’t got them handy.
January 19, 2014
That’s an interesting example Filippo Salustri. Politics are part of who most adults are, i.e., part of their identity. Political views can easily make people see evidence in whatever light agrees with their political views.
January 19, 2014
Chad Haney Oh, I still believe liberals are smarter than conservatives, I just don’t use those studies to support my belief any more. 🙂 I’m still open, though, to being corrected.