Rational thinking

“If you just thought for yourself, you’d agree with me and all my friends.” How often have you and I and the kitchen sink heard that?

Dan Kahan, one of the cultural cognition people, discusses the downsides of original thinking:

People need to (and do) accept as known by science much much much more than they could possibly understand through personal observation and study. They do this by integrating themselves into social networks—groups of people linked by cultural affinity—that reliably orient their members toward collective knowledge of consequence to their personal and collective well-being…

Polarization occurs only when risks or other facts that admit of scientific inquiry become entangled in antagonistic cultural meanings. In that situation, positions on these issues will come to be understood as markers of loyalty to opposing groups. The psychic pressure to protect their standing in groups that confer immense material and emotional benefits on them will then motivate individuals to persist in beliefs that signify their group commitments.

They’ll do that in part by dismissing as noncredible or otherwise rationalizing away evidence that threatens to drive a wedge between them and their peers. Indeed, the most scientifically literate and analytically adept members of these groups will do this with the greatest consistency and success.

Once factual issues come to bear antagonistic cultural meanings, it is perfectly rational for an individual to use his or her intelligence this way: being “wrong” on the science of a societal risk like climate change or nuclear power won’t affect the level of risk that person (or anyone else that person cares about) faces: nothing that person does as consumer, voter, public-discussion participant, etc., will be consequential enough to matter. Being on the wrong side of the issue within his or her cultural group, in contrast, could spell disaster for that person in everyday life.

Some controversial social issues don’t carry this risk, and thinking for one’s self is OK:

The number of issues that have that character, though, is minuscule in comparison to the number that don’t. What side one is on regarding pasteurized milk, fluoridated water, high-power transmission lines, “mad cow disease,” use of microwave ovens, exposure to Freon gas from refrigerators, treatment of bacterial diseases with antibiotics, the inoculation of children against Hepatitis B, etc., etc., isn’t viewed as a badge of group loyalty and commitment for the affinity groups most people belong to. Hence, there’s no meaningful amount of cultural polarization on these issues — at least in the US (meaning pathologies are local; in Europe there might be cultural dispute on some of these issues & not on some of the ones that divide people here).

Yet some of us do hold views on iconic issues that differ from what others with our cultural affinity believe. What makes the difference? What motivates us to adopt different beliefs? What inoculates us against the reaction of the group? What makes thinking for one’s self matter more than group affinity?
