Buck up, John. Once the real hazards of rising sea level, failed crops, and 
acidified oceans materialize, the decision-makers just might yearn for some 
hazards of the moral kind. And you and I might still be around when that 
happens. Even then, there is no guarantee that any countering action will be 
effective and safe unless we do some research to find out before the real need 
for hazard mitigation arises - which for some of us is right now.
Keep up the good fight...
Greg

From: John Nissen <johnnissen2...@gmail.com>
Date: Tuesday, March 4, 2014 11:21 AM
To: Default <r...@llnl.gov>
Cc: "dmorr...@gmail.com" <dmorr...@gmail.com>, geoengineering 
<geoengineering@googlegroups.com>, "dan.ka...@yale.edu" <dan.ka...@yale.edu>, 
John Nissen <j...@cloudworld.co.uk>
Subject: Re: [geo] Re: Geoengineering and Climate Change Polarization: Testing 
a Two-channel Model of Science Communication, Ann. Am. Acad. Pol. & Soc. Sci.

Hi Greg,

The theory is that people tend to be polarised into two camps.  One camp is 
against the idea that climate change can have anything to do with our 
greenhouse gas emissions, and is therefore (subconsciously) against 
geoengineering because accepting it would mean admitting that there is a 
massive problem to be solved.  The other camp is against geoengineering 
(subconsciously) because of the moral hazard - the idea that it is a "get out 
of jail free" card for the people responsible for causing climate change in 
the first place.  They will talk of geoengineering as a climate "fix", say 
that it is "playing God", and so on.

Kahan refers repeatedly to a 2012 study in which the "moral hazard" argument 
against geoengineering was shown to be scientifically invalid.  But the second 
camp may still subconsciously harbour this deep-seated fear of geoengineering.

Using his argument, I therefore deduce that neither camp will accept 
geoengineering, whatever evidence of the need for it is presented to them.

I think this is the crux of the matter: nobody identified with either of the 
common "camps" will accept geoengineering.  Only when this impasse is properly 
acknowledged will it be possible for people to accept the scientific evidence 
that geoengineering is needed, not only to suck CO2 out of the atmosphere but 
also to cool the Arctic.

Cheers,

John

On Tue, Mar 4, 2014 at 4:22 AM, Rau, Greg <r...@llnl.gov> wrote:
This observation may bear repeating:
"To be effective, science communication must successfully negotiate both 
channels. That is, in addition to furnishing individuals with valid and 
pertinent information about how the world works, it must avail itself of the 
cues necessary to assure individuals that assenting to that information will 
not estrange them from their communities."

Isn't this what good advertising does, and couldn't our community benefit from 
some cogent advice from Madison Avenue, if we could afford it? Science and 
scientific reasoning alone apparently aren't enough, especially when there are 
(well-funded) individuals who would cast such reasoning as a threat to their 
communities.
Greg
________________________________
From: geoengineering@googlegroups.com [geoengineering@googlegroups.com] on 
behalf of David Morrow [dmorr...@gmail.com]
Sent: Monday, March 03, 2014 6:27 PM
To: geoengineering@googlegroups.com
Subject: [geo] Re: Geoengineering and Climate Change Polarization: Testing a 
Two-channel Model of Science Communication, Ann. Am. Acad. Pol. & Soc. Sci.

FYI, the lead author of that paper, Dan Kahan, posted two additional blog posts 
on culture, values, and geoengineering:

http://www.culturalcognition.net/blog/2014/2/24/geoengineering-the-cultural-plasticity-of-climate-change-ris.html

http://www.culturalcognition.net/blog/2014/2/26/geoengineering-the-science-communication-environment-the-cul.html



On Thursday, February 27, 2014 2:04:00 AM UTC-6, andrewjlockley wrote:

Poster's note: This is just brilliant. At last, an explanation of why believing 
nonsense is rational. Useful to reflect on how this paper applies to the origin 
and persistence of other belief systems, as well as climate change. Leaves me 
wondering what nonsense I believe.

http://www.culturalcognition.net/blog/2014/2/23/three-models-of-risk-perception-their-significance-for-self.html

Three models of risk perception & their significance for self-government

Dan Kahan, posted on Sunday, February 23, 2014 at 7:52 AM

From Geoengineering and Climate Change Polarization: Testing a Two-channel 
Model of Science Communication, Ann. Am. Acad. Pol. & Soc. Sci. (in press).

Theoretical background

Three models of risk perception

The scholarly literature on risk perception and communication is dominated by 
two models. The first is the rational-weigher model, which posits that members 
of the public, in aggregate and over time, can be expected to process 
information about risk in a manner that promotes their expected utility (Starr 
1969). The second is the irrational-weigher model, which asserts that ordinary 
members of the public lack the ability to reliably advance their expected
utility because their assessment of risk information is constrained by 
cognitive biases and other manifestations of bounded rationality (Kahneman 
2003; Sunstein 2005; Marx et al. 2007; Weber 2006).

Neither of these models
cogently explains public conflict over climate change--or a host of other 
putative societal risks, such as nuclear power, the vaccination of teenage 
girls for HPV, and the removal of restrictions on carrying concealed handguns 
in public. Such disputes conspicuously feature partisan divisions over facts 
that admit of scientific investigation. Nothing in the rational-weigher model 
predicts that people with different values or opposing political commitments 
will draw radically different inferences from common information. Likewise, 
nothing in the irrational-weigher model suggests that people who subscribe to 
one set of values are any more or less bounded in their rationality than those 
who subscribe to any other, or that cognitive biases will produce systematic 
divisions of opinion among such groups.
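
As a gloss (the formalization here is mine, not the paper's), both weigher 
models assume that a person facing a choice of stance a toward a putative 
risk, where outcome o_i occurs with probability p_i, does best by adopting 
whichever stance maximizes expected utility:

\[
\mathbb{E}[U(a)] = \sum_i p_i \, U\!\left(o_i(a)\right),
\qquad
a^{*} = \arg\max_a \, \mathbb{E}[U(a)].
\]

The cultural-evaluator account developed below keeps this structure but 
observes that, for an ordinary citizen, the only outcomes that meaningfully 
vary with his or her stance on an issue like climate change are social 
(standing within his or her cultural group) rather than actuarial, so the 
utility-maximizing stance is the group-congruent one.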

One explanation for such conflict is the cultural cognition thesis (CCT). CCT 
says that cultural values are cognitively prior to facts in public risk 
conflicts: as a result of a complex of interrelated psychological mechanisms, 
groups of individuals will credit and dismiss evidence of risk in patterns that 
reflect and reinforce their distinctive understandings of how society should be 
organized (Kahan, Braman, Cohen, Gastil & Slovic 2010; Jenkins-Smith & Herron 
2009). Thus, persons with individualistic values can be expected to be 
relatively dismissive of environmental and technological risks, which if widely 
accepted would justify restricting commerce and industry, activities that 
people with such values hold in high regard. The same goes for individuals 
with hierarchical values, who see assertions of environmental risk as
indictments of social elites. Individuals with egalitarian and communitarian 
values, in contrast, see commerce and industry as sources of unjust disparity 
and symbols of noxious self-seeking, and thus readily credit assertions that 
these activities are hazardous and therefore worthy of regulation (Douglas & 
Wildavsky 1982). Observational and experimental studies have linked these and
comparable sets of outlooks to myriad risk controversies, including the one 
over climate change (Kahan 2012).

Individuals, on the CCT account, behave not as
expected-utility weighers--rational or irrational--but rather as cultural 
evaluators of risk information (Kahan, Slovic, Braman & Gastil 2006). The 
beliefs any individual forms on societal risks like climate change--whether 
right or wrong--do not meaningfully affect his or her personal exposure to 
those risks. However, precisely because positions on those issues are commonly 
understood to cohere with allegiance to one or another cultural style, taking a 
position at odds with the dominant view in his or her cultural group is likely 
to compromise that individual's relationship with others on whom that 
individual depends for emotional and material support. As individuals, citizens 
are thus likely to do better in their daily lives when they adopt toward 
putative hazards the stances that express their commitment to values that they 
share with others, irrespective of the fit between those beliefs and the 
actuarial magnitudes and probabilities of those risks.

The cultural-evaluator
model takes issue with the irrational-weigher assumption that popular conflict 
over risk stems from overreliance on heuristic forms of information processing 
(Lodge & Taber 2013; Sunstein 2006). Empirical evidence suggests that 
culturally diverse citizens are indeed reliably guided toward opposing stances 
by unconscious processing of cues, such as the emotional resonances of 
arguments and the apparent values of risk communicators (Kahan, Jenkins-Smith & 
Braman 2011; Jenkins-Smith & Herron 2009; Jenkins-Smith 2001).

But contrary to
the picture painted by the irrational-weigher model, ordinary citizens who are 
equipped and disposed to appraise information in a reflective, analytic manner 
are not more likely to form beliefs consistent with the best available evidence 
on risk. Instead they often become even more culturally polarized because of 
the special capacity they have to search out and interpret evidence in patterns 
that sustain the convergence between their risk perceptions and their group 
identities (Kahan, Peters, Wittlin, Slovic, Ouellette, Braman & Mandel 2012; 
Kahan 2013; Kahan, Peters, Dawson & Slovic 2013).

Two channels of science communication

The rational- and irrational-weigher models of risk perception generate 
competing prescriptions for science communication. The former posits that 
individuals can be expected, eventually, to form empirically sound positions so 
long as they are furnished with sufficient and sufficiently accurate 
information (e.g., Viscusi 1983; Philipson & Posner 1993). The latter asserts 
that attempts to educate the public about risk are at best futile, since
the public lacks the knowledge and capacity to comprehend; at worst such 
efforts are self-defeating, since ordinary individuals are prone to overreact 
on the basis of fear and other affective influences on judgment. The better 
strategy is to steer risk policymaking away from democratically accountable 
actors to politically insulated experts and to "change the subject" when risk 
issues arise in public debate (Sunstein 2005, p. 125; see also Breyer 1993).

The cultural-evaluator model associated with CCT offers a more nuanced account. 
It recognizes that when empirical claims about societal risk become suffused 
with antagonistic cultural meanings, intensified efforts to disseminate sound 
information are unlikely to generate consensus and can even stimulate conflict.

But those instances are exceptional--indeed, pathological. There are vastly 
more risk issues--from the hazards of power lines to the side-effects of 
antibiotics to the tumor-stimulating consequences of cell phones--that avoid 
becoming broadly entangled with antagonistic cultural meanings. Using the same 
ability that they reliably employ to seek and follow expert medical treatment 
when they are ill or expert auto-mechanic service when their car breaks down, 
the vast majority of ordinary citizens can be counted on in these "normal," 
non-pathological cases to discern and conform their beliefs to the best 
available scientific evidence (Keil 2010).

The cultural-evaluator model therefore counsels a two-channel strategy of 
science communication. Channel 1 is focused on information content and is 
informed by the best available understandings of how to convey empirically 
sound evidence, the basis and significance of which are readily accessible to 
ordinary citizens (e.g., Gigerenzer 2000; Spiegelhalter, Pearson & Short 2011). 
Channel 2 focuses on cultural meanings: the myriad cues--from group affinities 
and antipathies to positive and negative affective resonances to congenial or 
hostile narrative structures--that individuals unconsciously rely on to 
determine whether a particular stance toward a putative risk is consistent with 
their defining commitments. To be effective, science communication must 
successfully negotiate both channels. That is, in addition to furnishing 
individuals with valid and pertinent information about how the world works, it 
must avail itself of the cues necessary to assure individuals that assenting to 
that information will not estrange them from their communities (Kahan, Slovic, 
Braman & Gastil 2006; Nisbet 2009).
