Powerful reason to be a political Independent
 
Amazing study. The following article is an "advertisement for Radical Centrism" like nothing else. If you are a political partisan, a Yellow Dog Democrat or a Rock-Ribbed Republican, you simply will not get the math right when a problem presented to you empirically shows that your ideological position is unsupported by the facts.
 
This is not higher math, simply a problem in assessing ratios, which is pretty basic stuff.
 
Confirmation bias stands out like a sore thumb for both liberals and conservatives. If you believe something, you will get the math right; if you strongly oppose something, you will get the math wrong, in a way that supports your pre-existing political views.
 
Incredible.
 
This is an empirical study, not a blogger's rant. In all likelihood the authors of the study have never heard of Radical Centrism. But it demonstrates precisely what one of the major contentions of Radical Centrism has been all along: if you are a political ideologue, then you are incapable of objectivity and approach issues unscientifically.
 
And by the way, you don't have to label yourself to be an ideologue. You are one, by definition, if you vote Democratic for more than 80% of all political offices, or if 80%+ is your pattern as a Republican.
 
And, yes, the logic of this study would say the same thing for Libertarians, Greens, Socialists, Constitution Party voters, etc. As an educated guess, this probably is also true of religious believers of all kinds.
 
 
Incapacity to see the other side's point of view is a defect of mentality, as well as a defect of character.
 
I will admit that this study resonated with me, no question about it, and in the process confirmed my own bias, which is that political partisans, in effect, are lobotomized thinkers (in extreme cases they also are a**hol*s). Hence my crusade against political ideology, at least as it currently exists. I really despise hard-core Democrats and hard-core Republicans and regard each as complete idiots. And now here is proof!
 
However, there is a need for follow-up research. It should be regarded as essential to confirm these results with two or three additional studies.
 
A weakness of the design, though I doubt it affected the results, is that the data were hypothetical. You could plausibly argue that men have a repository of data in their heads on the issue of guns, that women have a similar repository on the subject of dermatology, and that in each case this data was drawn upon in their reasoning processes. This problem needs to be controlled for.
 
Also, it would be best to investigate the differences between hot-button issues and less emotionally charged ones, to see whether, when "the temperature is lower," the reasoning of partisans is more scientific. But WOW! This really opens all kinds of doors.
 
Billy
 
 
 
 
=====================================
 
Recommended by Real Clear Politics
September 21, 2013
 
 
from the site:
Grist
   
Science confirms: Politics wrecks your ability to do math
By _Chris Mooney_ (http://grist.org/author/chris-mooney/)
 
 
Everybody knows that our political views can sometimes get in the way of thinking clearly. But perhaps we don’t realize how bad the problem actually is. According to a _new psychology paper_ (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2319992), our political passions can even undermine our very basic reasoning skills. More specifically, the study finds that people who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.
_The study_ (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2319992), by Yale law professor _Dan Kahan_ (http://www.law.yale.edu/faculty/DKahan.htm) and his colleagues, has an ingenious design. At the outset, 1,111 study participants were asked about their political views and also asked a series of questions designed to gauge their “_numeracy_ (http://en.wikipedia.org/wiki/Numeracy),” that is, their mathematical reasoning ability.
Participants were then asked to solve a fairly difficult problem that involved interpreting the results of a (fake) scientific study. But here was the trick: while the fake study data that they were supposed to assess remained the same, sometimes the study was described as measuring the effectiveness of a “new cream for treating skin rashes,” while in other cases it was described as involving the effectiveness of “a law banning private citizens from carrying concealed handguns in public.”
The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or a new skin cream. What’s more, it turns out that highly numerate liberals and conservatives were even more, not less, susceptible to letting politics skew their reasoning than were those with less mathematical ability.
But we’re getting a little ahead of ourselves. To fully grasp the Enlightenment-destroying nature of these results, we first need to explore the tricky problem that the study presented in a little more detail.
Let’s start with the “skin cream” version of this brain twister. You can view the image below to see exactly what research subjects read (and try out your own skill at solving it), or skip on for a brief explanation:

[Image: Full text of one version of the study’s “skin cream” problem. Credit: Dan Kahan. (http://grist.files.wordpress.com/2013/09/study_image_1.png)]
As you can see above, the survey respondents were presented with a fictional study purporting to assess the effectiveness of a new skin cream, and informed at the outset that “new treatments often work but sometimes make rashes worse” and that “even when treatments don’t work, skin rashes sometimes get better and sometimes get worse on their own.” They were then presented with a table of experimental results, and asked whether the data showed that the new skin cream “is likely to make the skin condition better or worse.”
So do the data suggest that the skin cream works? The correct answer in the scenario above is actually that patients who used the skin cream were “more likely to get worse than those who didn’t.” That’s because the ratio of those who saw their rash improve to those whose rash got worse is roughly 3 to 1 in the “skin cream” group, but roughly 5 to 1 in the control group, which means that if you want your rash to get better, you are better off not using the skin cream at all. (For half of study subjects asked to solve the skin cream problem, the data were reversed and presented in such a way that they did actually suggest that the skin cream works.)
This is no easy problem for most people to solve: across all conditions of the study, 59 percent of respondents got the answer wrong. That is, in significant part, because trying to intuit the right answer by quickly comparing two numbers will lead you astray; you have to take the time to compute the ratios.
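For readers who want to check the arithmetic themselves, here is a minimal sketch in Python. The improved/worsened counts used below (223/75 with the cream, 107/21 without) are the ones that appear in the published version of Kahan’s skin-cream vignette; treat them as illustrative, since the article above quotes only the rounded 3-to-1 and 5-to-1 ratios.

```python
# Sketch of the ratio computation the problem requires.
# Counts: 223 improved / 75 worsened with the cream; 107 / 21 without.

def improvement_ratio(got_better, got_worse):
    """Ratio of patients who improved to patients who worsened."""
    return got_better / got_worse

cream = improvement_ratio(223, 75)      # roughly 3 to 1
no_cream = improvement_ratio(107, 21)   # roughly 5 to 1

# The intuitive shortcut of comparing the raw counts points the wrong way:
# 223 > 107, yet in relative terms the cream group fared worse.
print(f"cream: {cream:.2f} to 1, no cream: {no_cream:.2f} to 1")
print("cream helps" if cream > no_cream else "cream is worse than nothing")
```

The point of the exercise is exactly the trap the article describes: only the ratios, not the raw counts, answer the question.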
Not surprisingly, Kahan’s study found that the more numerate you are, the more likely you are to get the answer to this “skin cream” problem right. Moreover, it found no substantial difference between highly numerate Democrats and highly numerate Republicans in this regard. The better members of both political groups were at math, the better they were at solving the skin cream problem.
But now take the same basic study design and data, and simply label it differently. Rather than reading about a skin cream study, half of Kahan’s research subjects were asked to determine the effectiveness of laws “banning private citizens from carrying concealed handguns in public.” Accordingly, these respondents were presented not with data about rashes and whether they got better or worse, but rather with data about cities that had or hadn’t passed concealed carry bans, and whether crime in these cities had or had not decreased.
Overall, then, study respondents were presented with one of four possible scenarios, depicted below with the correct answer in bold:

[Image: The four problem scenarios from the study (each respondent received just one of these). Credit: Dan Kahan. (http://grist.files.wordpress.com/2013/09/study-image-3.png)]
So how did people fare on the handgun version of the problem? They performed quite differently than on the skin cream version, and strong political patterns emerged in the results, especially among people who are good at mathematical reasoning. Most strikingly, highly numerate liberal Democrats did almost perfectly when the right answer was that the concealed weapons ban does indeed work to decrease crime (version C of the experiment), an outcome that favors their pro-gun-control predilections. But they did much worse when the correct answer was that crime increases in cities that enact the ban (version D of the experiment).
The opposite was true for highly numerate conservative Republicans: they did just great when the right answer was that the ban didn’t work (version D), but poorly when the right answer was that it did (version C).
Here are the results overall, comparing subjects’ performances on the “skin cream” versions of the problem and the “gun ban” versions, and relating this performance to their political affiliations and numeracy scores:

[Image: Full study results comparing subjects’ performance on the skin cream problem with their performance on the gun ban problem. Vertical axes plot response accuracy; horizontal axes show mathematical reasoning ability. Credit: Dan Kahan. (http://grist.files.wordpress.com/2013/09/study-image-2_0.png)]
For study author Kahan, these results are a fairly strong refutation of what is called the “_deficit model_ (https://en.wikipedia.org/wiki/Information_deficit_model)” in the field of science and technology studies: the idea that if people just had more knowledge, or more reasoning ability, then they would be better able to come to consensus with scientists and experts on issues like climate change, evolution, the safety of vaccines, and pretty much anything else involving science or data (for instance, whether concealed weapons bans work). Kahan’s data suggest the opposite: that political biases skew our reasoning abilities, and this problem seems to be worse for people with advanced capacities like scientific literacy and numeracy. “If the people who have the greatest capacities are the ones most prone to this, that’s reason to believe that the problem isn’t some kind of deficit in comprehension,” Kahan explained in an interview.
So what are smart, numerate liberals and conservatives actually doing in the gun control version of the study, leading them to give such disparate answers? It’s kind of tricky, but here’s what Kahan thinks is happening.
Our first instinct, in all versions of the study, is to leap instinctively to the wrong conclusion. If you just compare which number is bigger in the first column, for instance, you’ll be quickly led astray. But more numerate people, when they sense an apparently wrong answer that offends their political sensibilities, are both motivated and equipped to dig deeper, think harder, and even start performing some calculations, which in this case would have led to a more accurate response.
“If the wrong answer is contrary to their ideological positions, we hypothesize that that is going to create the incentive to scrutinize that information and figure out another way to understand it,” says Kahan. In other words, more numerate people perform better when identifying study results that support their views, but may have a big blind spot when it comes to identifying results that undermine those views.
What’s happening when highly numerate liberals and conservatives actually get it wrong? Either they’re intuiting an incorrect answer that is politically convenient and feels right to them, leading them to inquire no further, or else they’re stopping to calculate the correct answer, but then refusing to accept it and coming up with some elaborate reason why 1 + 1 doesn’t equal 2 in this particular instance. (Kahan suspects it’s mostly the former, rather than the latter.)
The Scottish Enlightenment philosopher _David Hume_ (http://en.wikipedia.org/wiki/David_Hume) famously described reason as a “slave of the passions.” Today’s political scientists and political psychologists, like Kahan, are now affirming Hume’s statement with reams of new data. This new study is just one out of many in this respect, but it provides perhaps the most striking demonstration yet of _just how motivated, just how biased, reasoning can be_ (http://www.motherjones.com/politics/2011/03/denial-science-chris-mooney), especially about politics.

-- 
Centroids: The Center of the Radical Centrist Community 
<[email protected]>
Google Group: http://groups.google.com/group/RadicalCentrism
Radical Centrism website and blog: http://RadicalCentrism.org
