Politics as a form of Science
 
Terrific article follows. The point to make about it is that while it is true that human beings cannot live without confirmation bias (think of how unmanageably complicated life would be without it), it happens again and again that we make mistakes because of it.
 
In a sense Radical Centrism also falls into this pattern once you have gone through the process of abandoning partisan politics in favor of independent politics. From that time onward it becomes easy, too easy, to be dismissive of all real-world contributions from partisan political sources.
 
The RC principle of balance is supposed to take care of this problem, and I like to think that it usually does. That is, we are free to adopt radical stands on issues as long as, on balance, there are about as many Right positions in one's personal "platform" as there are Left positions. Not because of any abstract worship of math, but because neither party would exist unless some of its views were objectively true and good.
 
We need to ask ourselves, if we tend to have a mostly R or mostly L personal platform: OK, what am I missing? Why am I having trouble seeing what is good and true on the Left or the Right?
 
The point is also that parties and politicians are always modifying their views. How do we keep up with the changes? Here confirmation bias kicks in. There is no good reason to keep up with the changes, is there, when we can selectively edit facts in our heads to reach conclusions we already hold? Besides, who has time for that? Any excuse will do.
 
It seems to me that Radical Centrism, because it sets the bar quite high, demands a lot, and for most people it demands too much. The question, though, boils down to this:
 
How important is it to you to be objectively right? Seriously. Because if that is your purpose then you need to invest time in being right, since there is no end to the quest for right (true, correct, prescient) answers.
 
Do you really want to be a scientist of politics or not? If you answer yes, then you need to be prepared to spend time at this and to revise your conclusions every now and then. That is how science works.
 
These are some preliminary thoughts on the issues raised by the article. I'm not sure that I have things right, but I think there is a larger point to be made based on the findings in the article. Feel free to comment.
 
 
Billy
 
-----------------------------------------------------

Real Clear Politics / Real Clear Science
 
 
January 18, 2014  
 
What You Think Is Right May Actually Be Wrong
By _Peter Ellerton_ (http://www.realclearscience.com/authors/peter_ellerton/)



We like to think that we reach conclusions by reviewing facts, weighing evidence and analysing arguments. But this is not how humans usually operate, particularly when decisions are important or need to be made quickly.

What we usually do is arrive at a conclusion independently of conscious reasoning and then, and only if required, search for reasons as to why we might be right.

The first process, drawing a conclusion from evidence or facts, is called inferring; the second process, searching for reasons as to why we might believe something to be true, is called rationalising.
Rationalise vs infer
That we rationalise more than we infer seems counter-intuitive, or at least uncomfortable, to a species that prides itself on its ability to reason, but it is borne out by the work of many researchers, including the US psychologist and _Nobel Laureate_ (http://www.nobelprize.org/nobel_prizes/economic-sciences/laureates/2002/kahneman-bio.html) _Daniel Kahneman_ (http://www.princeton.edu/~kahneman/), most recently in his book _Thinking Fast and Slow_ (http://books.google.com.au/books/about/Thinking_Fast_and_Slow.html?id=ZuKTvERuPG8C).
We tend to prefer conclusions that fit our existing world-view, and that don’t require us to change a pleasant and familiar narrative. We are also more inclined to accept these conclusions, intuitively leaping to them when they are presented, and to offer resistance to conclusions that require us to change or seriously examine existing beliefs.

There are many ways in which our brains help us to do this.
Consider global warming
Is global warming too difficult to understand? Your brain makes a substitution for you: what do you think of environmentalists? It then transfers that (often emotional) impression, positive or negative, to the issue of global warming and presents a conclusion to you in sync with your existing views.

Your brain also helps to make sense of situations in which it has minimal data to work with by creating associations between pieces of information.
If we hear the words “refugee” and “welfare” together, we cannot help but weave a narrative that makes some sort of coherent story (what Kahneman calls associative coherence). The more we hear this, the more familiar and ingrained the narrative. Indeed, the process of creating a coherent narrative has been shown to be more convincing to people than facts, even when the facts behind the narrative are shown to be wrong (understood as the _perseverance of social theories_ (http://www.psychology.iastate.edu/faculty/caa/abstracts/1979-1984/80ALR.html) and involved in the _Backfire Effect_ (http://youarenotsosmart.com/2011/06/10/the-backfire-effect/)).
Now, if you are a politician or a political advisor, knowing this sort of thing can give you a powerful tool. It is far more effective to create, modify or reinforce particular narratives that fit particular world-views, and then give people reasons as to why they may be true, than it is to provide evidence and ask people to come to their own conclusions.

It is easier to help people rationalise than it is to ask them to infer. More plainly, it is easier to lay down a path for people to follow than it is to allow them to find their own. Happily for politicians, this is what our brains like doing.
How politicians frame issues
This can be done in two steps. The first is to frame an issue in a way that reinforces or modifies a particular perspective. The cognitive scientist _George Lakoff_ (http://georgelakoff.com/) highlighted the use of the phrase “tax relief” by the American political right in the 1990s.

Consider how this positions any debate around taxation levels. Rather than taxes being a “community contribution”, the word “relief” suggests a burden that should be lifted, an unfair load that we carry, perhaps beyond our ability to bear.

The secret, and success, of this campaign was to get both the opposing parties and the media to use this language, hence immediately biasing any discussion.
Interestingly, it was also an initiative of the American Republican party to _rephrase the issue_ (http://woods.stanford.edu/sites/default/files/files/gw-language-choices.pdf) of “global warming” into one of “climate change”, which seemed more benign at the time.
Immigration becomes security
In recent years we have seen immigration as an issue disappear; it is now framed almost exclusively as an issue of “national security”. All parties and the media now talk about it in this language.

Once the issue is appropriately framed, substitutions and associations can be made for us. Talk of national security allows us to talk about borders, which may be porous, or even crumbling. This evokes emotional reactions that can be suitably manipulated.

Budgets can be “in crisis” or in “emergency” conditions, suggesting the need for urgent intervention, or rescue missions. Once such positions are established, all that is needed are some reasons to believe them.
The great thing about rationalisation is that we get to select the reasons we want – that is, those that will support our existing conclusions. Our _confirmation bias_ (http://www.princeton.edu/~achaney/tmve/wiki100k/docs/Confirmation_bias.html), a tendency to notice more easily those reasons or examples that confirm our existing ideas, selects just those reasons that suit our purpose. The job of the politician, of course, is to provide them.

Kahneman notes that the more familiar a statement or image, the more it is accepted. It is the reason that messages are repeated ad nauseam, and themes are paraphrased and recycled in every media appearance. Pretty soon, they seem like our own.
How to think differently
So what does this mean for a democracy in which citizens need to be independent thinkers and autonomous actors? Well, it shows that the onus is not just on politicians to change their behaviour (after all, one can hardly blame them for doing what works), but also on us to continually question our own positions and judgements, to test ourselves by examining our beliefs and recognising rationalisation when we engage in it.

More than this, it means public debate, through the media in particular, needs to challenge preconceptions and resist the trend to simple assertion. We are what we are, but that doesn’t mean we can’t work better with it.
Peter Ellerton does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

Peter Ellerton is a lecturer in critical thinking at the University of Queensland.

-- 
Centroids: The Center of the Radical Centrist Community 
<[email protected]>
Google Group: http://groups.google.com/group/RadicalCentrism
Radical Centrism website and blog: http://RadicalCentrism.org
