On 27 July 2010 17:11, Francis Irving <[email protected]> wrote:
> An interesting article about how facts don't help, link from Julian
> below.
>

One of the very positive things to come out of some of the academic
research it links to is the finding that the more self-confidence
people have, the more easily they will drop preconceived notions in
the face of conflicting evidence. That, I think, is something to work
on.

I've always been attracted to the idea of "mind training" (a term I
found in A. E. van Vogt's "Weapon Shops" series). If people are
trained to think well, then they are more likely to be able to
contribute effectively to a democratic society: not merely narrowly,
in the way they vote, but more broadly, in the way they take part in
society.

There are preconceptions which are relatively easy to dispel in safe
and comfortable environments. Getting someone into the habit of
evaluating evidence properly without directly challenging their core
beliefs would, I hope, spill over into the more important parts of
their lives. Sneaky, but, I think, effective.

So, maybe that's something we should think about doing? I'm not sure
of the shape of it. Really it's something that happens to a minimal
degree in school (e.g. in media studies or in data handling), but not
very thoroughly and not very practically. Certainly pushing straight
thinking may be more effective than pushing whatever real agenda you
may have.

Something to think about maybe?

Taking on someone's belief system head-on is very rarely the right way
to go about things. It is all too easy to find oneself becoming more
convinced of one's own arguments in the process. This is sufficiently
trite that I don't need to read academic research on it (the studies
haven't really affected my priors 8-).

For example (with my Christian hat on), I can tell you that
participating in evangelism makes one believe in the Christian faith
more, not less, because of, not despite, the time spent arguing with
people who disagree with you. This is well understood by effective
Christian organisations, who make use of it.

Equally, experience in de-programming suggests that the most helpful
and long-lasting forms of de-programming (and the ones that are not
abusive) simply involve giving cult members time and space to stop and
think without any pressure. Putting someone in a relaxed environment
with people to talk to about random things can be the most effective
approach of all.

My profession (the bar) is very well aware of the problems of
confirmation bias. Advocates all too easily come to believe in the
objective certainty of their own cases, which is a danger we are
trained to avoid. Note: there's nothing wrong with believing you are
right; the problem comes when you think that you are self-evidently
right, at which point your advocacy goes to pot.

It is also vital to be able to assess a client's case objectively.
That is extremely tough since you may already have views about it, but
your bottom line is affected by a probabilistic assessment of the
likely outcome. To do that you have to anticipate arguments that will
be raised against you. Lots of bar training is directed at being able
to do that well, but one still sees many failures in practice.

> One of the cultural things I like about the community around mySociety
> is our susceptibility to facts. It's key to our non-partisanship.
> More fundamentally though, it's a built in geek trait.

I think this is coupled with habit again. As a programmer one is
habituated to dealing with a device (a computer) which has an
objective reality that is relatively inflexible: it is not susceptible
to persuasion or flannel, and it cannot behave other than as its
internal logic demands.

When I was teaching children mathematics they were often sceptical of
some of my conclusions (however well explained), but a quick appeal to
a calculator resolved their doubts instantly. It's a classic amongst
maths teachers, but 0.2*0.2=0.04 is a result many children do not
believe: they are already convinced it's 0.4, pattern-matching from
0.5*0.5=0.25 (5*5=25 gives 0.25, so 2*2=4 "should" give 0.4) and
missing that the decimal places add. It helps to be able to show that
the *calculator* says otherwise 8-).
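
For the programmers here, the same point can be made with exact
fractions rather than a calculator. This is just my own sketch in
Python (not anything from the article), using the standard fractions
module:

    from fractions import Fraction

    # 0.2 is 2/10, so squaring it gives 4/100: the decimal places
    # add (1 + 1 = 2) and the answer is 0.04, not 0.4.
    a = Fraction(2, 10)
    print(a * a)           # 1/25 (= 4/100 = 0.04)
    print(float(a * a))    # 0.04

    # The same rule is why 0.5*0.5 = 0.25 happens to "look right":
    # 5*5 = 25 with two decimal places is 0.25.
    b = Fraction(5, 10)
    print(b * b)           # 1/4 (= 25/100 = 0.25)

Fractions also side-step binary floating point, where 0.2*0.2 comes
out a whisker above 0.04 and would only have deepened the children's
suspicions 8-).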

>
> We admit when we're wrong.
>

Even more fundamentally, we are used to being put in situations where
it is meaningful to be wrong and easy to ascertain that one is. Having
said that, I've had plenty of arguments with you personally which
generated more heat than light, so, as another person said, we aren't
perfect 8-).

-- 
Francis Davey

