Jeff Ricker wrote:

> I came in this morning and noticed that there were no new messages. Is
> TIPS dying out? Oh well, I'll send out to you some thoughts I
> was having this morning and see if any of you nibble.

        Ok, you got me. My favorite topic...

> In teaching my courses, I often think about what seems to be a gulf
> between how I view the world and how many of my students, as
> well as the wider American culture, seem to view the world. This
> difference involves the notion of "belief" and what is required to say
> that one's belief is valid.
(snip)
> According to my dictionary, faith is an  "unquestioning belief that
> DOES NOT REQUIRE proof or evidence." In a post I wrote last February,
> I said that:
>
> "...this definition paints an ideal that is not often, or perhaps is
> never, seen in real life: people always require SOME evidence if they
> are going to continue to hold a belief. The problem is how
> they process new information with regard to this belief. My guess is
> that, when a belief is based on faith, what this means IN PRACTICE is that
> people are more likely to discount evidence that is not consistent with
> their beliefs..."

        So you're saying that in practice, holding a belief "on faith" is not a
matter of having no evidence or proof, but rather being more likely to
discount disconfirming evidence. Right?

        You may well be right. I'll raise my (by now probably tiresome) objection
to your dictionary's definition: it suggests that "proof" and "evidence"
occupy the other pole of a dichotomy ("faith" occupying the first pole). But
of course it's NOT a dichotomy, but rather more like a spectrum. There's
"faith" at one end, and "proof" at the other, and the vast middle is
occupied by varying qualities of evidence.

        On that model, you're suggesting (I think) that the "faith" end is really
_never_ completely free of evidence - that it never really reaches the "zero
point". You're also suggesting that the spectrum I've depicted does NOT
tell you anything about the tenacity of the belief - beliefs held at either
end (proof or faith) can be equally tenacious.

> In my own experience, the message I was taught was that, to believe
> without evidence is a sign of one's moral or spiritual worthiness.
> I think that this is a message that many of us have been taught in
> the wider culture as well as in our own particular
> subcultures (and it probably is one reason why scientists are seen
> relatively often in a negative light). Even more important, if one
> continues to have faith even in the face of contradictory evidence,
> one's worthiness and even superiority become even more apparent.

        I have heard (unconfirmed) that this value (lauding "blind faith" and
resistance to contradictory evidence) is absent from some non-Western
religions (specifically, Buddhism). I agree with you, though, that this is a
value underlying our culture. Witness the emphasis on _having_ principles,
without any corresponding emphasis on making sure that those are the RIGHT
PRINCIPLES IN THE FIRST PLACE (I'm thinking in particular about the
candidacies of persons like Gary Bauer and Dan Quayle, and their attacks on
the notion of changing one's mind in response to polling data or new
evidence).

> If I am right about this, then our jobs in the classroom are very
> difficult indeed--perhaps impossible for most students. In
> teaching the _science_ of psychology, we are trying to help develop
> in them a worldview where faith has no place. We cannot expect to
> convince many of them that this is the case. It is very difficult to
> learn that, NO MATTER HOW CERTAIN ONE "FEELS" THAT ONE'S BELIEF IS
> CORRECT, THIS IS NOT EVIDENCE IN FAVOR OF THE BELIEF. It is difficult
> because, in
> the ideal model of faith, evidence is not required.

        This goes a bit afield from the "faith" part of your argument, but I'm
busily sifting through the results of my dissertation work, and just
starting to write a summary. Rather than "faith", I was concerned about the
notion that one's personal experiences provide unquestionable sources of
causal beliefs. I asked undergraduates for some causal beliefs which they
held on the basis of personal experience, and challenged those beliefs with
summaries of contradicting research. A treatment group also read a short
piece explaining that
A - We make _inferences_ from those experiences to those causal beliefs.
B - Those inferences are fallible.
C - Even really smart people make mistakes in those inferences.

        I hoped that treatment group participants would recognize alternative
explanations for their experiences (that is, explanations that did not rely
on the causal beliefs they expressed), and that in doing so, would be more
inclined to question their beliefs.

Here are some of the findings:

1. As expected, personal experience was a fairly common and strong basis for
causal belief.
2. As expected, authority was a fairly rare and weak basis for causal
belief.
3. Reading a summary of research contradicting personal experience-formed
causal beliefs did cause a small reduction in those beliefs.
4. While quite brief, the treatment significantly improved the likelihood
that a participant would consider a specific alternative explanation for the
personal experience leading her to a causal belief.
5. Participants were quite unlikely to consider alternative explanations
without the prompting provided by the treatment - only one of 16 control
group participants even showed any recognition that there might be such
alternative explanations when asked what would have to happen to change
their minds about their prior beliefs. Even with the treatment, such
recognition was rare - only five of 15 treatment group participants provided
specific alternatives (two more failed to provide specific alternatives, but
showed awareness that there could be alternative explanations).
6. Participants who did consider specific alternative explanations were more
likely to change their minds about their prior beliefs than participants who
did not consider specific alternative explanations.
7. Changing one's mind was apparently influenced by an interaction of
several different types of reasoning. A popular piece of reasoning involved
in maintaining one's prior belief was the notion that repeated or consistent
experience provides improved evidence. This notion is sometimes referred to
as the belief that "the plural of anecdote is data".
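The counts in finding 5 invite a quick back-of-the-envelope significance check. As a sketch only (and with one assumption on my part: pooling the five treatment participants who gave specific alternatives with the two who merely showed awareness, giving 7 of 15 "recognizers" versus 1 of 16 in the control group), a one-sided Fisher's exact test on that 2x2 table needs nothing beyond the standard library:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table
    [[a, b], [c, d]]: the probability, under the hypergeometric
    null with fixed margins, of seeing at least `a` successes
    in the first row."""
    n = a + b + c + d          # total participants
    k = a + c                  # total "recognizers" across groups
    row1 = a + b               # size of the first (treatment) group
    p = 0.0
    for x in range(a, min(k, row1) + 1):
        p += comb(k, x) * comb(n - k, row1 - x) / comb(n, row1)
    return p

# Assumed table: 7 of 15 treatment vs. 1 of 16 control participants
# showing any recognition of alternative explanations.
p = fisher_exact_one_sided(7, 8, 1, 15)
print(round(p, 4))  # about 0.0139
```

A p-value in that neighborhood would be consistent with calling the treatment effect significant, though of course the real analysis depends on how the categories are actually coded.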

        More to come...

        By the way, that last phrase ("The plural of anecdote is not data") came
from a TIPSter, I believe. If it's a quote from someone in particular, I
could really use a source.

Paul Smith
Alverno College
Milwaukee
