Thanks, I have enjoyed many of your posts.

The term "magical thinking" seems to be a moving target for me in
relationship with communicating with others.  I know where I draw this
line, but I think everyone has there own line to draw here.  It seems
more useful as a concept of self discovery, but in the context of
communicating with people with different beliefs it seems harsh.  This
is coming from a guy who has used this term often and freely in the
past!  I don't think it works as well in a group like this where
people are thoughtfully choosing this line for themselves.  What do
you think?





--- In FairfieldLife@yahoogroups.com, new_morning_blank_slate
<[EMAIL PROTECTED]> wrote:
>
> No, I have not read that one. It looks good. 
> 
> I think cognitive biases and logical fallacies are the cornerstones
> of "magical thinking". (I appreciate your recent cites and posts on
> such.) And magical interpretations -- whether of experiences,
> "scriptures" or current events.
> 
> Magical thinking (MT) takes one to the opposite corner of What Is. MT
> may bring some feel-good comfort to the soul, and be the fuel for
> dreamers, but ultimately it's illusion and delusion. 
> 
> In my reading / interpretation (we all make interpretations) of
> various Hindu-related scriptures, a sharp intellect and the ability
> to finely discriminate are cited as valuable tools in uncovering what
> is real and what is unreal. Discrimination of what is Real and
> Unreal. Discrimination between Buddhi and Purusha and all. Knowing
> the existence and structure of cognitive biases and logical
> fallacies, and being able to readily identify them and avoid them,
> are part of that sharpening process.
>  
> 
> --- In FairfieldLife@yahoogroups.com, "curtisdeltablues"
> <curtisdeltablues@> wrote:
> >
> > Excellent post.  Are you hip to Gilovich's book How We Know What
> > Isn't So: The Fallibility of Human Reason in Everyday Life? He
> > studies human cognitive error at Cornell.
> > 
> > http://www.amazon.com/gp/product/0029117062/sr=8-1/qid=1149893839/ref=pd_bbs_1/102-4458199-6191348?%5Fencoding=UTF8
> > 
> > 
> > --- In FairfieldLife@yahoogroups.com, new_morning_blank_slate
> > <no_reply@> wrote:
> > >
> > > We all make them. To the extent that we are aware of their
> > > existence and structure, we can avoid them in our own internal
> > > reasoning, and in communications. 
> > > 
> > > Whoever has more than 20 in any post gets a gallon of woowoo juice.
> > > 
> > > 
> > > http://en.wikipedia.org/wiki/List_of_cognitive_biases
> > > 
> > > Cognitive bias is distortion in the way we perceive reality (see
> > > also cognitive distortion).
> > > 
> > > Some of these have been verified empirically in the field of
> > > psychology; others are considered general categories of bias.
> > > 
> > > 
> > > 
> > > 
> > > Decision making and behavioral biases
> > > 
> > > Many of these biases are studied for how they affect belief
> > > formation, business decisions, and scientific research.
> > > 
> > >     * Bandwagon effect - the tendency to do (or believe) things
> > > because many other people do (or believe) the same.
> > >     * Bias blind spot - the tendency not to compensate for one's
> > > own cognitive biases.
> > >     * Choice-supportive bias - the tendency to remember one's
> > > choices as better than they actually were.
> > >     * Confirmation bias - the tendency to search for or interpret
> > > information in a way that confirms one's preconceptions.
> > >     * Congruence bias - the tendency to test hypotheses exclusively
> > > through direct testing.
> > >     * Contrast effect - the enhancement or diminishment of a weight
> > > or other measurement when compared with a recently observed
> > > contrasting object.
> > >     * Disconfirmation bias - the tendency for people to extend
> > > critical scrutiny to information which contradicts their prior
> > > beliefs and accept uncritically information that is congruent with
> > > their prior beliefs.
> > >     * Endowment effect - the tendency for people to value something
> > > more as soon as they own it.
> > >     * Focusing effect - prediction bias occurring when people place
> > > too much importance on one aspect of an event; causes error in
> > > accurately predicting the utility of a future outcome.
> > >     * Hyperbolic discounting - the tendency for people to have a
> > > stronger preference for more immediate payoffs relative to later
> > > payoffs, the closer to the present both payoffs are.
> > >     * Illusion of control - the tendency for human beings to
> > > believe they can control or at least influence outcomes which they
> > > clearly cannot.
> > >     * Impact bias - the tendency for people to overestimate the
> > > length or the intensity of the impact of future feeling states.
> > >     * Information bias - the tendency to seek information even when
> > > it cannot affect action.
> > >     * Loss aversion - the tendency for people to strongly prefer
> > > avoiding losses over acquiring gains (see also sunk cost effects).
> > >     * Neglect of Probability - the tendency to completely disregard
> > > probability when making a decision under uncertainty.
> > >     * Mere exposure effect - the tendency for people to express
> > > undue liking for things merely because they are familiar with them.
> > >     * Color psychology - the tendency for cultural symbolism of
> > > certain colors to affect affective reasoning.
> > >     * Omission Bias - the tendency to judge harmful actions as
> > > worse, or less moral, than equally harmful omissions (inactions).
> > >     * Outcome Bias - the tendency to judge a decision by its
> > > eventual outcome instead of by the quality of the decision at the
> > > time it was made.
> > >     * Planning fallacy - the tendency to underestimate
> > > task-completion times.
> > >     * Post-purchase rationalization - the tendency to persuade
> > > oneself through rational argument that a purchase was good value.
> > >     * Pseudocertainty effect - the tendency to make risk-averse
> > > choices if the expected outcome is positive, but risk-seeking
> > > choices to avoid negative outcomes.
> > >     * Rosy retrospection - the tendency to rate past events more
> > > positively than one actually rated them when they occurred.
> > >     * Selective perception - the tendency for expectations to
> > > affect perception.
> > >     * Status quo bias - the tendency for people to like things to
> > > stay relatively the same.
> > >     * Von Restorff effect - the tendency for an item that "stands
> > > out like a sore thumb" to be more likely to be remembered than
> > > other items.
> > >     * Zeigarnik effect - the tendency for people to remember
> > > uncompleted or interrupted tasks better than completed ones.
> > >     * Zero-risk bias - preference for reducing a small risk to zero
> > > over a greater reduction in a larger risk.
> > > 
> > > 
> > > Biases in probability and belief
> > > 
> > > Many of these biases are often studied for how they affect business
> > > and economic decisions and how they affect experimental research.
> > > 
> > >      * Affective forecasting 
> > > Affective forecasting is the forecasting of one's affect (emotional
> > > state) in the future. This kind of prediction is affected by various
> > > kinds of cognitive biases, i.e. systematic errors of thought. Daniel
> > > Gilbert of the department of social psychology at Harvard University
> > > and other researchers in the field, such as Timothy Wilson of the
> > > University of Virginia and George Loewenstein of Carnegie Mellon
> > > University, have studied those cognitive biases and given them names
> > > like "empathy gap" and "impact bias" and the like.
> > > 
> > > Affective forecasting is an important concept in psychology, because
> > > psychologists try to study what situations in life are important to
> > > humans, and how they change their views with time.
> > > 
> > > 
> > >     * Ambiguity effect - the avoidance of options for which missing
> > > information makes the probability seem "unknown"
> > > 
> > > The ambiguity effect is a cognitive bias where decision-making is
> > > affected due to a lack of information, or an "ambiguity."
> > > 
> > > For example, picture an urn with 90 balls inside of it. The balls
> > > are colored red, black and yellow. 30 of the balls are red, and the
> > > other 60 are some combination of black and yellow balls, with all
> > > combinations being equally likely. In option X, drawing a red ball
> > > would earn you the $100, and in option Y, drawing a black ball
> > > would earn you the $100. The difference between the two options is
> > > that the number of red balls is certain for option X, but the
> > > number of black balls for option Y is uncertain.
> > > 
> > > Which option gives you the best chance at picking out a winning
> > > ball? The truth is that the probability of picking a winning ball
> > > is identical for both options X and Y. In option X, where the
> > > number of red balls is certain, the probability of selecting a
> > > winning ball is 1/3 (30 red balls out of 90 total balls). In option
> > > Y, despite the fact that the number of black balls is not certain,
> > > the probability of selecting a winning ball is also 1/3. This is
> > > because the number of black balls is some amount between 0 and 60,
> > > with every value equally likely, so the distribution is symmetric
> > > around 30 and the expected number of black balls is 30. Because of
> > > this, according to what is known as expected-utility theory, one
> > > should be indifferent between the two options. As a result, the
> > > chances of winning the $100 are the same for both options.
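> > > 
> > > A minimal Python sketch of that arithmetic (illustrative only, not
> > > from the article; it just averages over the 61 equally likely
> > > compositions described above):
> > > 
> > >     # 90 balls: 30 red; b black and 60 - b yellow, b = 0..60,
> > >     # each of the 61 compositions equally likely.
> > >     TOTAL, RED = 90, 30
> > > 
> > >     p_x = RED / TOTAL                             # option X
> > >     p_y = sum(b / TOTAL for b in range(61)) / 61  # option Y
> > > 
> > >     print(p_x, p_y)  # 0.333... 0.333... -- identical odds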
> > > 
> > > People are much more likely to want to select a ball under option
> > > X, where the probability of selecting a winning ball is, in their
> > > minds, more certain. The question as to the number of black balls
> > > under scenario Y turns people off to that option. Despite the fact
> > > that there could be twice as many black balls as red balls, people
> > > tend not to want to take the opposing risk that there may be fewer
> > > than 30 black balls. The "ambiguity" behind option Y makes people
> > > want to select option X, even when the two are theoretically
> > > equivalent.
> > > 
> > > This bias was discovered by Daniel Ellsberg in 1961. Ellsberg deemed
> > > these situations where the "probability is unknown" as "ambiguous,"
> > > hence the "ambiguity effect."
> > > 
> > > One explanation of the effect is that people follow a heuristic, a
> > > rule of thumb, of avoiding options about which information is
> > > missing (Frisch & Baron, 1988; Ritov & Baron, 1990). This is
> > > usually a good rule because it leads us to look for the
> > > information. In many cases, though, the information cannot be
> > > obtained. Information is almost always missing, and the effect is
> > > often the result of calling some particular missing piece to our
> > > attention.
> > > 
> > >     * Anchoring - the tendency to rely too heavily, or "anchor,"
> > > on one trait or piece of information when making decisions.
> > > 
> > >     * Anthropic bias - the tendency for one's evidence to be biased
> > > by observation selection effects.
> > >     * Attentional bias - neglect of relevant data when making
> > > judgments of a correlation or association.
> > >     * Availability error - the distortion of one's perceptions of
> > > reality, due to the tendency to remember one alternative outcome of
> > > a situation much more easily than another.
> > >     * Belief bias - the tendency to base assessments on personal
> > > beliefs (see also belief perseverance and experimenter's regress).
> > >     * Belief Overkill - the tendency to bring beliefs and values
> > > together so that they all point to the same conclusion.
> > >     * Clustering illusion - the tendency to see patterns where
> > > actually none exist.
> > >     * Conjunction fallacy - the tendency to assume that specific
> > > conditions are more probable than general ones.
> > >     * Gambler's fallacy - the tendency to assume that individual
> > > random events are influenced by previous random events ("the coin
> > > has a memory").
> > >     * Hindsight bias - sometimes called the "I-knew-it-all-along"
> > > effect, the inclination to see past events as being predictable.
> > >     * Illusory correlation - beliefs that inaccurately suppose a
> > > relationship between a certain type of action and an effect.
> > >     * Myside bias - the tendency for people to fail to look for, or
> > > to ignore, evidence against what they already favor.
> > >     * Neglect of prior base rates effect - the tendency to fail to
> > > incorporate prior known probabilities which are pertinent to the
> > > decision at hand.
> > >     * Observer-expectancy effect - when a researcher expects a
> > > given result and therefore unconsciously manipulates an experiment
> > > or misinterprets data in order to find it (see also
> > > subject-expectancy effect).
> > >     * Overconfidence effect - the tendency to overestimate one's
> > > own abilities.
> > >     * Polarization effect - increase in strength of belief on both
> > > sides of an issue after presentation of neutral or mixed evidence,
> > > resulting from biased assimilation of the evidence.
> > >     * Positive outcome bias (prediction) - a tendency in prediction
> > > to overestimate the probability of good things happening to oneself
> > > (see also wishful thinking and valence effect).
> > >     * Recency effect - the tendency to weigh recent events more
> > > than earlier events (see also peak-end rule).
> > >     * Primacy effect - the tendency to weigh initial events more
> > > than subsequent events.
> > >     * Subadditivity effect - the tendency to judge the probability
> > > of the whole to be less than the probabilities of the parts.
> > > 
> > > 
> > > 
> > > Social biases
> > > 
> > > Most of these biases are labeled as attributional biases.
> > > 
> > >     * Barnum effect (or Forer effect) - the tendency to give high
> > > accuracy ratings to descriptions of one's personality that
> > > supposedly are tailored specifically for oneself, but are in fact
> > > vague and general enough to apply to a wide range of people.
> > >     * Egocentric bias - occurs when people claim more
> > > responsibility for themselves for the results of a joint action
> > > than an outside observer would.
> > >     * False consensus effect - the tendency for people to
> > > overestimate the degree to which others agree with them.
> > >     * Fundamental attribution error - the tendency for people to
> > > over-emphasize personality-based explanations for behaviors
> > > observed in others while under-emphasizing the role and power of
> > > situational influences on the same behavior (see also group
> > > attribution error, positivity effect, and negativity effect).
> > >     * Halo effect - the tendency for a person's positive or
> > > negative traits to "spill over" from one area of their personality
> > > to another in others' perceptions of them (see also physical
> > > attractiveness stereotype).
> > >     * Illusion of asymmetric insight - people perceive their
> > > knowledge of their peers to surpass their peers' knowledge of them.
> > >     * Ingroup bias - preferential treatment people give to those
> > > they perceive to be members of their own groups.
> > >     * Just-world phenomenon - the tendency for people to believe
> > > the world is "just" and therefore people "get what they deserve."
> > >     * Lake Wobegon effect - the human tendency to report flattering
> > > beliefs about oneself and believe that one is above average (see
> > > also worse-than-average effect and overconfidence effect).
> > >     * Notational bias - a form of cultural bias in which a notation
> > > induces the appearance of a nonexistent natural law.
> > >     * Outgroup homogeneity bias - individuals see members of their
> > > own group as being relatively more varied than members of other
> > > groups.
> > >     * Projection bias - the tendency to unconsciously assume that
> > > others share the same or similar thoughts, beliefs, values, or
> > > positions.
> > >     * Self-serving bias - the tendency to claim more responsibility
> > > for successes than failures. It may also manifest itself as a
> > > tendency for people to evaluate ambiguous information in a way
> > > beneficial to their interests (see also group-serving bias).
> > >     * Trait ascription bias - the tendency for people to view
> > > themselves as relatively variable in terms of personality, behavior
> > > and mood while viewing others as much more predictable.
> > >     * Self-fulfilling prophecy - the tendency to engage in
> > > behaviors that elicit results which will (consciously or
> > > subconsciously) confirm our beliefs.
> > > 
> > > ==========
> > > 
> > > Other Cognitive Biases
> > > 
> > > http://en.wikipedia.org/wiki/Category:Cognitive_biases
> > > (Some duplicates with above)
> > > 
> > >     * Adaptive Bias
> > > Adaptive Bias is the idea that the human brain has evolved to reason
> > > adaptively, rather than truthfully or even rationally, and that
> > > Cognitive bias may have evolved as a mechanism to reduce the overall
> > > cost of cognitive errors as opposed to merely reducing the number of
> > > cognitive errors, when faced with making a decision under conditions
> > > of uncertainty.
> > > 
> > > When making decisions under conditions of uncertainty, two kinds of
> > > errors need to be taken into account - "false positives", i.e.
> > > deciding that a risk or benefit exists when it does not, and "false
> > > negatives", i.e. failing to notice a risk or benefit that exists.
> > > False positives are also commonly called "Type 1 errors", and false
> > > negatives are called "Type 2 errors".
> > > 
> > > Where the cost or impact of a type 1 error is much greater than
> > > the cost of a type 2 error (e.g. wrongly concluding that the water
> > > is safe to drink), it can be worthwhile to bias the decision-making
> > > system towards making fewer type 1 errors, i.e. making it less
> > > likely to conclude that a particular situation exists. This by
> > > definition would also increase the number of type 2 errors.
> > > Conversely, where a false positive is much less costly than a false
> > > negative (blood tests, smoke detectors), it makes sense to bias the
> > > system towards maximising the probability that a particular (very
> > > costly) situation will be recognised, even if this often leads to
> > > the (relatively un-costly) event of noticing something that is not
> > > actually there.
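> > > 
> > > A toy Python sketch of that cost asymmetry (every number here is
> > > invented, purely for illustration -- think of a smoke detector):
> > > 
> > >     P_EVENT = 0.01       # how often a real fire occurs
> > >     COST_MISS = 1000.0   # false negative: a missed fire
> > >     COST_ALARM = 1.0     # false positive: a needless alarm
> > > 
> > >     def expected_cost(p_detect, p_false_alarm):
> > >         # expected cost per trial at the given error rates
> > >         return (P_EVENT * (1 - p_detect) * COST_MISS
> > >                 + (1 - P_EVENT) * p_false_alarm * COST_ALARM)
> > > 
> > >     # an "accurate" detector vs. one biased toward false alarms
> > >     print(expected_cost(0.90, 0.01))  # ~1.01
> > >     print(expected_cost(0.99, 0.20))  # ~0.30 -- jumpier is cheaper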
> > > 
> > > Martie G. Haselton and David M. Buss (2003) state that cognitive
> > > bias can be expected to have developed in humans for cognitive
> > > tasks where:
> > > 
> > >     * Decision making is complicated by a significant
> > > signal-detection problem (i.e. when there is uncertainty)
> > >     * The solution to the particular kind of decision making
> > > problem has had a recurrent effect on survival and fitness
> > > throughout evolutionary history
> > >     * The cost of a "false positive" or "false negative" error
> > > dramatically outweighs the cost of the alternative type of error
> > > 
> > > 
> > >     * Affective forecasting (see above)
> > >     * Anchor (NLP)
> > >     * Anthropic bias
> > >     * Apophenia
> > >     * Appeal to pity
> > >     * Attributional bias
> > >     * Availability error
> > >     * Availability heuristic
> > > 
> > > B
> > > 
> > >     * Base rate fallacy
> > >     * Belief Overkill
> > >     * Bias blind spot
> > > 
> > > C
> > > 
> > >     * Choice blindness
> > >     * Choice-supportive bias
> > >     * Clustering illusion
> > >     * Confirmation bias
> > >     * Conjunction fallacy
> > >     * Contrast effect
> > >     * Cultural bias
> > > 
> > > D
> > > 
> > >     * Data dredging
> > >     * Disconfirmation bias
> > > 
> > > E
> > > 
> > >     * Egocentric bias
> > >     * Empathy gap
> > >     * Endowment effect
> > >     * Errors in Syllogisms
> > >     * Exposure effect
> > > 
> > > F
> > > 
> > >     * False consensus effect
> > >     * Forer effect
> > >     * Fundamental attribution error
> > > 
> > > G
> > > 
> > >     * Gambler's fallacy
> > >     * Group attribution error
> > >     * Group-serving bias
> > >     * Groupthink
> > > 
> > > H
> > > 
> > >     * Halo effect
> > >     * Hindsight bias
> > >     * Hostile media effect
> > >     * Hyperbolic discounting
> > > 
> > > I
> > > 
> > >     * Illusion of control
> > >     * Impact bias
> > >     * Ingroup bias
> > > 
> > > J
> > > 
> > >     * Just-world phenomenon
> > > 
> > > K
> > > 
> > >     * Kuleshov Effect
> > > 
> > > L
> > > 
> > >     * Lake Wobegon effect
> > >     * Loss aversion
> > > 
> > > M
> > > 
> > >     * Memory bias
> > >     * Mindset
> > >     * Misinformation effect
> > > 
> > > N
> > > 
> > >     * Negativity effect
> > >     * Neglect of Probability
> > >     * Notational bias
> > > 
> > > O
> > > 
> > >     * Observer-expectancy effect
> > >     * Omission Bias
> > >     * Outgroup homogeneity bias
> > >     * Overconfidence effect
> > > 
> > > P
> > > 
> > >     * Pareidolia
> > >     * Peak-end rule
> > >     * Physical attractiveness stereotype
> > >     * Picture superiority effect
> > >     * Planning fallacy
> > >     * Pollyanna principle
> > >     * Positivity effect
> > >     * Primacy effect
> > >     * Publication bias
> > > 
> > > R
> > > 
> > >     * Recall bias
> > >     * Recency effect
> > >     * Regression fallacy
> > >     * Response bias
> > >     * Rosy retrospection
> > > 
> > > S
> > > 
> > >     * Selective perception
> > >     * Self-deception
> > >     * Self-serving bias
> > >     * Serial position effect
> > >     * Spacing effect
> > >     * Status quo bias
> > >     * Subject-expectancy effect
> > >     * Sunk cost
> > >     * Superstition
> > >     * Suspension of judgment
> > > 
> > > T
> > > 
> > >     * Trait ascription bias
> > > 
> > > V
> > > 
> > >     * Valence effect
> > >     * Von Restorff effect
> > > 
> > > W
> > > 
> > >     * Wishful thinking
> > >     * Worse-than-average effect
> > > 
> > > Z
> > > 
> > >     * Zeigarnik effect
> > >     * Zero-risk bias
> > > 
> > > 
> > > Memory biases may either enhance or impair the recall of memory, or
> > > they may alter the content of what we report remembering.
> > > 
> > > List of memory biases
> > > 
> > >     * Choice-supportive bias - states that chosen options are
> > > remembered as better than rejected options (Mather, Shafir &
> > > Johnson, 2000).
> > >     * Classroom effect - states that some portion of student
> > > performance is explained by the classroom environment and teacher
> > > as opposed to purely individual factors.
> > >     * Context effect - states that cognition and memory are
> > > dependent on context, such that out-of-context memories are more
> > > difficult to retrieve than in-context memories (i.e., recall time
> > > and accuracy for a work-related memory will be lower at home, and
> > > vice versa).
> > >     * Hindsight bias - sometimes called the "I-knew-it-all-along"
> > > effect, is the inclination to see past events as being predictable.
> > >     * Humor effect - states that humorous items are more easily
> > > remembered than non-humorous ones, which might be explained by the
> > > distinctiveness of humor, the increased cognitive processing time
> > > to understand the humor, or the emotional arousal caused by the
> > > humor.
> > >     * Infantile amnesia - states that few memories are retained
> > > from before age 2.
> > >     * Generation effect - states that self-generated information is
> > > remembered best.
> > >     * Lag effect
> > >     * Levels-of-processing effect - states that different methods
> > > of encoding information into memory have different levels of
> > > effectiveness (Craik & Lockhart, 1972).
> > >     * List-length effect
> > >     * Mere exposure effect - states that familiarity increases
> > > liking.
> > >     * Misinformation effect - states that misinformation affects
> > > people's reports of their own memory.
> > >     * Modality effect - states that memory recall is higher for the
> > > last items of a list when the list items were received auditorily
> > > versus visually.
> > >     * Mood congruent memory bias - states that information
> > > congruent with one's current mood is remembered best.
> > >     * Next-in-line effect
> > >     * Part-list cueing effect - states that being shown some items
> > > from a list makes it harder to retrieve the other items.
> > >     * Picture superiority effect - states that concepts are much
> > > more likely to be remembered experimentally if they are presented
> > > as pictures rather than as words.
> > >     * Positivity effect - states that older adults favor positive
> > > over negative information in their memories.
> > >     * Processing difficulty effect - see Levels-of-processing
> > > effect.
> > >     * Primacy effect - states that the first items on a list show
> > > an advantage in memory.
> > >     * Recency effect - states that the last items on a list show an
> > > advantage in memory.
> > >     * Rosy retrospection - states that the past is remembered as
> > > better than it really was.
> > >     * Serial position effect - states that items at the beginning
> > > of a list are the easiest to recall, followed by the items near the
> > > end of a list; items in the middle are the least likely to be
> > > remembered.
> > >     * Self-generation effect - states that people are better able
> > > to recall memories of statements that they have generated than
> > > similar statements generated by others.
> > >     * Self-relevance effect - states that memories considered
> > > self-relevant are better recalled than other, similar information.
> > >     * Spacing effect - states that while you are more likely to
> > > remember material if exposed to it many times, you will be much
> > > more likely to remember it if the exposures are repeated over a
> > > longer span of time.
> > >     * Suffix effect - states that there is considerable impairment
> > > of the recency effect if a redundant suffix item, which the subject
> > > is not required to recall, is added to a list (Morton, Crowder &
> > > Prussin, 1972).
> > >     * Testing effect - states that frequent testing of material
> > > that has been committed to memory improves memory recall more than
> > > simply studying the material without testing.
> > >     * Time-of-day effect
> > >     * Verbatim effect - states that the "gist" of what someone has
> > > said is better remembered than the verbatim wording (Poppenk,
> > > Walia, Joanisse, Danckert, & Köhler, 2006).
> > >     * Von Restorff effect - states that an item that "stands out
> > > like a sore thumb" is more likely to be remembered than other items
> > > (von Restorff, 1933).
> > >     * Zeigarnik effect - states that people remember uncompleted or
> > > interrupted tasks better than completed ones.
> > > 
> > > 
> > > Recall bias
> > > 
> > > Taken generally, recall bias is a type of statistical bias which
> > > occurs when the way a survey respondent answers a question is
> > > affected not just by the correct answer, but also by the
> > > respondent's memory. [1] [2] This can affect the results of the
> > > survey. As a hypothetical example, suppose that a survey in 2005
> > > asked respondents whether they believed that O. J. Simpson had
> > > killed his wife. Respondents who believed him innocent might be
> > > more likely to have forgotten about the case, and therefore to
> > > state no opinion, than respondents who thought him guilty. If this
> > > is the case, then the survey would find a higher-than-accurate
> > > proportion of people who believed that O. J. did kill his wife.
> > > 
> > > Relatedly but distinctly, the term might also be used to describe
> > > an instance where a survey respondent intentionally responds
> > > incorrectly to a question about their personal history, which
> > > results in response bias. As a hypothetical example, suppose that a
> > > researcher conducts a survey among women of group A, asking whether
> > > they have had an abortion, and the same survey among women of
> > > group B.
> > > 
> > > If the results are different between the two groups, it might be
> > > that women of one group are less likely to have had an abortion, or
> > > it might simply be that women of one group who have had abortions
> > > are less likely to admit to it. If the latter is the case, then
> > > this would skew the survey results; this is a kind of response
> > > bias. (It is also possible that both are the case: women of one
> > > group are less likely to have had abortions, and women of one group
> > > who have had abortions are less likely to admit to it. This would
> > > still affect the survey statistics.)
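> > > 
> > > A two-line numeric sketch of how underreporting alone skews the
> > > result (all rates invented): if both groups truly have a 10% rate
> > > but group B admits it only half the time, the survey sees 10%
> > > versus 5%.
> > > 
> > >     true_a, true_b = 0.10, 0.10    # identical true rates
> > >     admit_a, admit_b = 1.00, 0.50  # group B underreports
> > >     print(true_a * admit_a, true_b * admit_b)  # 0.1 0.05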
> > > 
> > > ====
> > > 
> > > 
> > > Logical Fallacies
> > > 
> > > Aristotelian fallacies
> > > 
> > > Material fallacies
> > > 
> > > The classification of material fallacies widely adopted by modern
> > > logicians and based on that of Aristotle, Organon (Sophistici
> > > elenchi), is as follows:
> > > 
> > >     * Fallacy of Accident (also called destroying the exception or a
> > > dicto simpliciter ad dictum secundum quid) meaning to argue
> > > erroneously from a general rule to a particular case, without proper
> > > regard to particular conditions that vitiate the application of the
> > > general rule; e.g. if manhood suffrage be the law, arguing that a
> > > criminal or a lunatic must, therefore, have a vote.
> > > 
> > >     * Converse Fallacy of Accident (also called reverse accident,
> > > destroying the exception, or a dicto secundum quid ad dictum
> > > simpliciter) meaning to argue from a special case to a general rule.
> > > 
> > >     * Irrelevant Conclusion (also called Ignoratio Elenchi),
> > > wherein, instead of proving the fact in dispute, the arguer seeks
> > > to gain his point by diverting attention to some extraneous fact
> > > (as in the legal story of "No case. Abuse the plaintiff's
> > > attorney"). These fallacies are common in platform oratory, in
> > > which the speaker obscures the real issue by appealing to his
> > > audience on the grounds of
> > >           o purely personal considerations (argumentum ad hominem)
> > >           o popular sentiment (argumentum ad populum, appeal to the
> > > majority)
> > >           o fear (argumentum ad baculum)
> > >           o conventional propriety (argumentum ad verecundiam)
> > > 
> > >     This fallacy has been illustrated by ethical or theological
> > > arguments wherein the fear of punishment is subtly substituted for
> > > abstract right as the sanction of moral obligation.
> > > 
> > >     * Begging the question (also called Petitio Principii or
> > > Circulus in Probando--arguing in a circle) consists in
> > > demonstrating a conclusion by means of premises that pre-suppose
> > > that conclusion. Jeremy Bentham points out that this fallacy may
> > > lurk in a single word, especially in an epithet, e.g. if a measure
> > > were condemned simply on the ground that it is alleged to be
> > > "un-English".
> > > 
> > >     * Fallacy of the Consequent, really a species of Irrelevant
> > > Conclusion, wherein a conclusion is drawn from premises that do not
> > > really support it.
> > > 
> > >     * Fallacy of False Cause, or Non Sequitur (L., it does not
> > > follow), wherein one thing is incorrectly assumed as the cause of
> > > another, as when the ancients attributed a public calamity to a
> > > meteorological phenomenon (a special case of this fallacy also goes
> > > by the Latin term post hoc ergo propter hoc: the fallacy of
> > > believing that temporal succession implies a causal relation).
> > > 
> > >     * Fallacy of Many Questions (Plurium Interrogationum), wherein
> > > several questions are improperly grouped in the form of one, and a
> > > direct categorical answer is demanded, e.g. if a prosecuting
> > > counsel asked the prisoner "What time was it when you met this
> > > man?" with the intention of eliciting the tacit admission that such
> > > a meeting had taken place. Another example is the classic line, "Is
> > > it true that you no longer beat your wife?"
> > > 
> > > 
> > > Verbal fallacies
> > > 
> > > Verbal fallacies are those in which a false conclusion is obtained
> > > by improper or ambiguous use of words. They are generally
> > > classified as follows.
> > > 
> > >     * Equivocation consists in employing the same word in two or
> > > more senses, e.g. in a syllogism, the middle term being used in one
> > > sense in the major and another in the minor premise, so that in
> > > fact there are four not three terms ("All fair things are
> > > honourable; This woman is fair; therefore this woman is
> > > honourable," the second "fair" being in reference to complexion).
> > >     * Amphibology is the result of ambiguity of grammatical
> > > structure, e.g. of the position of the adverb "only" in careless
> > > writers ("He only said that," in which sentence, as experience
> > > shows, the adverb has been intended to qualify any one of the other
> > > three words).
> > >     * Fallacy of Composition is a species of Amphibology that
> > > results from the confused use of collective terms, e.g. "The angles
> > > of a triangle are less than two right angles" might refer to the
> > > angles separately or added together.
> > >     * Division, the converse of the preceding, which consists in
> > > employing the middle term distributively in the minor and
> > > collectively in the major premise.
> > >     * Accent, which occurs only in speaking and consists of
> > > emphasizing the wrong word in a sentence, e.g. "He is a fairly good
> > > pianist," according to the emphasis on the words, may imply praise
> > > of a beginner's progress, or an expert's depreciation of a popular
> > > hero, or it may imply that the person in question is a deplorable
> > > violinist.
> > >     * Figure of Speech, the confusion between the metaphorical and
> > > ordinary uses of a word or phrase.
> > > 
> > > Logical Fallacies
> > > 
> > > http://en.wikipedia.org/wiki/Fallacy
> > > 
> > > The standard Aristotelian logical fallacies are:
> > > 
> > >     * Fallacy of Four Terms (Quaternio terminorum)
> > >     * Fallacy of Undistributed Middle
> > >     * Fallacy of Illicit Process of the major or minor term
> > >     * Fallacy of Negative Premises
> > > 
> > > 
> > > Other systems of classification
> > > 
> > > Of other classifications of fallacies in general the most famous
> > > are those of Francis Bacon and J. S. Mill. Bacon (Novum Organum,
> > > Aph. 33, 38 sqq.) divided fallacies into four Idola (Idols, i.e.
> > > False Appearances), which summarize the various kinds of mistakes
> > > to which the human intellect is prone. With these should be
> > > compared the Offendicula of Roger Bacon, contained in the Opus
> > > maius, pt. i. J. S. Mill discussed the subject in book v. of his
> > > Logic, and Jeremy Bentham's Book of Fallacies (1824) contains
> > > valuable remarks. See Richard Whately's Logic, bk. v.; A. de
> > > Morgan, Formal Logic (1847); A. Sidgwick, Fallacies (1883) and
> > > other textbooks.
> > > 
> > > Fallacies in the media and politics
> > > 
> > > Fallacies are used frequently by pundits in the media and politics.
> > > When one politician says to another, "You don't have the moral
> > > authority to say X," this could be an example of the argumentum ad
> > > hominem or personal attack fallacy; that is, attempting to disprove
> > > X, not by addressing the validity of X but by attacking the person
> > > who asserted X. Arguably, the politician is not even attempting to
> > > make an argument against X, but is instead offering a moral rebuke
> > > against the interlocutor. For instance, if X is the assertion:
> > > 
> > >     The military uniform is a symbol of national strength and honor.
> > > 
> > > Then ostensibly, the politician is not trying to prove the contrary
> > > assertion. If this is the case, then there is no logically
> > > fallacious argument, but merely a personal opinion about moral
> > > worth. Thus identifying logical fallacies may be difficult and
> > > dependent upon context.
> > > 
> > > In the opposite direction is the fallacy of argument from
> > > authority. A classic example is the ipse dixit ("He himself said
> > > it") argument used throughout the Middle Ages in reference to
> > > Aristotle. A modern instance is "celebrity spokespersons" in
> > > advertisements: a product is good and you should buy/use/support it
> > > because your favorite celebrity endorses it.
> > > 
> > > An appeal to authority is not always a logical fallacy; it can be
> > > an appropriate form of rational argument if, for example, it is an
> > > appeal to expert testimony. In this case, the expert witness must
> > > be recognized as such and all parties must agree that the testimony
> > > is appropriate to the circumstances. This form of argument is
> > > common in legal situations.
> > > 
> > > By definition, arguments with logical fallacies are invalid, but
> > > they can often be (re)written in such a way that they fit a valid
> > > argument form. The challenge to the interlocutor is, of course, to
> > > discover the false premise, i.e. the premise that makes the
> > > argument unsound.
> > > 
> > > General list of fallacies
> > > 
> > > The entries in the following list are neither exhaustive nor
> > > mutually exclusive; that is, several distinct entries may refer to
> > > the same pattern. As noted in the introduction, these fallacies
> > > describe erroneous or at least suspect patterns of argument in
> > > general, not necessarily argument based on formal logic. Many of
> > > the fallacies listed are traditionally recognized and discussed in
> > > works on critical thinking; others are more specialized.
> > > 
> > >     * Ad hominem (also called argumentum ad hominem or personal
> > > attack) including:
> > >           o ad hominem abusive (also called argumentum ad personam)
> > >           o ad hominem circumstantial (also called ad hominem
> > > circumstantiae)
> > >           o ad hominem tu quoque (also called you-too argument)
> > >     * Amphibology (also called amphiboly)
> > >     * Appeal to authority (also called argumentum ad verecundiam or
> > > argument by authority)
> > >     * Appeal to emotion including:
> > >           o Appeal to consequences (also called argumentum ad
> > > consequentiam)
> > >           o Appeal to fear (also called argumentum ad metum or
> > > argumentum in terrorem)
> > >           o Appeal to flattery
> > >           o Appeal to pity (also called argumentum ad misericordiam)
> > >           o Appeal to ridicule
> > >           o Appeal to spite (also called argumentum ad odium)
> > >           o Two wrongs make a right
> > >           o Wishful thinking
> > >     * Appeal to the majority (also called Appeal to belief,
> > > Argumentum ad numerum, Appeal to popularity, Appeal to the people,
> > > Bandwagon fallacy, Argumentum ad populum, Authority of the many,
> > > Consensus gentium, Argument by consensus)
> > >     * Appeal to motive
> > >     * Appeal to novelty (also called argumentum ad novitatem)
> > >     * Appeal to probability
> > >     * Appeal to tradition (also called argumentum ad antiquitatem or
> > > appeal to common practice)
> > >     * Argument from fallacy (also called argumentum ad logicam)
> > >     * Argument from ignorance (also called argumentum ad ignorantiam
> > > or argument by lack of imagination)
> > >     * Argument from silence (also called argumentum ex silentio)
> > >     * Appeal to force (also called argumentum ad baculum)
> > >     * Appeal to wealth (also called argumentum ad crumenam)
> > >     * Appeal to poverty (also called argumentum ad lazarum)
> > >     * Argument from repetition (also called argumentum ad nauseam)
> > >     * Base rate fallacy
> > >     * Begging the question (also called petitio principii, circular
> > > argument or circular reasoning)
> > >     * Conjunction fallacy
> > >     * Continuum fallacy (also called fallacy of the beard)
> > >     * Correlative based fallacies including:
> > >           o Fallacy of many questions (also called complex
> > > question, fallacy of presupposition, loaded question or plurium
> > > interrogationum)
> > >           o False dilemma (also called false dichotomy or
> > > bifurcation)
> > >           o Denying the correlative
> > >           o Suppressed correlative
> > >     * Definist fallacy
> > >     * Dicto simpliciter, including:
> > >           o Accident (also called a dicto simpliciter ad dictum
> > > secundum quid)
> > >           o Converse accident (also called a dicto secundum quid ad
> > > dictum simpliciter)
> > >     * Equivocation
> > >     * Engineering Fallacy
> > >     * Fallacies of distribution:
> > >           o Composition
> > >           o Division
> > >           o Ecological fallacy
> > >     * Fallacies of Presumption
> > >     * False analogy
> > >     * False premise
> > >     * False compromise
> > >     * Faulty generalization including:
> > >           o Biased sample
> > >           o Hasty generalization (also called fallacy of
> > > insufficient statistics, fallacy of insufficient sample, fallacy of
> > > the lonely fact, leaping to a conclusion, hasty induction, secundum
> > > quid)
> > >           o Overwhelming exception
> > >           o Statistical special pleading
> > >     * Gambler's fallacy/Inverse gambler's fallacy
> > >     * Genetic fallacy
> > >     * Guilt by association
> > >     * Historian's fallacy
> > >     * Homunculus fallacy
> > >     * If-by-whiskey (argues both sides)
> > >     * Ignoratio elenchi (also called irrelevant conclusion)
> > >     * Inappropriate interpretations or applications of statistics
> > > including:
> > >           o Biased sample
> > >           o Correlation implies causation
> > >           o Gambler's fallacy
> > >           o Prosecutor's fallacy
> > >           o Screening test fallacy
> > >     * Incomplete comparison
> > >     * Inconsistent comparison
> > >     * Invalid proof
> > >     * Judgemental language
> > >     * Juxtaposition
> > >     * Lump of labour fallacy (also called the fallacy of labour
> > > scarcity)
> > >     * Meaningless statement
> > >     * Middle ground (also called argumentum ad temperantiam)
> > >     * Misleading vividness
> > >     * Naturalistic fallacy
> > >     * Negative proof
> > >     * Non sequitur including:
> > >           o Affirming the consequent
> > >           o Denying the antecedent
> > >     * No true Scotsman
> > >     * Package deal fallacy
> > >     * Perfect solution fallacy
> > >     * Poisoning the well
> > >     * Progressive fallacy ("New is improved")
> > >     * Proof by assertion
> > >     * Questionable cause (also called non causa pro causa),
> > > including:
> > >           o Correlation implies causation (also called cum hoc ergo
> > > propter hoc)
> > >           o Fallacy of the single cause
> > >           o Joint effect
> > >           o Post hoc (also called post hoc ergo propter hoc)
> > >           o Regression fallacy
> > >           o Texas sharpshooter fallacy
> > >           o Wrong direction
> > >     * Red herring (also called irrelevant conclusion)
> > >     * Reification (also called hypostatization)
> > >     * Relativist fallacy (also called subjectivist fallacy)
> > >     * Retrospective determinism (it happened so it was bound to)
> > >     * Shifting the burden of proof
> > >     * Slippery slope
> > >     * Special pleading
> > >     * Straw man
> > >     * Style over substance fallacy
> > >     * Sunk cost fallacy
> > >     * Syllogistic fallacies, including:
> > >           o Affirming a disjunct
> > >           o Affirmative conclusion from a negative premise
> > >           o Existential fallacy
> > >           o Fallacy of exclusive premises
> > >           o Fallacy of four terms (also called quaternio terminorum)
> > >           o Fallacy of the undistributed middle
> > >           o Illicit major
> > >           o Illicit minor
> > > 
> > > 
> > > General examples
> > > 
> > > Fallacious arguments involve not only formal logic but also
> > > causality. Others involve psychological ploys such as use of power
> > > relationships between proposer and interlocutor, appeals to
> > > patriotism and morality, appeals to ego, etc., to establish
> > > necessary intermediate (explicit or implicit) premises for an
> > > argument. Indeed, fallacies very often lie in unstated assumptions
> > > or implied premises in arguments that are not always obvious at
> > > first glance. One way to obscure a premise is through enthymeme.
> > > 
> > > We now give a few examples illustrating common errors in reasoning.
> > > Note that providing a critique of an argument has no relation to
> > > the truth of the conclusion. The conclusion could very well be
> > > true, while the argument itself is not valid. See argument from
> > > fallacy.
> > > 
> > > In the following, we view an argument as a dialogue between a
> > > proposer and an interlocutor.
> > > 
> > > Example 1: Material Fallacy
> > > 
> > > James argues:
> > > 
> > >    1. Cheese is food.
> > >    2. Food is delicious.
> > >    3. Therefore, cheese is delicious.
> > > 
> > > This argument claims to prove that cheese is delicious. This
> > > particular argument has the form of a categorical syllogism. Any
> > > argument must have premises as well as a conclusion. In this case
> > > we need to ask what the premises are, that is, the set of
> > > assumptions the proposer of the argument can expect the
> > > interlocutor to grant. The first assumption is almost true by
> > > definition: cheese is a foodstuff edible by humans. The second
> > > assumption is less clear as to its meaning. Since the assertion has
> > > no quantifiers of any kind, it could mean any one of the following:
> > > 
> > >     * All food is delicious.
> > >     * Most food is delicious.
> > >     * All food is delicious, except for spoiled or moldy food.
> > >     * Some food is disgusting.
> > > 
> > > In any of the last three interpretations, the above syllogism would
> > > then fail to have validated its second premise. James may try to
> > > assume that his interlocutor believes that all food is delicious;
> > > if the interlocutor grants this then the argument is valid. In this
> > > case, the interlocutor is essentially conceding the point to James.
> > > However, the interlocutor is more likely to believe that some food
> > > is disgusting, such as a sheep's liver white chocolate torte; and
> > > in this case James is not much better off than he was before he
> > > formulated the argument, since he now has to prove the assertion
> > > that cheese is a unique type of universally delicious food, which
> > > is a disguised form of the original thesis. From the point of view
> > > of the interlocutor, James commits the logical fallacy of begging
> > > the question.
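> > > 
> > > The hidden quantifier problem can be made concrete with a tiny
> > > model check in Python (a sketch over an invented toy domain; only
> > > the premises come from the example above):
> > > 
> > >     # A model where "cheese is food" and "most food is delicious"
> > >     # are both true, yet "cheese is delicious" is false.
> > >     foods = {"cheese", "bread", "apples", "grapes", "honey"}
> > >     delicious = {"bread", "apples", "grapes", "honey"}  # invented
> > > 
> > >     premise_1 = "cheese" in foods                    # True
> > >     premise_2 = len(delicious) / len(foods) > 0.5    # "most": True
> > >     conclusion = "cheese" in delicious               # False
> > >     print(premise_1, premise_2, conclusion)  # True True False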
> > > 
> > > Example 2: Verbal Fallacy
> > > 
> > > Barbara argues:
> > > 
> > >    1. Andre is a good tennis player.
> > >    2. Therefore, Andre is 'good', that is to say a morally good
> > > person.
> > > 
> > > Here the problem is that the word good has different meanings,
> > > which is to say that it is an ambiguous word. In the premise,
> > > Barbara says that Andre is good at some particular activity, in
> > > this case tennis. In the conclusion, she says that Andre is a
> > > morally good person. These are clearly two different senses of the
> > > word "good". The premise might be true but the conclusion can still
> > > be false: Andre might be the best tennis player in the world but a
> > > rotten person morally. However, it is not legitimate to infer he is
> > > a bad person on the ground that there has been a fallacious
> > > argument on the part of Barbara. Nothing concerning Andre's moral
> > > qualities is to be inferred from the premise. Appropriately, since
> > > it plays on an ambiguity, this sort of fallacy is called the
> > > fallacy of equivocation, that is, equating two incompatible terms
> > > or claims.
> > > 
> > > Example 3: Verbal Fallacy
> > > 
> > > Ramesh argues:
> > > 
> > >    1. Nothing is better than eternal happiness.
> > >    2. Eating a hamburger is better than nothing.
> > >    3. Therefore, eating a hamburger is better than eternal
> > > happiness.
> > > 
> > > This argument has the appearance of an inference that applies
> > > transitivity of the two-placed relation "is better than," which in
> > > this critique we grant is a valid property. The argument is an
> > > example of syntactic ambiguity. In fact, the first premise
> > > semantically does not predicate an attribute of the subject, as
> > > would, for instance, the assertion
> > > 
> > >     A potato is better than eternal happiness.
> > > 
> > > In fact it is semantically equivalent to the following universal
> > > quantification:
> > > 
> > >     Everything fails to be better than eternal happiness.
> > > 
> > > So instantiating this fact with eating a hamburger, it logically
> > > follows that
> > > 
> > >     Eating a hamburger fails to be better than eternal happiness.
> > > 
> > > Note that the premise A hamburger is better than nothing does not
> > > provide anything to this argument. This fact really means something
> > > such as
> > > 
> > >     Eating a hamburger is better than eating nothing at all.
> > > 
> > > Thus this is a fallacy of equivocation on the word "nothing."
> > > 
> > > Example 4: Logical Fallacy
> > > 
> > > In the strictest sense, a logical fallacy is the incorrect
> > > application of a valid logical principle or an application of a
> > > nonexistent principle:
> > > 
> > >    1. Some drivers are men.
> > >    2. Some drivers are women.
> > >    3. Therefore, some drivers are both men and women.
> > > 
> > > This is fallacious. Indeed, there is no logical principle that
> > > states
> > > 
> > >    1. For some x, P(x).
> > >    2. For some x, Q(x).
> > >    3. Therefore for some x, P(x) and Q(x).
> > > 
> > > An easy way to show the above inference is invalid is by using Venn
> > > diagrams. In logical parlance, the inference is invalid, since under
> > > at least one interpretation of the predicates it is not validity
> > > preserving.
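> > > 
> > > A brute-force search in Python shows the same thing (a sketch; the
> > > domain and predicates are invented stand-ins for "is a man" / "is
> > > a woman"):
> > > 
> > >     drivers = ["alice", "bob"]
> > >     P = lambda x: x == "bob"      # "x is a man"
> > >     Q = lambda x: x == "alice"    # "x is a woman"
> > > 
> > >     some_P = any(P(x) for x in drivers)         # True
> > >     some_Q = any(Q(x) for x in drivers)         # True
> > >     both = any(P(x) and Q(x) for x in drivers)  # False
> > > 
> > >     # Premises true, conclusion false: the inference is invalid.
> > >     print(some_P, some_Q, both)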
> > >
> >
>






