Gus,

Have you tried predicting x1 from y and checking the residuals (errors)?
Be sure to use uniform x1 and x2 when generating y = x1 + x2. Then
generate the residuals by using y as the predictor of x1; the residual
will equal x2 (when the predictor is the effect). Then look at the
relationship between the extremity of the predictor (y) and the absolute
value of the residual (error). I think you will find the error actually
DECREASES as we move from the mean towards the extremes of the predictor
(y). Thus flipping the regression variables around leads to very
different error patterns.
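
A minimal sketch of that experiment in Python (standard library only;
the variable names and cutoffs are mine, not anything agreed on in this
thread): simulate uniform x1 and x2, form y = x1 + x2, regress x1 on y
by ordinary least squares, and compare the mean absolute residual near
the mean of y with that in its extremes.

```python
import random
import statistics

random.seed(0)
n = 100_000
x1 = [random.uniform(0, 1) for _ in range(n)]
x2 = [random.uniform(0, 1) for _ in range(n)]
y = [a + b for a, b in zip(x1, x2)]

# OLS fit of x1 on y: slope = cov(x1, y) / var(y), intercept from the means.
my = statistics.fmean(y)
mx = statistics.fmean(x1)
slope = (sum((yi - my) * (xi - mx) for yi, xi in zip(y, x1))
         / sum((yi - my) ** 2 for yi in y))
intercept = mx - slope * my
resid = [xi - (intercept + slope * yi) for xi, yi in zip(x1, y)]

# Mean |residual| near the mean of y versus in its extremes
# (cutoffs 0.2 and 0.8 are arbitrary illustrative choices).
mid = [abs(r) for r, yi in zip(resid, y) if abs(yi - my) < 0.2]
ext = [abs(r) for r, yi in zip(resid, y) if abs(yi - my) > 0.8]
print(statistics.fmean(mid), statistics.fmean(ext))
```

With uniform causes this prints a clearly smaller mean absolute residual
in the extremes of y than near its mean, which is the pattern described
above: near y = 0 or y = 2, x1 is pinned close to 0 or 1, so the
residuals shrink.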

The reduction in errors when predicting x1 from y is the basis of the
corresponding correlations. Predicting from the effect decreases the
errors in the predictor's extremes; predicting from the cause increases
the errors in the extremes of the predictor. This asymmetry lets us
detect which variable is the cause and which is the effect.

Bill




"Gus Gassmann" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
>
>
> [EMAIL PROTECTED] wrote:
>
> > Donald,
> >
> > I think I understand. In the model y=x1+x2, the expected value of y
> > (when using the regression equation y = a + b1*x1 + e) is equal to
> > x1. The expected value of x2 (error) given x1 is zero, since x1 and
> > x2 (error) are uncorrelated. Thus, in the extremes of x1, greater
> > errors are expected when predicting y from x1, since the errors are
> > not all likely to equal zero.
> >
> > The picture changes, however, when we predict x1 from y. In the
> > regression model x1=y, the errors (x2) will be smaller in the
> > extremes of y, since x1 and x2 are correlated in the extremes of y.
> > That is, when x1 and x2 are uniformly distributed. But when x1 and
> > x2 are not uniform but normal, the smaller errors expected in the
> > extremes of y (predicting x1) are not much smaller, since x1 and x2
> > will be nearer correlated at zero.
>
> This is actually not true, at least not all of it. If you regress x1
> on y, a residual plot will show you very quickly that there is
> something missing, namely, as you write, the residuals are small (tend
> to be negative) when x1 is small and are large (tend to be positive)
> when x1 is large. (This assumes that the variable x2 has mean 0.)
>
> However, since the slope of the regression line will only be close to
> 1, the regression line y = b_0 + b_1 x_1 will diverge from the line
> y = x_1. Again, if you try to make predictions or estimates above or
> below the most extreme observed values of x_1, the divergence of the
> two lines will necessarily widen your confidence band. The
> distribution of x_2 _may_ (I have not checked) influence the amount of
> the widening, but it has nothing to do with the phenomenon itself.
>
> > The explanation of the latter model is based on Bunge's notion of
> > conjoint and disjoint causes. In conjoint causation (uniform
> > causes), all the levels of the causes are combined. A
> > crosstabulation of y by x1 and x2 will have data in the corners of
> > the box. Thus the most extreme values of y will be the combinations
> > of extreme correlated values of x1 and x2. But with normally
> > distributed causes, the causation is disjoint; the corners of the
> > crosstabulations will tend to be empty, because there will be so few
> > instances where correlated extremes of the causes are paired.
> > Consequently, the range of y will be less in the disjoint cause than
> > in the conjoint cause. The extreme values of y will tend to be
> > caused by extremes of either x1 or x2 (disjoint) but not by both at
> > the same time (conjoint).
> >
> > So we are probably all correct. When predicting the effect from a
> > cause, the error is greatest in the extremes of y; the values of x1
> > and x2 diverge with deviations from the intercept. When predicting a
> > cause from the effect, the errors will be smallest in the extremes
> > of the effect. This assumes the causes are uniform/conjoint.
> >
> > At least in theory.
>
> "In theory, there is no difference between theory and practice.
>  But in practice, there is."
>


