Adjusted (shrunken) R**2 can be negative.
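For instance, with the usual shrinkage formula adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k - 1), a small R^2 combined with many predictors k pushes the estimate below zero. A quick Python sketch (my own illustration, not from the thread):

```python
def adjusted_r2(r2, n, k):
    """Shrunken R^2 for n observations and k predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# A weak model (R^2 = .05) with 5 predictors on only 20 cases:
print(adjusted_r2(0.05, 20, 5))   # below zero
# A stronger model with ample n stays comfortably positive:
print(adjusted_r2(0.50, 100, 3))
```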

Cheers,
 
Karl W.
-----Original Message-----
From: Rick Froman [mailto:[email protected]] 
Sent: Wednesday, April 21, 2010 8:47 AM
To: Teaching in the Psychological Sciences (TIPS)
Subject: RE: Re:[tips] Biserial r.

OK, I know that some correlational techniques occasionally produce r greater 
than 1 or less than -1 but I think I am on firm footing when I say that I am 
not going to see a negative r-squared in the set of real numbers used in 
statistical calculations (although it may occur with complex numbers 
http://mathforum.org/library/drmath/view/52613.html). 

Rick

Dr. Rick Froman, Chair
Division of Humanities and Social Sciences
John Brown University
Siloam Springs, AR  72761
[email protected]
________________________________________
From: Mike Palij [[email protected]]

If one comes across a biserial r that is greater than +1.00 or less than -1.00,
then I think one should treat it the same way one might treat a negative
Cronbach's alpha or a negative R-squared: it's an indication that something
is seriously wrong and you need to review the validity of your assumptions,
the nature of your data, and the suitability of your analysis for the situation.
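To make the negative-alpha case concrete, here is a small stand-alone sketch (mine, not Mike's; the simulated "scale" and its item names are invented for illustration). Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of the total); a single reverse-keyed item that is never re-scored drives the inter-item covariances negative and alpha below zero:

```python
import random
from statistics import variance

def cronbach_alpha(rows):
    """rows: list of respondent records, each a list of k item scores."""
    k = len(rows[0])
    cols = list(zip(*rows))                      # one tuple per item
    item_var = sum(variance(col) for col in cols)
    total_var = variance([sum(row) for row in rows])
    return k / (k - 1) * (1 - item_var / total_var)

random.seed(1)
# 100 respondents, 3 items all driven by the same latent trait
trait = [random.gauss(0, 1) for _ in range(100)]
good = [[t + random.gauss(0, 0.5) for _ in range(3)] for t in trait]
# Same data, but the third item keyed in the wrong direction
bad = [row[:2] + [-row[2]] for row in good]

print(cronbach_alpha(good))   # clearly positive (near .9)
print(cronbach_alpha(bad))    # negative -- the red flag Mike describes
```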

-Mike Palij
New York University
[email protected]


On Tue, 20 Apr 2010 23:31:30 -0500, Jim Clark wrote:
> Hi
> The following SPSS simulation generates 1000 samples of 100 x y pairs with known 
> population rho (#r = .9 here), then dichotomizes x to create categorical 
> predictor c, which is then used to calculate rb, the biserial r (I had to 
> track down various algorithms for this, but it seems correct ... mean rb, for 
> example, is very close to rho).  Anyway, it illustrates that for extreme 
> values of rho, rb can in fact exceed 1 (presumably the same at the other tail).  
> 12 of 1000 rbs were > 1 in one simulation I ran.  Perhaps there are other 
> factors that also influence the likelihood of getting values beyond the normal 
> range for rs (e.g., the size of the categories).
>
> input program.
> comp #r = .9.
> loop samp = 1 to 1000.
> leave samp.
> loop obs = 1 to 100.
> comp x = rv.norm(0,1).
> comp y = rv.norm(0,1)*SQRT(1-#r**2) + x*#r.
> end case.
> end loop.
> end loop.
> end file.
> end input program.
> comp c = 0.
> if x > -.2 c = 1.
> if c = 0 y0 = y.
> if c = 1 y1 = y.
>
> aggre /outfile = * /presort /break = samp
>  /m0 = mean(y0) /m1 = mean(y1) /p = fgt(c, 0) /q = flt(c, 1) /sy = sd(y).
>
> compute z = idf.normal(q, 0, 1).
> compute ord = .3989*2.71828**-((z**2)/2).
> compute rb = (m1 - m0)*((p*q/ord)/sy).
> freq rb /forma = notable /hist.
> comp rbx = (rb<-1) or (rb>+1).
> freq rbx.
>
> It is perhaps worth noting that there are other widely used statistics that 
> produce "impossible" values.  The Bonferroni test, for example, can produce 
> ps > 1 if one computes LSD p x # comparisons (as reported in SPSS, for 
> example); SPSS rounds these down to 1.  Perhaps a similar convention could be 
> adopted for rb?
>
> I'm hard-pressed to decide whether to thank Karl for raising this interesting 
> question, or berate him for taking me away from my marking to do this 
> exercise!  Or perhaps the latter should be a thanks as well?
>
> Take care
> Jim
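For readers without SPSS at hand, here is a rough Python port of Jim's syntax (my own translation; variable names mirror the SPSS code). Where the SPSS line approximates the normal density with .3989 and 2.71828, this uses exact constants; the seed is arbitrary, so the count of rbs above 1 will differ from run to run and from Jim's 12:

```python
import math
import random
import statistics

random.seed(2)
RHO = 0.9                       # population correlation (#r in the SPSS syntax)
N_SAMPLES, N_OBS = 1000, 100

rbs = []
for _ in range(N_SAMPLES):
    xs = [random.gauss(0, 1) for _ in range(N_OBS)]
    ys = [random.gauss(0, 1) * math.sqrt(1 - RHO**2) + x * RHO for x in xs]
    y1 = [y for x, y in zip(xs, ys) if x > -0.2]    # c = 1 group
    y0 = [y for x, y in zip(xs, ys) if x <= -0.2]   # c = 0 group
    p = len(y1) / N_OBS                             # proportion above the cut
    q = 1 - p
    # Ordinate of the standard normal density at the cut point implied by q
    z = statistics.NormalDist().inv_cdf(q)
    ordinate = math.exp(-z**2 / 2) / math.sqrt(2 * math.pi)
    sy = statistics.stdev(ys)
    rb = (statistics.mean(y1) - statistics.mean(y0)) * (p * q / ordinate) / sy
    rbs.append(rb)

print("mean rb:", statistics.mean(rbs))         # very close to RHO, as Jim notes
print("rbs > 1:", sum(rb > 1 for rb in rbs))    # a handful of "impossible" values
```

If one wanted the rounding convention Jim mentions for Bonferroni ps, the fix would be a one-liner: truncate each rb to the interval [-1, 1] with `max(-1.0, min(1.0, rb))`.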


---
You are currently subscribed to tips as: [email protected].
To unsubscribe click here: 
http://fsulist.frostburg.edu/u?id=13039.37a56d458b5e856d05bcfb3322db5f8a&n=T&l=tips&o=2128
or send a blank email to 
leave-2128-13039.37a56d458b5e856d05bcfb3322db5...@fsulist.frostburg.edu
---
You are currently subscribed to tips as: [email protected].
To unsubscribe click here: 
http://fsulist.frostburg.edu/u?id=13060.c78b93d4d09ef6235e9d494b3534420e&n=T&l=tips&o=2130
or send a blank email to 
leave-2130-13060.c78b93d4d09ef6235e9d494b35344...@fsulist.frostburg.edu

---
You are currently subscribed to tips as: [email protected].
To unsubscribe click here: 
http://fsulist.frostburg.edu/u?id=13090.68da6e6e5325aa33287ff385b70df5d5&n=T&l=tips&o=2138
or send a blank email to 
leave-2138-13090.68da6e6e5325aa33287ff385b70df...@fsulist.frostburg.edu

Reply via email to