On 15 February 2013 21:26, Janesh Devkota <janesh.devk...@gmail.com> wrote:

> Hi I am trying to find the relationship between two variables.
>
> First I fitted a linear model between two variables and I found the
> following results:
> Residual standard error: 0.03253 on 2498 degrees of freedom
> Multiple R-squared: 0.5551, Adjusted R-squared: 0.5549
> F-statistic:  3116 on 1 and 2498 DF,  p-value: < 2.2e-16
>
> Then I used the cor function to see the correlation between the two
> variables, and I got the following result:
> -0.7450344
>
>
r (lower case) is the correlation coefficient (the letter actually stands for regression).

R (upper case) is the multiple correlation. But you only have one predictor,
so here it is just the (absolute value of the) correlation.

R-squared is R (or r), squared. So (-0.7450344)^2 = 0.555.
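
A minimal sketch with simulated data (the variable names and numbers are made
up for illustration, they are not your data) showing that with one predictor
the Multiple R-squared reported by summary(lm()) is just the squared
correlation:

set.seed(1)                       # simulated data, purely illustrative
x <- rnorm(2500)
y <- -0.8 * x + rnorm(2500)       # negative relationship, like yours

fit <- lm(y ~ x)
summary(fit)$r.squared            # Multiple R-squared
summary(fit)$adj.r.squared        # Adjusted R-squared (penalised for the number of predictors)
cor(x, y)^2                       # same as Multiple R-squared when there is one predictor
(-0.7450344)^2                    # note the parentheses: -0.7450344^2 gives -0.555 in R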



> How can we interpret the results based on R-squared and correlation? From
> the p-value we can see that there is a very strong relationship between the
> variables, as it is way less than 0.001
>
>

The p-value doesn't tell you about the strength of the relationship. It tells
you how strong the evidence is that the relationship differs from zero; with
2,500 observations even a weak relationship will give a very small p-value.
The strength is what R-squared (and r) measure.
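
A quick illustration with simulated data (again made up, not your data): with
a large sample a very weak relationship still gives a tiny p-value, so the
strength has to be judged from R-squared (or r) and the coefficients, not from
the p-value:

set.seed(2)                       # simulated data, purely illustrative
n <- 100000
x <- rnorm(n)
y <- 0.03 * x + rnorm(n)          # deliberately weak relationship

fit <- lm(y ~ x)
summary(fit)$r.squared            # tiny, around 0.001
summary(fit)$coefficients[2, 4]   # p-value for x: far below 0.001 despite the weak effect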


> Can anyone kindly explain the difference between Multiple R-squared,
> adjusted R-squared and correlation, and how to report these values while
> writing a report?
>
>
I can suggest a number of books that do this much better than I could in an
email. But you probably have a favorite of your own.

Jeremy
