Re: [R] about R squared value

2007-04-23 Thread Bernd Dittmann
Hi Nitish,

R^2 cannot take values greater than 1.

By definition (see
http://en.wikipedia.org/wiki/Coefficient_of_determination),

R^2 := 1 - SSE/SST

whereby
SSE = sum of squared errors
SST = total sum of squares

R^2 > 1 would require SSE/SST < 0.

Since SSE and SST are non-negative (check the formulas: they are sums
of squared differences, which are necessarily non-negative), SSE/SST < 0
is impossible.
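
A quick numerical check of the identity (toy data; any lm fit with an
intercept will do):

## Toy check that R^2 from summary() equals 1 - SSE/SST:
set.seed(1)
x <- rnorm(50)
y <- 2*x + rnorm(50)
fit <- lm(y ~ x)
SSE <- sum(residuals(fit)^2)   # sum of squared errors
SST <- sum((y - mean(y))^2)    # total sum of squares
1 - SSE/SST                    # matches summary(fit)$r.squared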

Bernd



Re: [R] p-values and significance

2007-04-11 Thread Bernd Dittmann
Hi Paul,

here's an lm model to illustrate this:


summary(lm(y ~ x.1 + x.2))

Call:
lm(formula = y ~ x.1 + x.2)

Residuals:
       Min         1Q     Median         3Q        Max
-0.0561359 -0.0054020  0.0004553  0.0056516  0.0515817

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)  0.0007941  0.0002900   2.738 0.006278 **
x.1 -0.0446746  0.0303192  -1.473 0.140901
x.2  0.1014467  0.0285513   3.553 0.000396 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 0.009774 on 1134 degrees of freedom
  (64 observations deleted due to missingness)
Multiple R-Squared: 0.01336,Adjusted R-squared: 0.01162
F-statistic: 7.676 on 2 and 1134 DF,  p-value: 0.0004883



summary(lm(...)) computes t-values and the resulting p-values for each
regressor.
The intercept is significant at the 0.6% level; similarly, x.2 is
significant at 0.04%. Only x.1 is not significant at the conventional
5% level: its p-value is about 14%.

Overall significance of the model is given by the F statistic (7.676,
with p less than 0.05%).
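
Both sets of numbers can also be pulled out programmatically (a sketch,
assuming the fitted model is stored in an object called fit):

fit <- lm(y ~ x.1 + x.2)
coef(summary(fit))[, "Pr(>|t|)"]         # per-coefficient p-values
pf(7.676, 2, 1134, lower.tail = FALSE)   # p-value of the overall F statistic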

Hope that helped.

Bernd



[R] - Nonparametric variance test

2007-03-04 Thread Bernd Dittmann
Hi useRs,

is there a variance test in R for 2 non-normal samples? Also, thus far
I have not been able to find the Friedman two-way analysis of variance.

For normal r.v.'s, var.test is available, but are there any tests
available for non-normal samples?
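
For reference, here is a toy version of what I mean (made-up non-normal
samples; var.test is the normal-theory test, the other two are
rank-based alternatives I have come across):

x <- rexp(100); y <- rexp(100)   # made-up non-normal samples
var.test(x, y)                   # F test -- assumes normality
ansari.test(x, y)                # Ansari-Bradley rank test for scale
fligner.test(list(x, y))         # Fligner-Killeen, robust to non-normality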

Thanks!

Bernd



[R] garch and extra explanatory variable

2007-02-27 Thread Bernd Dittmann
Hi useRs,

a daily GARCH(1,1) model can be extended whereby the variance equation
incorporates, say, a higher-frequency volatility measure.

The variance equation would look something like:

s(t)^2 = omega + alpha*e(t-1)^2 + beta*s(t-1)^2 + a*v(t-1)

i.e. the usual GARCH(1,1) terms plus a*v(t-1), whereby v(t-1) would be
yesterday's intraday volatility and a its coefficient.

How can this be implemented in R?

I checked garch of tseries: an extended formula cannot be specified.
garchFit of fSeries might be able to do that. Unfortunately, I am not
quite sure how to specify this in the fSeries package.

Or would the estimation have to be done manually?
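
If manual estimation is the way to go, here is a minimal
maximum-likelihood sketch (assuming demeaned returns eps and the
intraday measure v exist as numeric vectors of equal length; names and
starting values are illustrative only):

## Gaussian ML for GARCH(1,1) plus an exogenous variance term a*v(t-1):
garch11x.nll <- function(par, eps, v) {
    omega <- par[1]; alpha <- par[2]; beta <- par[3]; a <- par[4]
    n <- length(eps)
    sigma2 <- numeric(n)
    sigma2[1] <- var(eps)                     # initialise with the sample variance
    for (t in 2:n)
        sigma2[t] <- omega + alpha*eps[t-1]^2 + beta*sigma2[t-1] + a*v[t-1]
    0.5 * sum(log(sigma2) + eps^2/sigma2)     # -logLik, constants dropped
}

fit <- optim(c(0.01, 0.05, 0.90, 0.10), garch11x.nll,
             eps = eps, v = v, method = "L-BFGS-B",
             lower = c(1e-8, 0, 0, 0))
fit$par                                       # omega, alpha, beta, a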

Comments and hints highly appreciated!

Thanks!

Bernd



Re: [R] Calculating the Sharpe ratio

2007-02-20 Thread Bernd Dittmann
Hi Mark,

thanks for your email.
I used your formula for cumulative returns and plugged them into sharpe:

mysharpe <- function(x){
    return(sharpe(cret(x), r = 0, scale = 1))
}

whereby cret is my cumulative returns function, defined as:

cret <- function(x){
    cumprod(diff(log(x)) + 1) - 1
}

For the index series Index I obtain a Sharpe ratio (r=0 and scale=1) of:

  mysharpe(Index)
[1] 0.8836429

Do you reckon this result and the method above are correct?

Many thanks in advance!

Bernd




Leeds, Mark (IED) wrote:
 If the doc says to use cumulated and you didn't, then I suspect the call
 to sharpe in
 tseries is not correct.  Also, to get PercentReturns, I hope you did
 diff(log(series))
 where series is an object containing prices. It's not so clear
 from your email.

 If you want to send in cumulative returns (which
 you should do if the doc says to) you just take the returns (by doing
 the above),
 then add 1 to each element, do a cumprod and then subtract 1, so
 something like:

 rtns <- diff(log(priceseries))
 oneplusrtns <- 1 + rtns
 cumprodrtns <- cumprod(oneplusrtns)
 cumrtns <- cumprodrtns - 1

 Then, the elements in cumrtns represent the cumulative return up to that
 point.

 But, test it out with an easy example to make sure because I didn't.
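
A quick toy check of that recipe (prices made up; the log-return
version is an approximation to the exact simple-return cumulation):

p <- c(100, 110, 99)         # made-up prices
rtns <- diff(log(p))         # log returns
cumprod(1 + rtns) - 1        # approximate cumulative returns
p[-1]/p[1] - 1               # exact cumulative simple returns, for comparison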




 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Bernd Dittmann
 Sent: Monday, February 19, 2007 8:39 AM
 To: r-help@stat.math.ethz.ch
 Subject: [R] Calculating the Sharpe ratio

 [original post snipped; it appears in full as the next entry below;
 legal disclaimer snipped]

[R] Calculating the Sharpe ratio

2007-02-19 Thread Bernd Dittmann
Hi useRs,

I am trying to calculate the Sharpe ratio with sharpe of the library 
tseries.

The documentation requires the univariate time series to be a 
portfolio's cumulated returns. In this case, the example given

data(EuStockMarkets)
dax <- log(EuStockMarkets[, "FTSE"])

is however the log of the FTSE index level -- i.e. cumulated log
returns up to an additive constant -- rather than daily returns.

Is this way of calculating the Sharpe ratio correct?

Here are my own data:

year    Index    PercentReturns
1985    117.0     0.091
1986    129.9     0.110
1987    149.9     0.154
1988    184.8     0.233
1989    223.1     0.208
1990    223.2     0.000
1991    220.5    -0.012
1992    208.1    -0.056
1993    202.1    -0.029
1994    203.1     0.005
1995    199.6    -0.017
1996    208.6     0.045
1997    221.7     0.063
1998    233.7     0.054
1999    250.5     0.072
2000    275.1     0.098
2001    298.6     0.085
2002    350.6     0.174
2003    429.1     0.224
2004    507.6     0.183
2005    536.6     0.057
2006    581.3     0.083


I calculated the Sharpe ratio in two different ways:
(1) using natural logs as an approximation of % returns, with sharpe
from tseries;
(2) using the % returns with a variation of the sharpe function.

In both cases I used the risk free rate r=0 and scale=1 since I am using 
annual data already.

My results:

METHOD 1: sharpe:

index <- log(Index)
sharpe(index, scale = 1)
[1] 0.9614212



METHOD 2: my own %-based formula:

mysharp <- function(x, r = 0, scale = sqrt(250))
{
    if (NCOL(x) > 1)
        stop("x is not a vector or univariate time series")
    if (any(is.na(x)))
        stop("NAs in x")
    if (NROW(x) == 1)
        return(NA)
    else {
        return(scale * (mean(x) - r)/sd(x))
    }
}



  mysharp(PercentReturns, scale=1)
[1] 0.982531


Both Sharpe ratios differ only slightly since logs approximate
percentage changes (returns).


Are both methods correct, especially since I am NOT using cumulated
returns as the manual says?

If cumulated returns were supposed to be used, could I cumulate the 
%-returns with cumsum(PercentReturns)?
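
For reference, the two cumulation rules differ: cumsum() adds returns
arithmetically, whereas compounding multiplies them (toy returns made
up):

r <- c(0.10, -0.05, 0.08)    # made-up simple returns
cumsum(r)                    # arithmetic cumulation
cumprod(1 + r) - 1           # compounded cumulation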

Many thanks in advance!

Bernd



[R] manual construction of boxwhisker plot

2006-04-15 Thread Bernd Dittmann
Dear useRs,

how can I construct a box-whisker plot based on fivenum vectors?

The challenge I face is as follows: I have a table such as

x   |   fivenum
---
... |   (.)
... |   (.)

and so forth


For each observation x I have generated a vector containing the fivenum 
estimates.

The first challenge is to group my fivenum vectors based on a selection
criterion on x: say for 0 < x < 6, all fivenum vectors would be
collected in that group.

Once all my fivenum vectors are in their respective groups, I wish to 
generate a bw plot for each group.


How could I possibly do that? Also, what would be the most convenient
approach?
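
A sketch of one possible route (assuming measurements y and a numeric
grouping variable x; the breaks are illustrative only):

## Bin x, compute fivenum() per bin, and feed the 5 x k matrix to bxp():
grp   <- cut(x, breaks = c(0, 6, 12, Inf))   # e.g. the group 0 < x <= 6
stats <- sapply(split(y, grp), fivenum)      # one fivenum column per group
bxp(list(stats = stats, n = as.vector(table(grp)), names = levels(grp)))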

Looking forward to your suggestions.

Many thanks in advance!

Sincerely,

Bernd Dittmann



[R] question reg. conditional regression

2006-04-13 Thread Bernd Dittmann
Hi useRs,

I have been running a regression of the following kind:



summary(lm(dx[2:2747] ~ 0 + (dx[1:2746] > 15)))

Call:
lm(formula = dx[2:2747] ~ 0 + (dx[1:2746] > 15))

Residuals:
      Min        1Q    Median        3Q       Max
-46.35871  -3.15871   0.04129   3.04129  30.04129

Coefficients:
                     Estimate Std. Error t value Pr(>|t|)
dx[1:2746] > 15FALSE -0.04129    0.11467  -0.360    0.719
dx[1:2746] > 15TRUE   3.49333    0.88309   3.956 7.82e-05 ***
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 5.924 on 2712 degrees of freedom
Multiple R-Squared: 0.005784,   Adjusted R-squared: 0.005051
F-statistic: 7.889 on 2 and 2712 DF,  p-value: 0.0003835



In this model, I have lagged the difference series dx (whereby I define
dx := diff(x, differences = 1)) and regressed next period's change on
this period's change, conditional on whether this period's change is
greater than 15.

As shown in the summary above, for dx[1:2746] > 15 TRUE, my coefficient
is significant (t = 3.956).

My question however is whether I interpret the result correctly.

Is it indeed implied that, if the condition dx[1:2746] > 15 is
fulfilled, then dx[2:2747] changes by 3.49333 the next period?
Alternatively, if this period's change is <= 15, then there is no
significant change (-0.04129, t = -0.360) the next period.
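
One way to sanity-check that reading: with no intercept and a logical
regressor, the two coefficients are simply the conditional group means
(a sketch, reusing dx from above):

cond <- dx[1:2746] > 15
tapply(dx[2:2747], cond, mean)   # should reproduce -0.04129 and 3.49333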



Thank you!

Sincerely,

Bernd Dittmann



Re: [R] How to implement an iterative unit root test

2006-04-08 Thread Bernd Dittmann
Thank you for your suggestion, Andy.

Luckily, the fMultivar package has already implemented such a rolling 
function: rollFun.

Thus I tried the following:


myfunction <- function(x, n = 5)
{
rollFun(x = x, n = n, FUN = adfTest)
}


This however does not return the tau values (or alternatively, the 
p.values) I am looking for.
How do I need to define the function FUN to obtain these values?


Many thanks!

Sincerely,

Bernd



Andy Bunn wrote:
 Does this get you started?

 library(tseries)
 ?adf.test
 foo <- matrix(rnorm(1000), ncol = 10, nrow = 100)
 bar <- apply(foo, 2, adf.test)
 sapply(bar, "[[", "statistic")
 sapply(bar, "[[", "p.value")


 HTH, Andy

 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Bernd Dittmann
 Sent: Wednesday, April 05, 2006 8:58 PM
 To: r-help@stat.math.ethz.ch
 Subject: [R] How to implement an iterative unit root test


 [original post snipped; it appears in full as its own entry below]






Re: [R] How to implement an iterative unit root test

2006-04-08 Thread Bernd Dittmann
Thank you for the suggestion.

I ran it and oddly enough I am getting contradictory results:


  rollFun(x[1:100], 10, FUN = function(x) adfTest(x)$statistic)
NULL
Warning messages:
1: p-value smaller than printed p-value in: adfTest(x)
...
...
...

These warnings appear for each single calculation.


However, performing the unit root test once on that very interval
(not rolling), the results are:

  adfTest(x[1:100])

Title:
Augmented Dickey-Fuller Test

Test Results:
  PARAMETER:
Lag Order: 1
  STATISTIC:
Dickey-Fuller: -0.1627
  P VALUE:
0.5612

Description:
Sat Apr 08 19:11:40 2006


I checked the help pages of adfTest and fMultivar, but simply cannot
figure out why I am receiving the messages above.


How could I fix this?
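
My untested guess: adfTest returns an S4 object rather than a plain
htest list, so $statistic yields NULL. If the results sit in a test
slot (an assumption about the fSeries class layout), something like
this might work:

rollFun(x[1:100], 10, FUN = function(x) adfTest(x)@test$statistic)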

Many thanks!

Sincerely,

Bernd Dittmann

Gabor Grothendieck wrote:
 Try this:

   
 library(fMultivar)
 set.seed(1)
 x <- rnorm(25)
 rollFun(x, 15, FUN = function(x) adf.test(x)$p.value)
  [1] 0.1207730 0.3995849 0.3261577 0.4733004 0.5776586 0.6400228 0.6758550
  [8] 0.6897812 0.3792858 0.6587171 0.5675147
 rollFun(x, 15, FUN = function(x) adf.test(x)$statistic)
  [1] -3.185471 -2.453590 -2.646336 -2.260086 -1.986146 -1.822440 -1.728381
  [8] -1.691824 -2.506875 -1.773368 -2.012774

 Also, rapply in the zoo package and running in the gtools package are
 two other rolling routines.

 On 4/8/06, Bernd Dittmann [EMAIL PROTECTED] wrote:
   
 [earlier messages in the thread snipped]




[R] How to implement an iterative unit root test

2006-04-05 Thread Bernd Dittmann
Hello,

How can an iterative unit root test be implemented in R?
More specifically, given a time series, I wish to perform the Dickey
Fuller Test on a daily basis for, say, the last 100 observations. It
would be iterative in the sense that this test would be repeated each
day for the last 100 observations.
Given the daily Dickey Fuller estimates of delta for the autoregressive
process

d(Y(t)) = delta * Y(t-1) + u(t)

the significance of delta would be computed. If possible, I would like
to extract that value and record it in a table, that is, a table
containing the tau-values of each day's calculations.


How can such a test be done in R? More specifically, how can it be 
programmed to iteratively perform the test and also how to extract the 
t-values on a daily basis?
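
For concreteness, here is the shape of what I am after (a sketch using
adf.test from tseries on a made-up series; the real data would replace
y):

library(tseries)
y <- cumsum(rnorm(500))      # made-up I(1) series standing in for the data
n <- 100                     # window: the last 100 observations
taus <- sapply(n:length(y), function(i)
    adf.test(y[(i - n + 1):i])$statistic)    # one tau value per day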


Thank you.

Sincerely,

Bernd Dittmann
