Hi all,

When I tried to estimate a VAR (package vars) with 5 lags on a rather large
dataset:


> dim(trial.var) 
[1] 20388     2 


I ran into memory troubles:


> summary(VAR(trial.var, type="none", p=5)) 
Error: cannot allocate vector of size 3.1 Gb 
In addition: Warning messages: 
1: In diag(resids %*% solve(Sigma) %*% t(resids)) : 
  Reached total allocation of 1535Mb: see help(memory.size) 
2: In diag(resids %*% solve(Sigma) %*% t(resids)) : 
  Reached total allocation of 1535Mb: see help(memory.size) 
3: In diag(resids %*% solve(Sigma) %*% t(resids)) : 
  Reached total allocation of 1535Mb: see help(memory.size) 
4: In diag(resids %*% solve(Sigma) %*% t(resids)) : 
  Reached total allocation of 1535Mb: see help(memory.size) 


Luckily, I was able to slice the dataset into individual days of ca. 3000 rows
each and estimate each subset separately, roughly as sketched below.
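
For reference, the per-day estimation looked roughly like this (the 'day' vector
and the object names are illustrative, not my actual code):

library(vars)

## 'day' is assumed to be a vector of day identifiers, one per row of trial.var
daily <- split(as.data.frame(trial.var), day)
fits  <- lapply(daily, function(d) VAR(d, type = "none", p = 5))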

Nonetheless, I would still like to run the VAR over the whole dataset at once.

Is there any way to extend the memory available to R, perhaps by forcing a
higher limit? I am running R on an XP box with 1 GB of RAM.
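
Would something along these lines be the right direction? (memory.size() and
memory.limit() are Windows-only; the 2047 MB figure is just my guess at the most
a 32-bit process can use, not something I have verified.)

memory.size()              # MB currently used by this R session
memory.limit()             # current cap -- reported as 1535 MB in the warnings above
memory.limit(size = 2047)  # try to raise the cap (in MB)
## or start R with:  Rgui.exe --max-mem-size=2047M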


Many thanks for any pointers.

Bernd