Re: [R] glm-poisson fitting 400.000 records

2011-10-24 Thread D_Tomas
Many thanks for your replies, I appreciate that. I tried what you suggested and it did work for the Poisson model (glm, poisson family). Unfortunately, the negative binomial (glm.nb) did not work, as I got the following message: Warning messages: 1: In ifelse(y > mu, d.res, -d.res) : Reached
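A minimal sketch of one memory-saving workaround, assuming MASS is loaded and using placeholder names (dat, counts, x1, x2, exposure) for the poster's actual columns: glm.nb() alternates a full glm fit with a theta update and keeps several model-sized objects alive at once, whereas fixing theta through the negative.binomial() family does a single glm fit per candidate value.

    library(MASS)

    ## What reportedly exhausts memory at 400.000 rows:
    ## fit_nb <- glm.nb(counts ~ x1 + x2 + offset(log(exposure)), data = dat)

    ## Lighter alternative: profile a small grid of fixed theta values and
    ## keep the fit with the highest log-likelihood.
    thetas <- c(0.5, 1, 2, 5)
    fits   <- lapply(thetas, function(th)
      glm(counts ~ x1 + x2 + offset(log(exposure)),
          family = negative.binomial(theta = th), data = dat))
    best   <- fits[[which.max(sapply(fits, logLik))]]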

Re: [R] glm-poisson fitting 400.000 records

2011-10-22 Thread Uwe Ligges
On 21.10.2011 23:14, Ken wrote:
> Your memory shouldn't be capped there,
Where? You cannot know from the output below.
> try ?memory.size and ?memory.limit. Run fewer things in the background.
> Good luck, Ken Hutchison
> On Oct 21, 2554 BE, at 11:57 AM, D_Tomas tomasm...@hotmail.com wrote: My
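For reference, a short sketch of the two help topics pointed at here; both functions are Windows-only (on other platforms they merely warn), and the values are illustrative:

    memory.size()              # Mb currently allocated by this R session
    memory.size(max = TRUE)    # maximum Mb obtained from the OS so far
    memory.limit()             # current cap in Mb
    memory.limit(size = 4000)  # raise the cap, up to what the OS allows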

[R] glm-poisson fitting 400.000 records

2011-10-21 Thread D_Tomas
Hi, I am trying to fit a glm-poisson model to 400.000 records. I have tried biglm and glmulti but I have problems... Can it really be the case that 400.000 records are too many? I am thinking of using random samples of my dataset. Many thanks,
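A minimal sketch of the chunked route via bigglm() from the biglm package, with placeholder names (dat, counts, x1, x2) standing in for the poster's actual columns; how best to pass the two offsets is left out here and worth checking against the biglm documentation:

    library(biglm)

    ## bigglm() fits the model in chunks, so the full 400.000-row model
    ## matrix never has to sit in memory at once.
    fit <- bigglm(counts ~ x1 + x2,
                  data      = dat,       # a data.frame, processed chunk by chunk
                  family    = poisson(link = "log"),
                  chunksize = 5000)      # rows per pass
    summary(fit)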

Re: [R] glm-poisson fitting 400.000 records

2011-10-21 Thread Ben Bolker
D_Tomas tomasmeca at hotmail.com writes: Hi, I am trying to fit a glm-poisson model to 400.000 records. I have tried biglm and glmulti but I have problems... Can it really be the case that 400.000 records are too many? I am thinking of using random samples of my dataset. I
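If the random-sample route is taken, a minimal sketch (placeholder formula and data frame dat, as above) is to fit the same model on several subsamples and check that the coefficients agree:

    set.seed(1)
    fits <- lapply(1:5, function(i) {
      idx <- sample(nrow(dat), 50000)    # one 50.000-row subsample
      glm(counts ~ x1 + x2, family = poisson, data = dat[idx, ])
    })
    round(sapply(fits, coef), 3)         # columns should be close to each other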

Re: [R] glm-poisson fitting 400.000 records

2011-10-21 Thread D_Tomas
My apologies for my vague comment. My data comprises 400.000 x 21 (17 explanatory variables, plus the response variable, plus two offsets). If I build the full model (linear terms only) I get: Error: cannot allocate vector of size 112.3 Mb I have a 4 GB RAM laptop... Would I get any improvement on a
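The 112.3 Mb figure is consistent with a single double-precision copy of the model matrix: if factor expansion takes the 17 predictors to roughly 37 columns (a guess, since the factor levels aren't given), one copy costs about that much, and glm() holds several such copies while iterating:

    400000 * 37 * 8 / 2^20   # Mb for one 400000 x 37 double matrix
    ## [1] 112.915            # R reports the one allocation that failed,
                              # not the total memory already in use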

Re: [R] glm-poisson fitting 400.000 records

2011-10-21 Thread Ken
Your memory shouldn't be capped there, try ?memory.size and ?memory.limit. Run fewer things in the background. Good luck, Ken Hutchison On Oct 21, 2554 BE, at 11:57 AM, D_Tomas tomasm...@hotmail.com wrote: My apologies for my vague comment. My data comprises 400.000 x 21 (17 explanatory
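Alongside raising the cap, a quick sketch of freeing memory inside the session itself before refitting (object names here are placeholders for whatever is in the workspace):

    sort(sapply(ls(), function(nm) object.size(get(nm))))  # biggest objects last
    rm(big_unused_object)   # drop anything no longer needed (placeholder name)
    gc()                    # return freed pages to the OS and report usage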