Dear Paul,

Thanks a lot for your attention!  After reading your answer carefully, along with 
many of the emails from Dr. B Ripley, I think the best solution for allocating a 
lot of memory is to run 64-bit Linux.  The only thing that still seems strange to 
me is how much better the Mac is at allocating memory than either 32-bit Linux or 
32-bit Windows.  I am using 32-bit Linux with a Windows partition, and I believe 
that is the problem: R only sees 2 GB of RAM, although I have 4 GB available.  Is 
that true?  Has anybody else had this problem before?

Thanks,
Ricardo 



It can't be such a small R program if it is trying to allocate one
vector of 231 MB.  Think for a minute about how much contiguous storage
you are asking for.
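
For a rough sense of scale (a sketch, not your actual object): a numeric
(double) vector costs 8 bytes per element, so about 30 million doubles
already needs roughly 230 MB in one contiguous block:

    ## 30 million doubles ~= 240,000,000 bytes ~= 229 MB of contiguous memory
    x <- numeric(30e6)
    print(object.size(x), units = "Mb")
    ## 228.9 Mb
    rm(x); gc()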

I went and read the thread you referred to on r-help.  B Ripley
clearly states you should run a 64-bit OS if you hope to claim such a
large piece of contiguous memory.

Beyond that, it is hard to say what the fix is.  You can arrive at the
"vector too big" problem in many ways; some you can fix, some you can
avoid.  If your program really is that small, post it here so we can at
least try it and see what we get.  My first thought would be to
re-organize your code so you don't ask for such gigantic vectors.
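
As a purely hypothetical sketch of what "re-organize" can mean, you can
often process the data in chunks instead of holding one huge vector (the
file name and chunk size below are made up):

    ## Hypothetical sketch: compute a mean by streaming a big file in
    ## 1-million-element chunks, so no gigantic vector ever exists at once.
    con <- file("big_data.txt", open = "r")
    total <- 0; n <- 0
    repeat {
      chunk <- scan(con, what = numeric(), n = 1e6, quiet = TRUE)
      if (length(chunk) == 0) break
      total <- total + sum(chunk)
      n <- n + length(chunk)
    }
    close(con)
    total / n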

If that is impossible, look on CRAN for the packages that are
supposed to help with big vectors.  As I recall, there is one family
of packages with names like "big..." and another with a name like
"ff...".  These both try to work around the "can't get a
gigantic piece of contiguous memory" problem.  I would give you the
details, but I'm not able to get an answer from r-project.org right
now, so I can't look them up.
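
From memory (so please check the package documentation), the usage looks
roughly like this; the disk-backed object means you never need one
gigantic contiguous allocation, and the length here is just an example:

    ## Sketch with the ff package: the data live in a file on disk and are
    ## paged in as needed, rather than in one contiguous block of RAM.
    library(ff)
    x <- ff(vmode = "double", length = 3e7)   # ~230 MB worth of doubles
    x[1:10] <- rnorm(10)                      # use it much like a normal vector
    sum(x[1:10])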

I experimented with these for a day last winter and concluded that
they were still too experimental for me, but they have made a lot of
progress lately.

>
> I have read the instructions in ?Memory carefully.  Using the function gc() 
> I get very low memory numbers (please see below).  I know this has been 
> posted several times on r-help 
> (http://tolstoy.newcastle.edu.au/R/help/05/06/7565.html#7627qlink2).  However, 
> I have not yet found a solution to my memory issue on Linux.  
> Could somebody please give some instructions on how to improve my memory under 
> Linux?
>
>> gc()
>          used (Mb) gc trigger (Mb) max used (Mb)
> Ncells 170934  4.6     350000  9.4   350000  9.4
> Vcells 195920  1.5     786432  6.0   781384  6.0
>
> INCREASING THE R MEMORY FOLLOWING THE INSTRUCTION IN  ?Memory
>
> I started R with:
>
> R --min-vsize=10M --max-vsize=4G --min-nsize=500k --max-nsize=900M
>> gc()
>          used (Mb) gc trigger (Mb) limit (Mb) max used (Mb)
> Ncells 130433  3.5     500000 13.4      25200   500000 13.4
> Vcells  81138  0.7    1310720 10.0         NA   499143  3.9
>
> It increased but not so much!
>
> Please, please let me know.  I have read everything on r-help about this matter, but 
> found no solution.  Thanks for your attention!
>
> Ricardo
>



-- 
Paul E. Johnson
Professor, Political Science
1541 Lilac Lane, Room 504
University of Kansas
