I’m doing some text analytics in R and have run into an issue with which
I need help.
As part of the algorithm I have to compute the *SVD* (for dimension
reduction) of a large binary (parent) matrix, and I’m using the svd()
function for that. The SVD result occupies a lot of memory, and when the
parent matrix has a very large number of rows/columns the result becomes
unmanageably large. R keeps its objects in RAM, so once the SVD output
grows beyond the available RAM, R (and the whole system) hangs. *Is
there a way to handle objects in R that are very large, in particular
larger than the available RAM?* Any suggestions for the problem I’m
facing?
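
To make this concrete, here is a scaled-down, self-contained sketch of
what I'm doing (the sizes and density below are illustrative; the real
parent matrix is far larger):

    ## Scaled-down illustration; the real matrix has far more rows/cols.
    set.seed(1)
    n_docs  <- 1000   # illustrative; the real value is much larger
    n_terms <- 200    # illustrative
    parent  <- matrix(rbinom(n_docs * n_terms, size = 1, prob = 0.01),
                      nrow = n_docs, ncol = n_terms)

    ## Full SVD returns dense u (n_docs x n_terms here), d, and
    ## v (n_terms x n_terms); with realistic sizes these dense
    ## factors exceed the available RAM and R hangs.
    s <- svd(parent)
    object.size(s)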

-- 
Regards,

Siddharth Arun,
Contact No. - +91 8880065278
