I have a large database on SQL Server 2012 Developer edition, running on 
Windows 7 Ultimate edition. Some of my tables are as large as 10 GB. I am 
running R 2.15.2, 64-bit build.

I have been connecting to the database and extracting data without problems, 
but this seems to be the first time I have tried to pull a large amount of 
data (about 1/2 GB) in one query. The query didn't have anything fancy in it; 
it was code that has always worked.

R dropped the work without providing an error message. The "sand clock" ran 
for a couple of seconds, as if R had started communicating with the database, 
but then nothing. I looked at the Windows Task Manager, and CPU utilization 
was at zero.

I ran the memory.size() function to confirm availability of memory, and it 
read about 24 thousand (I don't remember the exact figure). I have 24 GB of 
RAM on my computer, and the other R objects in memory totalled around 2 GB.
I used RODBC to connect to the database. As I understand it, memory.size() 
reports its value in MB, so a reading of about 24,000 means roughly 24 GB, 
which is consistent with the amount of RAM in my machine.
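For reference, a minimal sketch of the memory checks available in a Windows build of R from this era (memory.size() and memory.limit() are Windows-only; gc() and object.size() work everywhere):

```r
## Windows-only: both values are reported in MB
memory.size()            # memory currently in use by this R session
memory.size(max = TRUE)  # maximum memory obtained from the OS so far
memory.limit()           # current allocation limit for the session

## Cross-platform alternative: gc() reports cells and MB used/triggered
gc()

## Sizes of existing objects, largest first, to see how much headroom remains
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)
```

Note that memory.size() reports what the session is currently using, while memory.limit() reports the ceiling; a reading near the physical RAM total more likely came from memory.limit().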


Is there anything that I missed? Is there another way to check the 
availability of memory, or the memory allocated to an R session?
Are there issues with RODBC which might cause a failure of data transfer when 
the amount of data requested is "large"?
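Not part of the original question, but one workaround worth trying in this situation: fetch the result set in chunks rather than in a single sqlQuery() call, so the half-gigabyte data frame is not built in one step. A sketch using RODBC (the DSN and table name are placeholders for your own):

```r
library(RODBC)

## Placeholder DSN -- substitute your own connection details.
## believeNRows = FALSE avoids relying on the driver's row-count estimate.
ch <- odbcConnect("MyDSN", believeNRows = FALSE)

## Send the query to the server without fetching any rows yet
odbcQuery(ch, "SELECT * FROM dbo.BigTable")

## Pull the rows back in manageable chunks
chunks <- list()
repeat {
  piece <- sqlGetResults(ch, max = 100000)   # 100k rows per round trip
  if (!is.data.frame(piece) || nrow(piece) == 0) break
  chunks[[length(chunks) + 1]] <- piece
  if (nrow(piece) < 100000) break            # last, partial chunk
}
result <- do.call(rbind, chunks)

close(ch)
```

If a single chunk succeeds while the full query silently fails, that points to the transfer (driver buffering or an ODBC timeout) rather than to R running out of memory.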

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
