hi,

over the past few days i collected a fair amount of data, i.e. a script
examined more than 120,000 XML documents. for a while this ran just fine, but
at some point the machine ran out of RAM. i then added a manual call to gc()
inside the loop (which already removed all processed objects with rm(), so i
hadn't expected to run into problems in the first place), but even that didn't
help as much as i had hoped.
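
just to make the pattern concrete, here is a minimal sketch of the loop
structure (the directory, the parsing and the analysis steps are placeholders,
not my actual code):

# minimal sketch: read one document, process it, then drop the objects
# with rm() and trigger a manual garbage collection with gc()
files <- list.files("xml_corpus", pattern = "\\.xml$", full.names = TRUE)

for (f in files) {
  doc    <- readLines(f, warn = FALSE)          # stand-in for the actual XML parsing
  result <- nchar(paste(doc, collapse = "\n"))  # stand-in for the actual analysis
  # ... store or write out 'result' here ...
  rm(doc, result)   # remove all processed objects, as in my script
  gc()              # manual garbage collection after each iteration
}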

i just looked into this a bit. calling gc() in RKWard *looks* like it really
frees memory, but at the system level i don't see a significant reduction.
only after i close RKWard does the memory usage drop considerably. could it be
that memory freed by R stays allocated to the RKWard process for the rest of
the session?
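
to illustrate what i mean by the two views, here is a rough sketch of how one
could compare them from within R (linux-only; report_memory() is just a
made-up helper that reads VmRSS from /proc/self/status):

# compare the memory R reports as used with the resident set size
# the OS sees for the whole R process
report_memory <- function() {
  g <- gc()                 # matrix: rows Ncells/Vcells, column 2 = used (Mb)
  r_used_mb <- sum(g[, 2])
  status <- readLines("/proc/self/status")
  rss_kb <- as.numeric(sub("[^0-9]*([0-9]+).*", "\\1",
                           grep("^VmRSS:", status, value = TRUE)))
  cat(sprintf("R reports %.1f Mb used, OS reports %.1f Mb resident\n",
              r_used_mb, rss_kb / 1024))
}

report_memory()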


best regards :: m.eik

--
dipl. psych. meik michalke
institut für experimentelle psychologie
abt. für diagnostik und differentielle psychologie
heinrich-heine-universität d-40204 düsseldorf


