I ran your code and did not see any growth:
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  463828 24.8     818163 43.7   818163 43.7
Vcells  546318  4.2    1031040  7.9   909905  7.0
1 (1) - eval : <33.6 376.6> 376.6 : 48.9MB
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 471049 2
THANKS a lot!
This actually solved the problem even without calling free() explicitly:
xmlTreeParse(..., useInternalNodes=TRUE)
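For later readers, a minimal sketch of the parse-and-release pattern under discussion (the file name and XPath query here are made up for illustration; xmlTreeParse, xpathSApply, and free are from the XML package):

```r
library(XML)

# Hypothetical input file; any well-formed XML document works.
writeLines("<root><item>1</item><item>2</item></root>", "dummy.xml")

# Parse into an internal (C-level) document.
doc <- xmlTreeParse("dummy.xml", useInternalNodes = TRUE)

# Extract what you need into plain R objects before releasing the document.
values <- xpathSApply(doc, "//item", xmlValue)

# Release the C-level document explicitly, then drop the R handle.
free(doc)
rm(doc)
gc()
```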
Best, Peter
On 21.12.2012 19:48, Milan Bouchet-Valat wrote:
On Friday, 21 December 2012 at 18:41 +0100, Peter Meißner wrote:
> Yeah, thanks,
> I know: DO NOT USE RBIND!
>
> But it does not help: even storing results in a predefined list, as
> suggested there, does not help.
>
> The problem seems to stem from the XML package and not from
Yeah, thanks,
I know: DO NOT USE RBIND!
But it does not help: even storing results in a predefined list, as
suggested there, does not help.
The problem seems to stem from the XML package and not from the way I
store the data until it is saved.
Best, Peter
Am 21.12.2012 18:33, sch
Circle 2 of 'The R Inferno' may help you.
http://www.burns-stat.com/pages/Tutor/R_inferno.pdf
In particular, it has an example of how to do what
Duncan suggested.
Pat
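The Circle 2 advice boils down to preallocating a container and combining once at the end instead of growing an object with rbind(); a minimal sketch (the payload data frame is just a placeholder):

```r
n <- 1000
pieces <- vector("list", n)                   # preallocate the container
for (i in seq_len(n)) {
  pieces[[i]] <- data.frame(i = i, x = i^2)   # placeholder payload
}
result <- do.call(rbind, pieces)              # combine once at the end
```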
On 21/12/2012 15:27, Peter Meißner wrote:
Here is a working example that reproduces the behavior by creating 1000
XML files
I'll consider it. But in fact the whole data set does not fit into memory
at once, I think, especially with the added overhead of creating it. That
was one of the reasons I wanted to do it chunk by chunk in the first place.
Thanks, Best, Peter
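The chunk-by-chunk idea described above — process one chunk, write it to disk, drop it before the next — can be sketched like this (the doubling step and the file names stand in for the real work):

```r
chunk <- list(1:10, 11:20, 21:30)
for (k in seq_along(chunk)) {
  res <- lapply(chunk[[k]], function(i) i * 2)   # stand-in for the real work
  saveRDS(res, sprintf("chunk_%02d.rds", k))     # persist this chunk
  rm(res)                                        # drop it before the next one
  gc()
}
```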
On 21.12.2012 15:07, Duncan Murdoch wrote:
Here is a working example that reproduces the behavior by creating 1000
XML files and afterwards parsing them.
On my PC, R starts with about 90 MB of RAM; with every cycle another
10-12 MB are added to the RAM usage, so I end up with about 200 MB of RAM
usage.
In the real code one chunk-cycle eat
On 12-12-20 6:26 PM, Peter Meissner wrote:
Hey,
I have a double loop like this:
chunk <- list(1:10, 11:20, 21:30)
for(k in 1:length(chunk)){
  print(chunk[k])
  DummyCatcher <- NULL
  for(i in chunk[[k]]){
    print("i load something")
    dummy <- 1
Thanks for your answer,
yes, I tried 'gc()'; it did not change the behavior.
Best, Peter
On 21.12.2012 13:37, jim holtman wrote:
have you tried putting calls to 'gc' at the top of the first loop to
make sure memory is reclaimed? You can print the call to 'gc' to see
how fast it is growing.
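A minimal sketch of that suggestion (the allocation is a placeholder for the real per-cycle work):

```r
for (k in 1:3) {
  print(gc())        # the "used" columns show whether memory is reclaimed
  x <- runif(1e6)    # placeholder allocation for this cycle
  rm(x)
}
```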
On Thu, Dec 20, 2012 at 6:26 PM, Peter Meissner wrote:
> Hey,
>
> I have an double loop like this:
>
>
> chunk <- list(1:10, 11:20, 21:
Hey,
I have a double loop like this:
chunk <- list(1:10, 11:20, 21:30)
for(k in 1:length(chunk)){
  print(chunk[k])
  DummyCatcher <- NULL
  for(i in chunk[[k]]){
    print("i load something")
    dummy <- 1
    print("i do something")
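Since the message is cut off here, a runnable reconstruction of the loop — the matrix payload is an assumption, standing in for whatever is loaded in the real code:

```r
chunk <- list(1:10, 11:20, 21:30)
for (k in seq_along(chunk)) {
  print(chunk[[k]])
  DummyCatcher <- NULL
  for (i in chunk[[k]]) {
    print("i load something")
    dummy <- matrix(1, nrow = 100, ncol = 100)   # assumed payload
    print("i do something")
    DummyCatcher <- rbind(DummyCatcher, dummy)   # the growing object
  }
}
```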