Dear gbm users,
When running predict.gbm() on a large dataset (150,000 rows, 300 columns,
500 trees), I notice that the memory used by R grows beyond reasonable
limits; my 14 GB of RAM are often not sufficient. I am interpreting this as
a memory leak, since there should be no reason for prediction to keep
expanding memory.
All of the memory allocations for predictions use R's allocVector(); see
gbm_pred in gbmentry.cpp. That is, R's memory manager is doing all the
work. gbm_pred does not allocate memory separately from R; it just
creates R objects, within R, that can be deleted or garbage collected.
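For readers who have not looked at the C side, a minimal sketch of that
allocation pattern might look like the code below. This is a simplified
illustration, not the actual gbm_pred code; the function name toy_pred and
the doubling "prediction" are made up for the example.

#include <R.h>
#include <Rinternals.h>

/* Illustrative toy, NOT the actual gbm_pred code: a .Call entry point
 * that allocates its result with R's allocVector(), so the returned
 * vector is an ordinary R object owned by R's memory manager and
 * reclaimable by the garbage collector once nothing references it. */
SEXP toy_pred(SEXP x)
{
    R_xlen_t n = XLENGTH(x);

    /* The result lives on R's heap; PROTECT keeps it alive while we fill it. */
    SEXP ans = PROTECT(allocVector(REALSXP, n));
    double *px = REAL(x);
    double *pa = REAL(ans);

    for (R_xlen_t i = 0; i < n; i++)
        pa[i] = 2.0 * px[i];   /* placeholder "prediction" */

    UNPROTECT(1);   /* release the protection; R now manages the object's lifetime */
    return ans;
}

From R, after compiling this into a shared library, it would be invoked
with .Call("toy_pred", as.double(x)), and the returned vector behaves like
any other R vector with respect to rm() and gc().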
Make sure