Hi,

Profiling shows that 65-70% of my program's running time is spent inside a single function. This is not surprising, as the function sits inside an optimize() call inside a loop (this is a dynamic programming problem). I would like to speed it up.
The function does very little: it takes a single argument, evaluates a spline at that argument, and does some simple arithmetic with the result (adding constants, multiplication). Since R is a functional programming language, I implemented this by calling several small functions inside the function:

  ## RHS of Bellman equation
  f <- function(knext, k, ei) {
    util(consf(knext, k)) + quickeval(knext, gridsecpp, Vkbarpp)
  }

where quickeval evaluates a spline (stored in pp-form as Vkbarpp, on the grid gridsecpp) at knext, and util and consf are functions in the enclosing environment:

  ## consumption
  consf <- function(knext, k) {
    rp*k + W + knext*A
  }

A, W, and rp are constants in the environment. I then call

  optimize(f, lower=..., upper=..., k=...)

to find the maximum.

Questions:

1. Does calling a function carry significant overhead in R? If so, I would rewrite f as a single function with everything inlined. I tried to test this:

  > f <- function(x) 1 + x
  > g <- function(x) f(x)
  > x <- rnorm(1e6)
  > system.time(sapply(x, f))
  [1] 11.315  0.157 11.735  0.000  0.000
  > system.time(sapply(x, g))
  [1]  8.850  0.140  9.283  0.000  0.000
  > system.time(for (i in seq_along(x)) f(x[i]))
  [1]  2.466  0.036  2.884  0.000  0.000
  > system.time(for (i in seq_along(x)) g(x[i]))
  [1]  3.548  0.045  4.165  0.000  0.000

but I find the results hard to interpret: the extra call layer shows a clear cost in the for-loop timings, yet with sapply the ordering is reversed, so something strange (at least to my limited knowledge) is happening there.

2. Do calls to .C or .Fortran carry a large per-call overhead? If they do not, I would recode f in C or Fortran.

Thanks,

Tamas
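
P.S. For concreteness, here is roughly what I mean by "a single function" in question 1: consf is inlined into f, and log() stands in for util purely for illustration (the actual util is a separate, more involved function); quickeval stays as a call since its body is longer. A sketch only:

  ## sketch: f with consf inlined; log() is a stand-in for util
  f_inline <- function(knext, k, ei) {
    log(rp*k + W + knext*A) + quickeval(knext, gridsecpp, Vkbarpp)
  }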
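
A cleaner way to isolate call overhead from sapply's own per-element overhead may be to time the same loop with and without the extra call layer, so that the difference is attributable to the added calls alone:

  ## sketch: estimating the cost of one extra function call
  f <- function(x) 1 + x
  g <- function(x) f(x)              # one extra call layer on top of f
  n <- 1e6
  system.time(for (i in 1:n) f(1))   # n calls
  system.time(for (i in 1:n) g(1))   # 2n calls
  ## (elapsed time of the g-loop - elapsed time of the f-loop) / n
  ## approximates the per-call overhead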
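
And for question 2, the interface I have in mind would look something like this (a sketch: the routine name fC and its C signature are hypothetical; the routine would be compiled with R CMD SHLIB and loaded with dyn.load):

  ## sketch: wrapping a hypothetical compiled routine via .C
  ## C side (hypothetical): void fC(double *knext, double *k, double *ans)
  dyn.load("fC.so")
  fc <- function(knext, k, ei) {
    .C("fC", as.double(knext), as.double(k), ans = double(1))$ans
  }
  ## then optimize(fc, lower=..., upper=..., k=...) as before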