On 13 December 2012 17:52, Simon Urbanek <simon.urba...@r-project.org> wrote:
> Because it's far beyond what you can handle without changing a lot of other
> things. 500k expressions will require at least about 320Mb of stack (!) in
> the eval() chain alone -- compare that to the 8Mb stack size which is default
> in most OSes, so you'll hit the wall way before that limit is reached.
>
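Simon's point can be checked from R itself: `Cstack_info()` (base R) reports the C stack size, and a simple self-recursive function fails long before 500k frames. A minimal sketch (the function `depth` is an illustrative helper, not from the thread):

```r
# Why deep recursion costs memory: each unfinished call keeps a frame
# (arguments, locals, return point), so depth is bounded by stack space.
# Cstack_info() in base R reports the C stack size and current usage.
print(Cstack_info())

# Illustrative self-recursive counter: fine at modest depth.
depth <- function(n) if (n <= 0L) 0L else 1L + depth(n - 1L)
stopifnot(depth(500L) == 500L)

# At large depth it errors long before 500k frames -- first at R's own
# nesting limit (options(expressions=), default 5000), and, if that is
# raised, at the OS C stack limit described above.
res <- tryCatch(depth(500000L), error = function(e) conditionMessage(e))
print(res)
```

The `tryCatch` turns the expected failure into a message instead of aborting the session, so the depth limit can be observed safely.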
Thank you for the explanation. Sorry if this is a naive question, but why does one need a stack at all? I thought a function pointing to itself had no memory cost. Is this about how interpreters are designed, or about R being a dynamic language?

>> While some algorithms are only feasibly solvable with recursion, and
>> 500K does not sound like too much -- graph algorithms, for example
>> dependency trees with large nodes, easily reach that number.
>>
> I don't see how "large nodes" have anything to do with this since we are
> talking about expression depth, not about sizes of any kind.
> Again, in any realistic example you'll hit other limits first anyway.

I was thinking of a very big tree with large depth, where each recursion step corresponds to one node. I am not sure what application would actually need a depth of a million; perhaps a genetic algorithm.

Cheers,
Mehmet

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
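To make the "why a stack" question concrete: every call that has not yet returned needs its own arguments, locals, and return point, so the interpreter must keep one frame per unfinished call. The standard workaround for million-deep trees is to replace recursion with an explicit, heap-allocated stack. A sketch in R, assuming (as an illustration, not from the thread) a tree encoded as nested lists:

```r
# Iterative depth of a nested-list tree using an explicit work stack,
# so the reachable depth is bounded by heap memory, not the C stack.
# (tree_depth is a hypothetical helper for illustration.)
tree_depth <- function(tree) {
  stack <- list(list(node = tree, d = 1L))  # heap-allocated stack
  max_d <- 0L
  while (length(stack) > 0L) {
    top <- stack[[length(stack)]]
    stack[[length(stack)]] <- NULL          # pop
    max_d <- max(max_d, top$d)
    if (is.list(top$node)) {
      for (child in top$node) {             # push children
        stack[[length(stack) + 1L]] <- list(node = child, d = top$d + 1L)
      }
    }
  }
  max_d
}

# A 100,000-deep nested list: far past R's recursion limits,
# but the iterative version handles it.
deep <- 1L
for (i in seq_len(100000L)) deep <- list(deep)
stopifnot(tree_depth(deep) == 100001L)
```

The recursive structure of the traversal is unchanged; only the bookkeeping moves from the C stack to an ordinary R list, which grows on the heap.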