Spark is a research language that does not work, as I've discovered and discussed with you before. The max stack usage cannot be determined at compile time; again, this is the halting problem.
What?! It's easily solvable: forbid recursion and indirect function calls, and the program is guaranteed to need only a fixed-size stack, so you can compute an upper bound on the required stack size at compile time. That is, by the way, exactly what OpenCL does, since GPUs tend to have no stacks. In what way is Spark a "research language that does not work"? And how many language design issues need to be discovered until you admit that Safe-D is a "research language that does not work"?
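To make that concrete, here is a minimal sketch (not SPARK's actual analysis) of why the bound is computable: with recursion and indirect calls forbidden, the static call graph is a finite DAG, so worst-case stack usage is just the heaviest path through it. The function names, frame sizes, and call graph below are hypothetical inputs; a real compiler would derive them from code generation.

```python
from functools import lru_cache

# Hypothetical per-function stack frame sizes in bytes
# (a real compiler would know these after codegen).
FRAME_SIZE = {"main": 64, "parse": 128, "eval": 96, "emit": 32}

# Hypothetical static call graph: caller -> possible callees.
# The "no recursion, no indirect calls" rule guarantees this is
# a finite DAG, so the traversal below always terminates.
CALLS = {
    "main": ["parse", "emit"],
    "parse": ["eval"],
    "eval": [],
    "emit": [],
}

@lru_cache(maxsize=None)
def max_stack(func: str) -> int:
    """Worst-case stack usage starting at func: own frame plus deepest callee."""
    deepest_callee = max((max_stack(c) for c in CALLS.get(func, [])), default=0)
    return FRAME_SIZE[func] + deepest_callee

if __name__ == "__main__":
    # Heaviest path: main (64) -> parse (128) -> eval (96) = 288 bytes.
    print(max_stack("main"))
```

Note that the analyzer itself may recurse; the restriction only has to hold for the program being analyzed, which is what makes the call graph acyclic and the bound well-defined.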
