When I train a GBTRegressor model from a DataFrame on the latest
1.6.4-SNAPSHOT with a high value for the hyper-parameter maxIter, say
500, I get a java.lang.StackOverflowError; GBTRegressor does work with
maxIter set to about 100.
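
For reference, this is roughly the call pattern that triggers it (a
minimal sketch only; the input path and column names are placeholders,
and sqlContext is the usual spark-shell SQLContext):

  import org.apache.spark.ml.regression.GBTRegressor

  // placeholder training data with the standard "features"/"label" columns
  val training = sqlContext.read.parquet("/path/to/training")

  val gbt = new GBTRegressor()
    .setFeaturesCol("features")
    .setLabelCol("label")
    .setMaxIter(500)            // throws java.lang.StackOverflowError; ~100 works

  val model = gbt.fit(training)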

Does this make sense? Are there any known solutions? This is running
within executors with 18G of RAM each; I'm not sure how to debug it,
short of somehow allocating more JVM heap.
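
The only things I can think of trying are checkpointing to keep the
iteration lineage short, and a larger thread stack (this is just a
sketch of what I'd attempt; the checkpoint directory and the -Xss value
are guesses on my part):

  sc.setCheckpointDir("/tmp/gbt-checkpoints")   // placeholder path

  val gbt = new GBTRegressor()
    .setMaxIter(500)
    .setCheckpointInterval(10)                  // checkpoint every 10 iterations

  // submitted with, e.g.:
  //   spark-submit --conf "spark.executor.extraJavaOptions=-Xss16m" \
  //                --driver-java-options "-Xss16m" ...

Is either of these the right direction?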

Thanks!
