Subject: Re: Did anybody come across this random-forest issue with spark 2.0.1.
Please increase the value of "maxMemoryInMB" on your RandomForestClassifier
or RandomForestRegressor.
It's a warning which will not affect the result, but it may slow down your training.
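For reference, a minimal sketch of raising `maxMemoryInMB` (default 256) on a `RandomForestClassifier`; the DataFrame name `trainingData` and the column names here are assumptions for illustration:

```scala
import org.apache.spark.ml.classification.RandomForestClassifier

// maxMemoryInMB caps the memory used to aggregate sufficient statistics
// per iteration; raising it lets more tree nodes be split at once and
// silences the memory warning.
val rf = new RandomForestClassifier()
  .setLabelCol("label")         // assumed column name
  .setFeaturesCol("features")   // assumed column name
  .setNumTrees(100)
  .setMaxMemoryInMB(1024)       // raised from the 256 MB default

// val model = rf.fit(trainingData)  // trainingData: an assumed DataFrame
```

The setter only trades memory for speed; it does not change the fitted model.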
Subject: Re: Did anybody come across this random-forest issue with spark 2.0.1.
Did you also upgrade Java from v7 to v8?
On Mon, Oct 17, 2016 at 7:19 PM 张建鑫 (Marketing Dept.)
<zhangjian...@didichuxing.com> wrote:
Hi,
Below is what I typed at my spark-shell command line based on your first email;
the result is different from yours. Just for your reference.
My Spark version is 1.6.1.
import org.apache.spark.ml.feature._
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.mllib