Created a JIRA to track it: https://issues.apache.org/jira/browse/SPARK-4664
Best Regards,
Shixiong Zhu
2014-12-01 13:22 GMT+08:00 Shixiong Zhu :
Sorry, it should be less than 2048: 2047 is the greatest value.
Best Regards,
Shixiong Zhu
2014-12-01 13:20 GMT+08:00 Shixiong Zhu :
4096MB, converted to bytes, is greater than Int.MaxValue, so the value overflows in Spark.
Please set it to less than 4096.
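To illustrate the overflow (this sketch is mine, not from the thread; it assumes the MB value is converted to bytes in 32-bit int arithmetic, which is what makes large frame sizes wrap around):

```java
public class FrameSizeOverflow {
    public static void main(String[] args) {
        int bytesPerMb = 1024 * 1024;

        // 2047 MB in bytes still fits in a signed 32-bit int.
        int ok = 2047 * bytesPerMb;           // 2146435072 < Int.MaxValue

        // 2048 MB is exactly 2^31 bytes: wraps to Integer.MIN_VALUE.
        int wrap2048 = 2048 * bytesPerMb;     // -2147483648

        // 4096 MB is 2^32 bytes: wraps all the way around to 0.
        int wrap4096 = 4096 * bytesPerMb;     // 0

        System.out.println(ok);        // 2146435072
        System.out.println(wrap2048);  // -2147483648
        System.out.println(wrap4096);  // 0
    }
}
```

This is why 2047 is the largest usable value: it is the biggest whole number of megabytes whose byte count stays below Int.MaxValue (2147483647).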
Best Regards,
Shixiong Zhu
2014-12-01 13:14 GMT+08:00 Ke Wang :
I met the same problem. Did you solve it?
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/spark-akka-frameSize-setting-problem-tp3416p20063.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi,
I just ran a simple example to generate some data for the ALS algorithm. My
Spark version is 0.9, running in local mode, and my node has 108G of memory.
But when I set conf.set("spark.akka.frameSize", "4096"), the following
problem occurred; when I do not set it, it runs well.
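Following the replies above, a hedged sketch of a configuration that stays within Int.MaxValue once the megabyte value is converted to bytes (the class and app name are hypothetical; it assumes the standard SparkConf API):

```java
import org.apache.spark.SparkConf;

public class ALSDataGen {  // hypothetical driver class
    public static void main(String[] args) {
        // Keep spark.akka.frameSize at 2047 MB or below:
        // 2047 * 1024 * 1024 bytes still fits in a signed 32-bit int,
        // while 2048 MB and above wraps around.
        SparkConf conf = new SparkConf()
            .setMaster("local")
            .set("spark.akka.frameSize", "2047");
    }
}
```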