When I was testing Spark, I faced this same issue. It is not caused by a
memory shortage; it is a configuration problem. Try passing your current
JAR to the SparkContext with SparkConf's setJars function and try again.
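
Something along these lines worked for me (a rough sketch; the app name,
master URL, and JAR path below are just placeholders, so substitute your
own values):

    import org.apache.spark.{SparkConf, SparkContext}

    // Tell Spark where to find your application JAR so it can be
    // shipped to the executors. The path is a placeholder for the
    // JAR your own build produces (e.g. via sbt package).
    val conf = new SparkConf()
      .setAppName("MyApp")                  // placeholder app name
      .setMaster("spark://master:7077")     // your cluster's master URL
      .setJars(Seq("target/scala-2.10/myapp_2.10-1.0.jar"))
    val sc = new SparkContext(conf)

This explains why it works in the shell but not standalone: the shell
wires up the context for you, while a standalone app has to tell the
executors where its classes live.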

On Thu, Apr 24, 2014 at 8:38 AM, wxhsdp <wxh...@gmail.com> wrote:

> by the way, the code runs OK in the spark shell
>
