Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-13 Thread neil90
You can do df.persist(StorageLevel.MEMORY_AND_DISK); note that cache() takes no arguments, so selecting a storage level requires persist().
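As an illustration of neil90's suggestion: a minimal Scala sketch, assuming a Spark 2.x local session (the app name, local master, and generated DataFrame are made up for the example):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

object PersistExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("PersistExample")
      .master("local[*]")          // local mode, for illustration only
      .getOrCreate()

    val df = spark.range(0L, 1000000L).toDF("id")

    // MEMORY_AND_DISK keeps partitions in memory when they fit and spills
    // the remainder to disk, instead of dropping them and recomputing on
    // each access the way MEMORY_ONLY does.
    df.persist(StorageLevel.MEMORY_AND_DISK)
    df.count()                     // an action materializes the cache

    spark.stop()
  }
}
```

Persisting with a disk-backed level trades some I/O for a smaller memory footprint, which is the usual first step when a cached DataFrame is contributing to heap pressure.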

Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar
> Memory is close to half of 16gb available.

Re: Spark Java Heap Error

2016-09-13 Thread neil90
Double-check your driver memory in the Spark Web UI; make sure the driver memory is close to half of the 16 GB available.
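On the 16 GB machine discussed in this thread, that advice corresponds to something like the following spark-defaults.conf entry (the 8g figure is an illustrative value following the "about half of available RAM" rule of thumb, not a tuned recommendation):

```
# spark-defaults.conf
spark.driver.memory   8g
```

The driver's configured memory is visible in the Web UI (typically at http://localhost:4040) on the Executors page.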

Re: Spark Java Heap Error

2016-09-12 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-09 Thread Baktaawar
> spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value
> You might need to change your spark.driver.maxResultSize settings if you plan on doing a collect on the entire rdd/dataframe.

Re: Spark Java Heap Error

2016-09-07 Thread neil90
2g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value

You might need to change your spark.driver.maxResultSize setting if you plan on doing a collect on the entire RDD/DataFrame.
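Pulling those settings together, a sketch of how they might look in spark-defaults.conf (values are illustrative only; spark.driver.maxResultSize defaults to 1g, and a collect() whose serialized results exceed it is aborted by the driver):

```
# spark-defaults.conf -- illustrative values, not tuned recommendations
spark.executor.extraJavaOptions   -XX:+PrintGCDetails -Dkey=value
# Raise this if a collect() fails with "Total size of serialized results
# ... is bigger than spark.driver.maxResultSize".
spark.driver.maxResultSize        2g
```

Printing GC details helps diagnose whether the heap error comes from genuine memory pressure or from an undersized result-size limit on the driver.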