You can do df.persist(StorageLevel.MEMORY_AND_DISK) so that partitions that
do not fit in memory spill to disk. Note that cache() takes no arguments and
always uses the default storage level; persist() is the method that accepts
an explicit StorageLevel.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Java-Heap-Error-tp27669p27707.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Double-check your driver memory in the Spark Web UI and make sure the driver
memory is close to half of the 16 GB available.
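For example, that advice could be applied in spark-defaults.conf or on the spark-submit command line; the 8g value below is simply half of the 16 GB mentioned above, not a figure taken from this thread:

```
# spark-defaults.conf
spark.driver.memory   8g

# or equivalently at submit time:
#   spark-submit --driver-memory 8g ...
```

The Environment tab of the Spark Web UI shows the value the driver actually picked up, which is the number worth verifying.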
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Java-Heap-Error-tp27669p27704.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Java-Heap-Error-tp27669p27696.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
2g
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value
You might need to change your spark.driver.maxResultSize setting if you
plan on doing a collect() on the entire RDD/DataFrame.
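spark.driver.maxResultSize caps the total size of serialized results that a single action (such as collect()) may bring back to the driver; exceeding it aborts the job. A hedged sketch of the setting (the 4g value is illustrative, not from this thread):

```
# spark-defaults.conf
spark.driver.maxResultSize   4g
# 0 disables the limit, but then a large collect() can OOM the driver
# outright instead of failing with a clear error message.
```

If the goal is just to iterate over all rows, df.toLocalIterator() fetches one partition at a time and avoids holding the whole result in driver memory; writing the result out with df.write instead of collecting it is safer still.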
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Java-Heap-