Hi,

Spark version: spark-1.5.2-bin-hadoop2.6, using PySpark.

I am running a machine learning program, which runs perfectly when I specify 2G
for --driver-memory.
However, with the default 1G the program cannot run; the driver crashes with an
OOM error.
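For context, this is roughly how I launch the job (script name is a placeholder; the only change between the failing and working runs is the --driver-memory setting):

```shell
# Fails with the default 1G driver heap:
#   spark-submit my_ml_job.py
#
# Works when the driver heap is raised to 2G:
spark-submit --driver-memory 2g my_ml_job.py
```

The same setting can also be put in conf/spark-defaults.conf as `spark.driver.memory 2g` instead of passing it on every submit.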

What is the recommended configuration for --driver-memory? Please suggest.

Thanks and regards,
Anand.
