Stati, change SPARK_REPL_OPTS to SPARK_SUBMIT_OPTS and try again. I hit the same issue, and making this change fixed it for me. Looking at the spark-shell script under the bin directory, I found that it reads SPARK_SUBMIT_OPTS, not SPARK_REPL_OPTS.
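As a minimal sketch, the variable can also be exported before launching the shell instead of being prefixed on the command line (the master URL and package coordinates are placeholders taken from your command; note that -XX:MaxPermSize only applies to Java 7 and earlier, since PermGen was removed in Java 8):

```shell
# Raise the PermGen cap for the driver JVM that bin/spark-shell launches.
# Pre-Java-8 JVMs only; the flag is ignored or rejected on Java 8+.
export SPARK_SUBMIT_OPTS="-XX:MaxPermSize=256m"

# Confirm the option is set in the environment spark-shell will inherit:
echo "$SPARK_SUBMIT_OPTS"

# Then launch as before (placeholder master URL and cores):
# bin/spark-shell --master spark://machu:7077 --total-executor-cores 12 \
#   --packages com.databricks:spark-csv_2.10:1.0.3,joda-time:joda-time:2.8.1
```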
SPARK_SUBMIT_OPTS="-XX:MaxPermSize=256m" bin/spark-shell --master spark://machu:7077 --total-executor-cores 12 --packages com.databricks:spark-csv_2.10:1.0.3,joda-time:joda-time:2.8.1

(Note: repeating --packages overrides the earlier value, so pass all coordinates as one comma-separated list.)

-SparklineData

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-OutOfMemoryError-PermGen-space-tp23472p23702.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.