Hao --
Did you ever figure this out? I just ran into the same issue, changed
spark-env.sh and got it working--but I'd much rather keep this configuration
in my application code.
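(A minimal sketch of the spark-env.sh workaround described above, assuming a standalone cluster where each worker reads conf/spark-env.sh; the path /mnt/spark is an example, not from the thread. In standalone mode the worker exports SPARK_LOCAL_DIRS into each executor's environment, and that environment variable takes precedence over spark.local.dir set via SparkConf in the driver, which would explain why the application-side setting appears to be ignored.)

```shell
# conf/spark-env.sh on every worker node
# (example mount point -- point this at your attached EBS volume)
# SPARK_LOCAL_DIRS set here overrides any spark.local.dir value
# configured in the driver's SparkConf.
export SPARK_LOCAL_DIRS=/mnt/spark
```

After editing the file, the workers need to be restarted for the new environment to reach the executors.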
-- Greg
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/set-spark-local-dir-o
Hi,
When running Spark on an EC2 cluster, I find that setting spark.local.dir in
the driver program doesn't take effect.
INFO:
- standalone mode
- cluster launched via the Python script shipped with Spark
- instance type: r3.large
- EBS attached (using persistent-hdfs)
- spark version: 1.0.0 prebuilt-hadoop1,sbt do