I'm using Spark 1.1 and the provided EC2 scripts to start my cluster (r3.8xlarge machines). From the spark-shell, I can verify that the environment variables are set:

scala> System.getenv("SPARK_LOCAL_DIRS")
res0: String = /mnt/spark,/mnt2/spark

However, when I look on the workers, the /mnt/spark and /mnt2/spark directories do not exist. Am I missing something? Has anyone else noticed this? A colleague started a cluster (using the same EC2 scripts) but with m3.xlarge machines, and there both /mnt/spark and /mnt2/spark were created.

Thanks.
Darin.
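
P.S. In case it helps anyone reproduce this, here is a rough check I can run from the spark-shell to see which worker hosts actually have the directories. It's just a sketch: the partition count of 100 is an arbitrary guess on my part, picked to try to land at least one task on every executor.

scala> // report, per worker host, whether each scratch dir exists
scala> sc.parallelize(1 to 100, 100).map { _ =>
     |   val host = java.net.InetAddress.getLocalHost.getHostName
     |   val dirs = Seq("/mnt/spark", "/mnt2/spark")
     |     .map(d => d + "=" + new java.io.File(d).exists)
     |   host + ": " + dirs.mkString(", ")
     | }.distinct.collect().foreach(println)

On my r3.8xlarge cluster every host prints exists=false for both paths, while on the colleague's m3.xlarge cluster I'd expect true.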