Re: Spark streaming job is taking up /tmp at 100%

2016-02-19 Thread Holden Karau
That's a good question. You can find most of what you are looking for in the configuration guide at http://spark.apache.org/docs/latest/configuration.html - you probably want to change spark.local.dir to point to your scratch directory. Out of interest, what problems have you been seeing with YARN?
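A minimal sketch of the suggested fix: pass `spark.local.dir` at submit time so shuffle and spill files land on a filesystem with room instead of /tmp. The script name and scratch path below are placeholders, not from the original thread.

```shell
# Redirect Spark's scratch (shuffle/spill) files away from /tmp.
# /data/spark-scratch is an example path; use any location with ample space.
spark-submit \
  --master local[*] \
  --conf spark.local.dir=/data/spark-scratch \
  my_streaming_job.jar
```

The same setting can also go in `spark-defaults.conf` (`spark.local.dir /data/spark-scratch`); note that under YARN this property is overridden by the node manager's own local directories.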

Spark streaming job is taking up /tmp at 100%

2016-02-19 Thread Sutanu Das
We have a Spark streaming job, and when running in LOCAL mode it fills /tmp to 100% and fails with the error below. This doesn't happen in YARN mode, but in YARN we have performance issues. How can I redirect the Spark local shuffle from /tmp to /other_filesystem_location (where we have lots of space)?