Hi everyone,
every time new data arrives and updates run on our cluster, unwanted
files are created in the workers' directories. To clean them up
automatically, I set the "Spark (Standalone) Client Advanced
Configuration Snippet (Safety Valve) for spark-conf/spark-env.sh" under
Gateway Default Group->Advanced Settings to:

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=60 -Dspark.worker.cleanup.appDataTtl=60"

I made the change through Cloudera Manager.
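
For reference, my reading of the standalone docs is that these properties
are consumed by the Worker daemon itself: spark.worker.cleanup.enabled
turns the cleaner on, spark.worker.cleanup.interval is how often the
worker runs the cleanup (in seconds), and spark.worker.cleanup.appDataTtl
is how long a stopped application's directory is kept (also in seconds);
only directories of stopped applications are ever removed. What I am not
sure about is whether a Gateway (client) safety valve reaches the Worker
daemon at all. A minimal sketch of what I think has to end up in the
spark-env.sh that the Worker reads on each worker host (example values,
untested):

# Worker-side spark-env.sh, not the Gateway/client group's copy.
# Intervals and TTLs are in seconds.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=86400"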
After restarting the cluster, the change does appear in
spark/conf/spark-env.sh, but no cleanup happens. Does anyone know where
the mistake is, or another way of cleaning up automatically?
I am using CDH 4 and Spark 1.2.2 in the cluster.
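
If the properties cannot be made to work, would a cron job on each worker
be an acceptable fallback? A sketch, assuming the work directory is
/var/run/spark/work (adjust to your SPARK_WORKER_DIR) and roughly one day
of retention:

# Hourly: remove app-* directories in the Spark work dir whose
# modification time is more than a day old.
0 * * * * find /var/run/spark/work -maxdepth 1 -type d -name 'app-*' -mtime +1 -exec rm -rf {} +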



