Hi, is there any way to control the maximum size of the stderr file when running Spark in Standalone Cluster mode? The file grows huge on every worker and makes the disk run out of space. If there is no way to cap it, can I disable it instead?
Thanks, Abed.
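
For reference, the standalone worker does support rolling executor logs via the `spark.executor.logs.rolling.*` properties (documented in the Spark configuration guide); a sketch of a size-based setup in `conf/spark-defaults.conf` might look like:

```
# Roll executor stdout/stderr by size instead of growing one file forever
spark.executor.logs.rolling.strategy         size
# Roll once a log file exceeds this many bytes (value here is an example)
spark.executor.logs.rolling.maxSize          134217728
# Keep only the N most recent rolled files; older ones are deleted
spark.executor.logs.rolling.maxRetainedFiles 5
```

With `maxRetainedFiles` set, the worker bounds total log disk usage to roughly `maxSize * maxRetainedFiles` per executor; a time-based strategy (`strategy=time` with `spark.executor.logs.rolling.time.interval`) is also available.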