I'm running a local Spark master ("local[n]").

I cannot seem to turn off the Parquet logging. I tried:

1) Setting a log4j.properties on the classpath.
2) Setting a log4j.properties file in a Spark install's conf directory and
pointing to the install using setSparkHome.
3) Editing the log4j-default.properties file in the spark-core jar that I'm
using.
4) Changing JAVA_HOME/jre/lib/logging.properties (since Parquet uses
java.util.logging).
5) Adding the following code as the first lines of my main:

      java.util.logging.Logger.getLogger("parquet").addHandler(
      new java.util.logging.Handler() {
        def close(): Unit = {}
        def flush(): Unit = {}
        def publish(x: java.util.logging.LogRecord): Unit = {}
      })
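For reference, the kind of entries attempt 4 relies on would look something
like the following (a sketch only; the exact logger names Parquet registers
under are an assumption on my part):

```properties
# Sketch of JAVA_HOME/jre/lib/logging.properties additions.
# Logger names here ("parquet", "org.apache.parquet") are assumed.
parquet.level = OFF
org.apache.parquet.level = OFF
# The root ConsoleHandler still prints anything at or above this level:
java.util.logging.ConsoleHandler.level = SEVERE
```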

NOTHING changes Parquet's default console log output when it runs in a
worker.
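In case it helps anyone reproduce this, attempt 5 can be taken further by
stripping the existing handlers and disabling both the level and parent
propagation on the "parquet" logger. This is a minimal sketch of that idea,
assuming local[n] mode (driver and workers share one JVM, so configuring the
logger in main should reach the workers); one caveat I'm not sure about is
whether Parquet installs its own handler only after its first log call, which
would undo this. Note also that j.u.l. holds loggers weakly, so a strong
reference must be kept or the settings can be garbage-collected:

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.Logger;

public class SilenceParquet {
    // Keep a strong reference: j.u.l. loggers are weakly referenced, and
    // without this the configured logger (and its settings) can be GC'd.
    private static final Logger PARQUET = Logger.getLogger("parquet");

    // Silence a j.u.l. logger subtree: drop its handlers, refuse all
    // records, and stop forwarding to the root console handler.
    static void silence(Logger logger) {
        for (Handler h : logger.getHandlers()) {
            logger.removeHandler(h);
        }
        logger.setLevel(Level.OFF);          // reject records at every level
        logger.setUseParentHandlers(false);  // don't bubble up to the root
    }

    public static void main(String[] args) {
        silence(PARQUET);
        // Child loggers such as "parquet.hadoop" inherit the OFF level.
        Logger.getLogger("parquet.hadoop").info("should not appear");
        System.out.println("isLoggable(INFO) = "
                + Logger.getLogger("parquet").isLoggable(Level.INFO));
        // prints: isLoggable(INFO) = false
    }
}
```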



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-do-I-turn-off-Parquet-logging-in-a-worker-tp18955.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
