Re: Spark logging questions

2019-06-08 Thread Jacek Laskowski
Hi, what are "the spark driver and executor threads information" and "spark application logging"? Spark uses log4j, so set up the logging levels appropriately and you should be done. Best regards, Jacek Laskowski https://about.me/JacekLaskowski The Internals of Spark SQL
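
For reference, a minimal sketch of what "set up logging levels" means in practice with log4j 1.x. The file location and the com.example.myapp package are placeholders; the layout follows the log4j.properties.template shipped with Spark.

    # conf/log4j.properties (sketch): console appender plus per-package levels
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Quiet Spark itself, keep the application package verbose
    log4j.logger.org.apache.spark=WARN
    log4j.logger.com.example.myapp=INFO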

Spark logging questions

2019-06-07 Thread test test
Hello, how can we dump the Spark driver and executor thread information in the Spark application logs? PS: submitting the Spark job using spark-submit. Regards, Rohit
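
One way to do this, sketched under the assumption that log4j is the logging backend: dump Thread.getAllStackTraces through a logger, calling it directly in the driver and inside a task to reach the executors. The ThreadDump name below is made up for illustration.

    import org.apache.log4j.Logger
    import scala.collection.JavaConverters._

    // Sketch: log every JVM thread and its stack trace through log4j.
    // Call it in the driver directly; wrap it in
    //   rdd.foreachPartition { _ => ThreadDump.dump() }
    // to capture a given executor's threads in that executor's log.
    object ThreadDump {
      private lazy val log = Logger.getLogger(getClass)

      def dump(): Unit = {
        Thread.getAllStackTraces.asScala.foreach { case (thread, frames) =>
          log.info(s"Thread ${thread.getName} (state=${thread.getState})")
          frames.foreach(frame => log.info(s"    at $frame"))
        }
      }
    }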

RE: submitting a spark job using yarn-client and getting NoClassDefFoundError: org/apache/spark/Logging

2016-11-16 Thread David Robison
I’ve gotten a little further along. It now submits the job via YARN, but the job now exits immediately with the following error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging at java.lang.ClassLoader.defineClass1(Native Method) …
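
This error usually means some jar on the classpath was built against Spark 1.x, where org.apache.spark.Logging was a public trait; in Spark 2.x it moved to org.apache.spark.internal and became private, so 1.x-built dependencies fail at class-loading time. A sketch of the usual fix in sbt, with placeholder version and module names: align every Spark artifact on the version the cluster actually runs.

    // build.sbt (sketch): keep every Spark artifact on the cluster's Spark version
    val sparkVersion = "2.0.2"  // placeholder: match the cluster

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core"      % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql"       % sparkVersion % "provided",
      // Replace 1.x-only modules (e.g. spark-streaming-kafka) with their 2.x
      // counterparts (e.g. spark-streaming-kafka-0-10) instead of mixing versions.
      "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
    )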

Re: NoClassDefFoundError: org/apache/spark/Logging in SparkSession.getOrCreate

2016-10-17 Thread Saisai Shao
.master("local") > .appName("DecisionTreeExample") > .getOrCreate(); > > Running this in the eclipse debugger, execution fails in getOrCreate() > with this exception > > Exception in t

NoClassDefFoundError: org/apache/spark/Logging in SparkSession.getOrCreate

2016-10-15 Thread Brad Cox
… NoClassDefFoundError: org/apache/spark/Logging at java.lang.ClassLoader.defineClass1(Native Method) at java.lang.ClassLoader.defineClass(ClassLoader.java:800) at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) at java.net.URLClassLoader.defineClass(…)

Spark Logging : log4j.properties or log4j.xml

2016-08-24 Thread John Jacobs
One can specify "-Dlog4j.configuration=log4j.properties" or "-Dlog4j.configuration=log4j.xml". Is there any preference for using one over the other? All the Spark documentation talks about using "log4j.properties" only (http://spark.apache.org/docs/latest/configuration.html#configuring-logging), so is only "log4j.properties" supported?
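
As far as I know, log4j 1.x accepts either form: if the -Dlog4j.configuration value ends in .xml it is handed to the DOMConfigurator, otherwise to the PropertyConfigurator; the Spark docs simply only show the properties variant. A sketch of passing the option to both driver and executors with spark-submit (paths and the main class are placeholders):

    spark-submit \
      --class com.example.MyApp \
      --files /local/path/log4j.properties \
      --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/local/path/log4j.properties" \
      --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
      myapp.jar

On the executors, the file shipped via --files lands in the container's working directory, which is why the executor option can reference it by bare name.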

Spark logging

2016-07-10 Thread SamyaMaiti
… executors. Is it feasible? I am using org.apache.log4j.Logger. Regards, Sam

spark logging best practices

2016-07-08 Thread vimal dinakaran
Hi, http://stackoverflow.com/questions/29208844/apache-spark-logging-within-scala What is the best way to capture Spark logs without getting a task not serializable error? The above link has various workarounds. Also, is there a way to dynamically set the log level while the application is running?
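
The workaround most of those answers converge on is keeping the logger out of the serialized closure, for example behind a @transient lazy val so each executor builds its own instance; and since Spark 1.4 the level can be changed at runtime with sc.setLogLevel. A sketch (the LazyLogging and Pipeline names are made up):

    import org.apache.log4j.Logger
    import org.apache.spark.rdd.RDD

    // Sketch: the logger is @transient and lazy, so it is not serialized with
    // the closure; each executor JVM creates its own instance on first use.
    trait LazyLogging extends Serializable {
      @transient protected lazy val log: Logger = Logger.getLogger(getClass)
    }

    class Pipeline extends LazyLogging {
      def run(rdd: RDD[String]): Unit = {
        rdd.foreachPartition { records =>
          log.info("processing a partition")        // runs on the executor
          records.foreach(r => log.debug(s"record: $r"))
        }
      }
    }

    // Changing the level while the application runs (driver side, Spark 1.4+):
    //   sc.setLogLevel("DEBUG")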

spark logging issue

2014-12-11 Thread Sourav Chandra
Hi, I am using Spark 1.1.0 and setting the properties below while creating the Spark context: spark.executor.logs.rolling.maxRetainedFiles = 10, spark.executor.logs.rolling.size.maxBytes = 104857600, spark.executor.logs.rolling.strategy = size. Even though I am setting the logs to roll over after 100 MB, …
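
For comparison, a sketch of those settings applied through SparkConf before the context is created. As far as I can tell they only affect the executor stdout/stderr files written by the standalone Worker's file appender, so they have no effect under YARN, where the NodeManager owns the container logs.

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch: size-based rolling of executor logs, 100 MB per file, keep 10 files
    val conf = new SparkConf()
      .setAppName("rolling-logs-example")   // placeholder app name
      .set("spark.executor.logs.rolling.strategy", "size")
      .set("spark.executor.logs.rolling.size.maxBytes", "104857600")
      .set("spark.executor.logs.rolling.maxRetainedFiles", "10")

    val sc = new SparkContext(conf)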

Deadlock between spark logging and wildfly logging

2014-11-28 Thread Charles
… between the Spark logging thread and the WildFly logging thread. Can I control the Spark logging in the driver application? How can I turn it off in the driver application? How can I control the level of Spark logs in the driver application? 2014-11-27 14:39:26,719 INFO [akka.event.slf4j.Slf4jLogger …
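
Assuming log4j is actually the active backend inside the container (WildFly's JBoss LogManager may intercept logging, so treat this only as a sketch), the Spark and Akka loggers can be raised or silenced from the driver code:

    import org.apache.log4j.{Level, Logger}

    // Sketch: quiet Spark's own logging inside the driver application
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("akka").setLevel(Level.WARN)   // Spark 1.x also logs through Akka
    // Level.OFF silences them entirely:
    // Logger.getLogger("org.apache.spark").setLevel(Level.OFF)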

Re: Deadlock between spark logging and wildfly logging

2014-11-28 Thread Sean Owen
… between the Spark logging thread and the WildFly logging thread. Can I control the Spark logging in the driver application? How can I turn it off in the driver application? How can I control the level of Spark logs in the driver application? 2014-11-27 14:39:26,719 INFO …

Re: Deadlock between spark logging and wildfly logging

2014-11-28 Thread Charles

RE: Spark logging strategy on YARN

2014-07-07 Thread Andrew Lee

Spark logging strategy on YARN

2014-07-03 Thread Kostiantyn Kudriavtsev
Hi all, could you please share your best practices on writing logs in Spark? I’m running it on YARN, so when I check logs I’m a bit confused… Currently, I’m writing System.err.println to put a message in the log and access it via the YARN history server. But I don’t like this way… I’d like to use …
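
A common alternative, sketched below with placeholder names: log through log4j (or the slf4j API) instead of System.err.println, then read the aggregated container logs with "yarn logs -applicationId <appId>" or through the history server, assuming YARN log aggregation is enabled.

    import org.apache.log4j.Logger

    // Sketch: a plain log4j logger; its output ends up in the container logs,
    // which "yarn logs -applicationId <appId>" collects after the application
    // finishes (when YARN log aggregation is enabled).
    object MyJob {                       // placeholder object name
      private val log = Logger.getLogger(getClass)

      def main(args: Array[String]): Unit = {
        log.info("job started")
        // ... job body ...
        log.info("job finished")
      }
    }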

Spark, Logging Issues: slf4j or log4j

2014-07-02 Thread Shivani Rao
Hello Spark fans, I am unable to figure out how Spark decides which logger to use. I know that Spark decides this at the time the Spark context is initialized. From the Spark documentation it is clear that Spark uses log4j, and not slf4j, but I have been able to successfully get Spark …

Centralized Spark Logging solution

2014-06-24 Thread Robert James
We need a centralized Spark logging solution. Ideally, it should:
* Allow any Spark process to log at multiple levels (info, warn, debug) using a single line, similar to log4j
* Send all logs to a central location - so, to read the logs, we don't need to check each worker by itself …
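
One way to get most of that with plain log4j 1.x, sketched with a placeholder host and facility: attach a SyslogAppender (or a SocketAppender) in the log4j configuration used by the driver and every executor, so each process also ships its messages to one central collector. The file still has to reach the executors, e.g. via spark-submit --files.

    # log4j.properties (sketch): local console plus a central syslog collector
    log4j.rootCategory=INFO, console, central

    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d %p %c{1}: %m%n

    # SyslogHost and Facility below are placeholders
    log4j.appender.central=org.apache.log4j.net.SyslogAppender
    log4j.appender.central.SyslogHost=logs.example.com
    log4j.appender.central.Facility=LOCAL1
    log4j.appender.central.layout=org.apache.log4j.PatternLayout
    log4j.appender.central.layout.ConversionPattern=%d %p %c{1}: %m%n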

Re: Spark Logging

2014-06-10 Thread Surendranauth Hiraman
…/configuration.html.