Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
This worked for me locally:
spark-1.4.1-bin-hadoop2.4/bin/spark-submit --conf
spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
--conf
spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
--verbose --class runner.SparkRunner target/simspark-0.1-SNAPSHOT.jar


Now I am going to try it out on our Mesos cluster.
I assumed spark.executor.extraClassPath takes a comma-separated list of
jars the way --jars does, but it should be colon-separated like a regular
classpath.
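
(For anyone who wants to double-check which binding actually won at runtime,
here is a minimal sketch, not from the original commands above, that only uses
the plain SLF4J API and can be run on the same classpath:

    import org.slf4j.ILoggerFactory;
    import org.slf4j.LoggerFactory;

    // Prints the concrete ILoggerFactory implementation chosen by SLF4J.
    // With logback-classic winning it should report
    // ch.qos.logback.classic.LoggerContext; with Spark's bundled
    // slf4j-log4j12 it reports org.slf4j.impl.Log4jLoggerFactory.
    public class BindingCheck {
        public static void main(String[] args) {
            ILoggerFactory factory = LoggerFactory.getILoggerFactory();
            System.out.println("Active SLF4J binding: " + factory.getClass().getName());
        }
    }
)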

Thanks for your help!
-Utkarsh


On Mon, Aug 24, 2015 at 5:05 PM, Utkarsh Sengar utkarsh2...@gmail.com
wrote:

 I get the same error even when I set the SPARK_CLASSPATH: export
 SPARK_CLASSPATH=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.1.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
 And I run the job like this: /spark-1.4.1-bin-hadoop2.4/bin/spark-submit
 --class runner.SparkRunner
 target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

 I am not able to find the code in Spark which adds these jars before the
 Spark classes in the classpath. Or maybe it's a bug. Any suggestions on
 workarounds?

 Thanks,
 -Utkarsh


 On Mon, Aug 24, 2015 at 4:32 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:

 I assumed that's the case because of the error I got and the
 documentation, which says: "Extra classpath entries to append to the
 classpath of the driver."

 This is where I stand now:
 <dependency>
 <groupId>org.apache.spark</groupId>
 <artifactId>spark-core_2.10</artifactId>
 <version>1.4.1</version>
 <exclusions>
 <exclusion>
 <groupId>org.slf4j</groupId>
 <artifactId>slf4j-log4j12</artifactId>
 </exclusion>
 </exclusions>
 </dependency>

 And no exclusions from my logging lib.

 And I submit this task: spark-1.4.1-bin-hadoop2.4/bin/spark-submit
 --class runner.SparkRunner --conf
 spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
 --conf
 spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
 --conf
 spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
 --conf
 spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
 target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

 And I get the same error:
 Caused by: java.lang.ClassCastException:
 org.slf4j.impl.Log4jLoggerFactory cannot be cast to
 ch.qos.logback.classic.LoggerContext
 at
 com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
 at
 com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
 at com.opentable.logging.Log.<clinit>(Log.java:31)
 ... 16 more


 Thanks,
 -Utkarsh

 On Mon, Aug 24, 2015 at 4:11 PM, Marcelo Vanzin van...@cloudera.com
 wrote:

 On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  That didn't work since the extraClassPath flag was still appending the
  jars at the end, so it's still picking the slf4j jar provided by Spark.

 Out of curiosity, how did you verify this? The extraClassPath
 options are supposed to prepend entries to the classpath, and the code
 seems to be doing that. If it's not really doing that in some case,
 it's a bug that needs to be fixed.

 Another option is setting the SPARK_CLASSPATH env variable,
 which is deprecated, but might come in handy in case there is actually
 a bug in handling those options.


 --
 Marcelo




 --
 Thanks,
 -Utkarsh




 --
 Thanks,
 -Utkarsh




-- 
Thanks,
-Utkarsh


Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Marcelo Vanzin
On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar utkarsh2...@gmail.com wrote:
 Now I am going to try it out on our Mesos cluster.
 I assumed spark.executor.extraClassPath takes a comma-separated list of
 jars the way --jars does, but it should be colon-separated like a regular
 classpath.

Ah, yes, those options are just raw classpath strings. Also, they
don't cause jars to be copied to the cluster. You'll need the jar to
be available at the same location on all cluster machines.
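
For example (just a sketch; /opt/logback is a hypothetical directory, and the
two logback jars would have to be copied there on the driver host and on every
Mesos agent beforehand):

    # copy the logback jars to the same path on every machine, then point
    # both extraClassPath settings at that fixed location
    spark-submit \
      --conf spark.driver.extraClassPath=/opt/logback/logback-classic-1.1.2.jar:/opt/logback/logback-core-1.1.2.jar \
      --conf spark.executor.extraClassPath=/opt/logback/logback-classic-1.1.2.jar:/opt/logback/logback-core-1.1.2.jar \
      --class runner.SparkRunner target/simspark-0.1-SNAPSHOT.jar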

-- 
Marcelo




Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
So do I need to manually copy these 2 jars onto my Spark executors?



On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin van...@cloudera.com
wrote:

 On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  Now I am going to try it out on our Mesos cluster.
  I assumed spark.executor.extraClassPath takes a comma-separated list of
  jars the way --jars does, but it should be colon-separated like a regular
  classpath.

 Ah, yes, those options are just raw classpath strings. Also, they
 don't cause jars to be copied to the cluster. You'll need the jar to
 be available at the same location on all cluster machines.

 --
 Marcelo




-- 
Thanks,
-Utkarsh


Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Marcelo Vanzin
On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote:
 So do I need to manually copy these 2 jars on my spark executors?

Yes. I can think of a way to work around that if you're using YARN,
but not with other cluster managers.

 On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin van...@cloudera.com
 wrote:

 On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  Now I am going to try it out on our Mesos cluster.
  I assumed spark.executor.extraClassPath takes a comma-separated list of
  jars the way --jars does, but it should be colon-separated like a regular
  classpath.

 Ah, yes, those options are just raw classpath strings. Also, they
 don't cause jars to be copied to the cluster. You'll need the jar to
 be available at the same location on all cluster machines.


-- 
Marcelo




Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-25 Thread Utkarsh Sengar
Looks like I'm stuck then; I am using Mesos.
Adding these 2 jars to all executors might be a problem for me, so I will
probably try to remove the dependency on the otj-logging lib and just
use log4j.
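
(As a rough sketch of that fallback, assuming the application code only talks
to the SLF4J API: once logback is dropped, the slf4j-log4j12 binding that
Spark already ships gets picked up, and logging code like the following keeps
working unchanged; only the logback-specific pieces, such as the LoggerContext
cast in AssimilateForeignLogging, have to go.

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class SparkRunner {
        // Code written against the SLF4J facade does not care whether
        // logback-classic or slf4j-log4j12 ends up providing the backend.
        private static final Logger LOG = LoggerFactory.getLogger(SparkRunner.class);

        public static void main(String[] args) {
            LOG.info("starting simspark job");
            // ... build the SparkConf/SparkContext and run the job ...
        }
    }
)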

On Tue, Aug 25, 2015 at 2:15 PM, Marcelo Vanzin van...@cloudera.com wrote:

 On Tue, Aug 25, 2015 at 1:50 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  So do I need to manually copy these 2 jars on my spark executors?

 Yes. I can think of a way to work around that if you're using YARN,
 but not with other cluster managers.

  On Tue, Aug 25, 2015 at 10:51 AM, Marcelo Vanzin van...@cloudera.com
  wrote:
 
  On Tue, Aug 25, 2015 at 10:48 AM, Utkarsh Sengar utkarsh2...@gmail.com
 
  wrote:
   Now I am going to try it out on our Mesos cluster.
   I assumed spark.executor.extraClassPath takes a comma-separated list of
   jars the way --jars does, but it should be colon-separated like a regular
   classpath.
 
  Ah, yes, those options are just raw classpath strings. Also, they
  don't cause jars to be copied to the cluster. You'll need the jar to
  be available at the same location on all cluster machines.


 --
 Marcelo




-- 
Thanks,
-Utkarsh


Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
Hi Utkarsh,

Unfortunately that's not going to be easy. Since Spark bundles all
dependent classes into a single fat jar file, to remove that
dependency you'd need to modify Spark's assembly jar (potentially in
all your nodes). Doing that per-job is even trickier, because you'd
probably need some kind of script to inject the correct binding into
Spark's classpath.

That being said, that message is not an error, it's more of a noisy
warning. I'd expect slf4j to use the first binding available - in your
case, logback-classic. Is that not the case?


On Mon, Aug 24, 2015 at 2:50 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote:
 Continuing this discussion:
 http://apache-spark-user-list.1001560.n3.nabble.com/same-log4j-slf4j-error-in-spark-9-1-td5592.html

 I am getting this error when I use logback-classic.

 SLF4J: Class path contains multiple SLF4J bindings.
 SLF4J: Found binding in
 [jar:file:.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 SLF4J: Found binding in
 [jar:file:.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

 I need to use logback-classic for my current project, so I am trying to
 ignore slf4j-log4j12 from spark:
 <dependency>
 <groupId>org.apache.spark</groupId>
 <artifactId>spark-core_2.10</artifactId>
 <version>1.4.1</version>
 <exclusions>
 <exclusion>
 <groupId>org.slf4j</groupId>
 <artifactId>slf4j-log4j12</artifactId>
 </exclusion>
 </exclusions>
 </dependency>

 Now, when I run my job from IntelliJ (which sets the classpath), things work
 perfectly.

 But when I run my job via spark-submit:
 ~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner
 spark-0.1-SNAPSHOT-jar-with-dependencies.jar
 My job fails because spark-submit sets up the classpath and it re-adds the
 slf4j-log4j12.

 I am not adding spark jar to the uber-jar via the maven assembly plugin:
 <dependencySets>
 <dependencySet>
 ..
 <useTransitiveDependencies>false</useTransitiveDependencies>
 <excludes>
 <exclude>org.apache.spark:spark-core_2.10</exclude>
 </excludes>
 </dependencySet>
 </dependencySets>

 So how can I exclude slf4j-log4j12.jar when I submit a job via
 spark-submit (on a per job basis)?

 --
 Thanks,
 -Utkarsh



-- 
Marcelo




Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
Hi Marcelo,

When I add this exclusion rule to my pom:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.4.1</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>

The SparkRunner class works fine (from IntelliJ) but when I build a jar and
submit it to spark-submit:

I get this error:
Caused by: java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory
cannot be cast to ch.qos.logback.classic.LoggerContext
at
com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
at
com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
at com.opentable.logging.Log.<clinit>(Log.java:31)

Which is this here (our logging lib is open sourced):
https://github.com/opentable/otj-logging/blob/master/logging/src/main/java/com/opentable/logging/AssimilateForeignLogging.java#L68
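
(For what it's worth, here is a hedged sketch of the kind of guard that would
avoid the hard cast in an assimilation hook like that one, assuming the
calling code can be changed; when a non-logback binding such as slf4j-log4j12
wins, it skips the logback-specific work instead of throwing:

    import org.slf4j.ILoggerFactory;
    import org.slf4j.LoggerFactory;
    import ch.qos.logback.classic.LoggerContext;

    public final class SafeAssimilation {
        public static void assimilateIfLogback() {
            ILoggerFactory factory = LoggerFactory.getILoggerFactory();
            // Only cast when logback-classic actually provided the factory.
            if (factory instanceof LoggerContext) {
                LoggerContext context = (LoggerContext) factory;
                // ... logback-specific assimilation using 'context' ...
            } else {
                System.err.println("SLF4J binding is " + factory.getClass().getName()
                        + "; skipping logback assimilation");
            }
        }
    }

That doesn't fix the classpath ordering, of course; it just fails soft.)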

Thanks,
-Utkarsh




On Mon, Aug 24, 2015 at 3:04 PM, Marcelo Vanzin van...@cloudera.com wrote:

 Hi Utkarsh,

 Unfortunately that's not going to be easy. Since Spark bundles all
 dependent classes into a single fat jar file, to remove that
 dependency you'd need to modify Spark's assembly jar (potentially in
 all your nodes). Doing that per-job is even trickier, because you'd
 probably need some kind of script to inject the correct binding into
 Spark's classpath.

 That being said, that message is not an error, it's more of a noisy
 warning. I'd expect slf4j to use the first binding available - in your
 case, logback-classic. Is that not the case?


 On Mon, Aug 24, 2015 at 2:50 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  Continuing this discussion:
 
 http://apache-spark-user-list.1001560.n3.nabble.com/same-log4j-slf4j-error-in-spark-9-1-td5592.html
 
  I am getting this error when I use logback-classic.
 
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in
 
 [jar:file:.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in
 
 [jar:file:.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 
  I need to use logback-classic for my current project, so I am trying to
  ignore slf4j-log4j12 from spark:
  <dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <exclusions>
  <exclusion>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  </exclusion>
  </exclusions>
  </dependency>
 
  Now, when I run my job from IntelliJ (which sets the classpath), things
  work perfectly.
 
  But when I run my job via spark-submit:
  ~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner
  spark-0.1-SNAPSHOT-jar-with-dependencies.jar
  My job fails because spark-submit sets up the classpath and it re-adds
 the
  slf4j-log4j12.
 
  I am not adding spark jar to the uber-jar via the maven assembly plugin:
  <dependencySets>
  <dependencySet>
  ..
  <useTransitiveDependencies>false</useTransitiveDependencies>
  <excludes>
  <exclude>org.apache.spark:spark-core_2.10</exclude>
  </excludes>
  </dependencySet>
  </dependencySets>
 
  So how can I exclude slf4j-log4j12.jar when I submit a job via
  spark-submit (on a per job basis)?
 
  --
  Thanks,
  -Utkarsh



 --
 Marcelo




-- 
Thanks,
-Utkarsh


Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
I get the same error even when I set the SPARK_CLASSPATH: export
SPARK_CLASSPATH=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.1.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
And I run the job like this: /spark-1.4.1-bin-hadoop2.4/bin/spark-submit
--class runner.SparkRunner
target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

I am not able to find the code in Spark which adds these jars before the
Spark classes in the classpath. Or maybe it's a bug. Any suggestions on
workarounds?

Thanks,
-Utkarsh


On Mon, Aug 24, 2015 at 4:32 PM, Utkarsh Sengar utkarsh2...@gmail.com
wrote:

 I assumed that's the case because of the error I got and the
 documentation, which says: "Extra classpath entries to append to the
 classpath of the driver."

 This is where I stand now:
 <dependency>
 <groupId>org.apache.spark</groupId>
 <artifactId>spark-core_2.10</artifactId>
 <version>1.4.1</version>
 <exclusions>
 <exclusion>
 <groupId>org.slf4j</groupId>
 <artifactId>slf4j-log4j12</artifactId>
 </exclusion>
 </exclusions>
 </dependency>

 And no exclusions from my logging lib.

 And I submit this task: spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class
 runner.SparkRunner --conf
 spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
 --conf
 spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
 --conf
 spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
 --conf
 spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
 target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

 And I get the same error:
 Caused by: java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory
 cannot be cast to ch.qos.logback.classic.LoggerContext
 at
 com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
 at
 com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
 at com.opentable.logging.Log.<clinit>(Log.java:31)
 ... 16 more


 Thanks,
 -Utkarsh

 On Mon, Aug 24, 2015 at 4:11 PM, Marcelo Vanzin van...@cloudera.com
 wrote:

 On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  That didn't work since the extraClassPath flag was still appending the
  jars at the end, so it's still picking the slf4j jar provided by Spark.

 Out of curiosity, how did you verify this? The extraClassPath
 options are supposed to prepend entries to the classpath, and the code
 seems to be doing that. If it's not really doing that in some case,
 it's a bug that needs to be fixed.

 Another option is setting the SPARK_CLASSPATH env variable,
 which is deprecated, but might come in handy in case there is actually
 a bug in handling those options.


 --
 Marcelo




 --
 Thanks,
 -Utkarsh




-- 
Thanks,
-Utkarsh


Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
Hi Utkarsh,

A quick look at slf4j's source shows it loads the first
StaticLoggerBinder in your classpath. How are you adding the logback
jar file to spark-submit?

If you use spark.driver.extraClassPath and
spark.executor.extraClassPath to add the jar, it should take
precedence over the log4j binding embedded in the Spark assembly.
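
(Concretely, that would look something like the sketch below; the paths are
the local ones used elsewhere in this thread. Note that repeating the same
--conf key is likely to just overwrite the earlier value, so multiple jars
belong in one colon-separated string per setting, as the rest of the thread
works out:

    spark-submit \
      --conf spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar \
      --conf spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar:/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar \
      --class runner.SparkRunner target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar
)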


On Mon, Aug 24, 2015 at 3:15 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote:
 Hi Marcelo,

 When I add this exclusion rule to my pom:
 <dependency>
 <groupId>org.apache.spark</groupId>
 <artifactId>spark-core_2.10</artifactId>
 <version>1.4.1</version>
 <exclusions>
 <exclusion>
 <groupId>org.slf4j</groupId>
 <artifactId>slf4j-log4j12</artifactId>
 </exclusion>
 </exclusions>
 </dependency>

 The SparkRunner class works fine (from IntelliJ) but when I build a jar and
 submit it to spark-submit:

 I get this error:
 Caused by: java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory
 cannot be cast to ch.qos.logback.classic.LoggerContext
 at
 com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
 at
 com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
 at com.opentable.logging.Log.<clinit>(Log.java:31)

 Which is this here (our logging lib is open sourced):
 https://github.com/opentable/otj-logging/blob/master/logging/src/main/java/com/opentable/logging/AssimilateForeignLogging.java#L68

 Thanks,
 -Utkarsh




 On Mon, Aug 24, 2015 at 3:04 PM, Marcelo Vanzin van...@cloudera.com wrote:

 Hi Utkarsh,

 Unfortunately that's not going to be easy. Since Spark bundles all
 dependent classes into a single fat jar file, to remove that
 dependency you'd need to modify Spark's assembly jar (potentially in
 all your nodes). Doing that per-job is even trickier, because you'd
 probably need some kind of script to inject the correct binding into
 Spark's classpath.

 That being said, that message is not an error, it's more of a noisy
 warning. I'd expect slf4j to use the first binding available - in your
 case, logback-classic. Is that not the case?


 On Mon, Aug 24, 2015 at 2:50 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  Continuing this discussion:
 
  http://apache-spark-user-list.1001560.n3.nabble.com/same-log4j-slf4j-error-in-spark-9-1-td5592.html
 
  I am getting this error when I use logback-classic.
 
  SLF4J: Class path contains multiple SLF4J bindings.
  SLF4J: Found binding in
 
  [jar:file:.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  SLF4J: Found binding in
 
  [jar:file:.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
 
  I need to use logback-classic for my current project, so I am trying to
  ignore slf4j-log4j12 from spark:
  <dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <exclusions>
  <exclusion>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  </exclusion>
  </exclusions>
  </dependency>
 
  Now, when I run my job from IntelliJ (which sets the classpath), things
  work perfectly.
 
  But when I run my job via spark-submit:
  ~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner
  spark-0.1-SNAPSHOT-jar-with-dependencies.jar
  My job fails because spark-submit sets up the classpath and it re-adds
  the
  slf4j-log4j12.
 
  I am not adding spark jar to the uber-jar via the maven assembly plugin:
  <dependencySets>
  <dependencySet>
  ..
  <useTransitiveDependencies>false</useTransitiveDependencies>
  <excludes>
  <exclude>org.apache.spark:spark-core_2.10</exclude>
  </excludes>
  </dependencySet>
  </dependencySets>
 
  So how can I exclude slf4j-log4j12.jar when I submit a job via
  spark-submit (on a per job basis)?
 
  --
  Thanks,
  -Utkarsh



 --
 Marcelo




 --
 Thanks,
 -Utkarsh



-- 
Marcelo




Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
That didn't work since the extraClassPath flag was still appending the jars
at the end, so it's still picking the slf4j jar provided by Spark.
Although I found this flag: --conf spark.executor.userClassPathFirst=true
(http://spark.apache.org/docs/latest/configuration.html) and tried this:

➜  simspark git:(bulkrunner) ✗ spark-1.4.1-bin-hadoop2.4/bin/spark-submit
--class runner.SparkRunner --jars
/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar,/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
--conf spark.executor.userClassPathFirst=true --conf
spark.driver.userClassPathFirst=true
target/ds-tetris-simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

But this led to another error: com.typesafe.config.ConfigException$Missing:
No configuration setting found for key 'akka.version'

Thanks,
-Utkarsh

On Mon, Aug 24, 2015 at 3:25 PM, Marcelo Vanzin van...@cloudera.com wrote:

 Hi Utkarsh,

 A quick look at slf4j's source shows it loads the first
 StaticLoggerBinder in your classpath. How are you adding the logback
 jar file to spark-submit?

 If you use spark.driver.extraClassPath and
 spark.executor.extraClassPath to add the jar, it should take
 precedence over the log4j binding embedded in the Spark assembly.


 On Mon, Aug 24, 2015 at 3:15 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  Hi Marcelo,
 
  When I add this exclusion rule to my pom:
  <dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <exclusions>
  <exclusion>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  </exclusion>
  </exclusions>
  </dependency>
 
  The SparkRunner class works fine (from IntelliJ) but when I build a jar
 and
  submit it to spark-submit:
 
  I get this error:
  Caused by: java.lang.ClassCastException:
 org.slf4j.impl.Log4jLoggerFactory
  cannot be cast to ch.qos.logback.classic.LoggerContext
  at
 
 com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
  at
 
 com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
  at com.opentable.logging.Log.<clinit>(Log.java:31)
 
  Which is this here (our logging lib is open sourced):
 
 https://github.com/opentable/otj-logging/blob/master/logging/src/main/java/com/opentable/logging/AssimilateForeignLogging.java#L68
 
  Thanks,
  -Utkarsh
 
 
 
 
  On Mon, Aug 24, 2015 at 3:04 PM, Marcelo Vanzin van...@cloudera.com
 wrote:
 
  Hi Utkarsh,
 
  Unfortunately that's not going to be easy. Since Spark bundles all
  dependent classes into a single fat jar file, to remove that
  dependency you'd need to modify Spark's assembly jar (potentially in
  all your nodes). Doing that per-job is even trickier, because you'd
  probably need some kind of script to inject the correct binding into
  Spark's classpath.
 
  That being said, that message is not an error, it's more of a noisy
  warning. I'd expect slf4j to use the first binding available - in your
  case, logback-classic. Is that not the case?
 
 
  On Mon, Aug 24, 2015 at 2:50 PM, Utkarsh Sengar utkarsh2...@gmail.com
  wrote:
   Continuing this discussion:
  
  
 http://apache-spark-user-list.1001560.n3.nabble.com/same-log4j-slf4j-error-in-spark-9-1-td5592.html
  
   I am getting this error when I use logback-classic.
  
   SLF4J: Class path contains multiple SLF4J bindings.
   SLF4J: Found binding in
  
  
 [jar:file:.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
   SLF4J: Found binding in
  
  
 [jar:file:.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
  
   I need to use logback-classic for my current project, so I am trying
 to
   ignore slf4j-log4j12 from spark:
   <dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.10</artifactId>
   <version>1.4.1</version>
   <exclusions>
   <exclusion>
   <groupId>org.slf4j</groupId>
   <artifactId>slf4j-log4j12</artifactId>
   </exclusion>
   </exclusions>
   </dependency>
  
   Now, when I run my job from IntelliJ (which sets the classpath), things
   work perfectly.
  
   But when I run my job via spark-submit:
   ~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class
 runner.SparkRunner
   spark-0.1-SNAPSHOT-jar-with-dependencies.jar
   My job fails because spark-submit sets up the classpath and it re-adds
   the
   slf4j-log4j12.
  
   I am not adding spark jar to the uber-jar via the maven assembly
 plugin:
   <dependencySets>
   <dependencySet>
   ..
   <useTransitiveDependencies>false</useTransitiveDependencies>

Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Marcelo Vanzin
On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar utkarsh2...@gmail.com wrote:
 That didn't work since the extraClassPath flag was still appending the jars
 at the end, so it's still picking the slf4j jar provided by Spark.

Out of curiosity, how did you verify this? The extraClassPath
options are supposed to prepend entries to the classpath, and the code
seems to be doing that. If it's not really doing that in some case,
it's a bug that needs to be fixed.

Another option is setting the SPARK_CLASSPATH env variable,
which is deprecated, but might come in handy in case there is actually
a bug in handling those options.
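
(One low-tech way to check the ordering, as a sketch: dump the driver JVM's
effective classpath and see where the logback jars ended up relative to the
Spark assembly; for the executors the equivalent check would have to run
inside a task.

    public class ClasspathDump {
        public static void main(String[] args) {
            // Entries added via spark.driver.extraClassPath should show up
            // before the Spark assembly jar if prepending works as intended.
            String cp = System.getProperty("java.class.path");
            for (String entry : cp.split(java.io.File.pathSeparator)) {
                System.out.println(entry);
            }
        }
    }
)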


-- 
Marcelo




Re: Exclude slf4j-log4j12 from the classpath via spark-submit

2015-08-24 Thread Utkarsh Sengar
I assumed that's the case because of the error I got and the documentation,
which says: "Extra classpath entries to append to the classpath of the
driver."

This is where I stand now:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.4.1</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>

And no exclusions from my logging lib.

And I submit this task: spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class
runner.SparkRunner --conf
spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
--conf
spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar
--conf
spark.driver.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
--conf
spark.executor.extraClassPath=/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar
target/simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

And I get the same error:
Caused by: java.lang.ClassCastException: org.slf4j.impl.Log4jLoggerFactory
cannot be cast to ch.qos.logback.classic.LoggerContext
at
com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
at
com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
at com.opentable.logging.Log.<clinit>(Log.java:31)
... 16 more


Thanks,
-Utkarsh

On Mon, Aug 24, 2015 at 4:11 PM, Marcelo Vanzin van...@cloudera.com wrote:

 On Mon, Aug 24, 2015 at 3:58 PM, Utkarsh Sengar utkarsh2...@gmail.com
 wrote:
  That didn't work since the extraClassPath flag was still appending the
  jars at the end, so it's still picking the slf4j jar provided by Spark.

 Out of curiosity, how did you verify this? The extraClassPath
 options are supposed to prepend entries to the classpath, and the code
 seems to be doing that. If it's not really doing that in some case,
 it's a bug that needs to be fixed.

 Another option is setting the SPARK_CLASSPATH env variable,
 which is deprecated, but might come in handy in case there is actually
 a bug in handling those options.


 --
 Marcelo




-- 
Thanks,
-Utkarsh