[ https://issues.apache.org/jira/browse/SPARK-4371?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14209588#comment-14209588 ]

Sean Owen commented on SPARK-4371:
----------------------------------

Aha, so it's a web app embedding Spark. Although that's sort of unsupported, I 
think you can make 'embedded Spark' work in many cases. I would expect servlet 
containers to isolate each webapp's classloader for exactly this reason, so 
your app can use slf4j 1.7 while the container uses whatever it likes. Is that 
not happening here? How about later versions of JBoss?

> Spark crashes with JBoss Logging 3.6.1
> --------------------------------------
>
>                 Key: SPARK-4371
>                 URL: https://issues.apache.org/jira/browse/SPARK-4371
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Florent Pellerin
>
> When using JBoss-logging, which itself depends on slf4j 1.6.1, Spark crashes.
> SLF4JBridgeHandler.removeHandlersForRootLogger() was only added in slf4j 
> 1.6.5, but spark/Logging.scala invokes it reflectively at line 147:
> bridgeClass.getMethod("removeHandlersForRootLogger").invoke(null)
> Since slf4j 1.6.1 lacks that method, the getMethod lookup fails and the 
> Logging singleton's static initializer crashes:
> java.lang.ExceptionInInitializerError: null
>         at java.lang.Class.getMethod(Class.java:1670)
>         at org.apache.spark.Logging$.<init>(Logging.scala:147)
>         at org.apache.spark.Logging$.<clinit>(Logging.scala)
>         at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:104)
>         at org.apache.spark.Logging$class.log(Logging.scala:51)
>         at org.apache.spark.SecurityManager.log(SecurityManager.scala:143)
>         at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
>         at org.apache.spark.SecurityManager.logInfo(SecurityManager.scala:143)
>         at org.apache.spark.SecurityManager.setViewAcls(SecurityManager.scala:208)
>         at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:167)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:151)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
> I suggest Spark should at least silently swallow the exception.
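> For illustration, here is a minimal sketch of such a guard (the try/catch 
> around the reflective call is my suggestion, not Spark's current code; the 
> class and method names are the ones Logging.scala already uses):
> try {
>   // the jul-to-slf4j bridge may not be on the classpath at all
>   val bridgeClass = Class.forName("org.slf4j.bridge.SLF4JBridgeHandler")
>   // getMethod throws NoSuchMethodException on slf4j < 1.6.5, where
>   // removeHandlersForRootLogger() does not exist yet
>   bridgeClass.getMethod("removeHandlersForRootLogger").invoke(null)
> } catch {
>   case _: ClassNotFoundException => // no bridge on the classpath: nothing to do
>   case _: NoSuchMethodException  => // slf4j older than 1.6.5: skip the cleanup
> }
> That way an old slf4j on the classpath degrades to a skipped log-handler 
> cleanup instead of an ExceptionInInitializerError at SparkContext startup.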


